this post was submitted on 15 Mar 2026
37 points (97.4% liked)

Asklemmy


Newcomb's problem is a thought experiment where you're presented with two boxes, and the option to take one or both. One box is transparent and always contains $1000. The second is a mystery box.

Before making the choice, a supercomputer (or team of psychologists, etc) predicted whether you would take one box or both. If it predicted you would take both, the mystery box is empty. If it predicted you'd take just the mystery box, then it contains $1,000,000. The predictor rarely makes mistakes.

This problem tends to split people 50-50 with each side thinking the answer is obvious.

An argument for two-boxing is that, once the prediction has been made, your choice no longer influences the outcome. The mystery box already has whatever it has, so there's no reason to leave the $1000 sitting there.

An argument for one-boxing is that, statistically, one-boxers tend to walk away with more money than two-boxers. It's unlikely that the computer guessed wrong, so rather than hoping that you can be the rare case where it did, you should assume that whatever you choose is what it predicted.

top 41 comments
[–] Azzu@lemmy.dbzer0.com 3 points 4 hours ago

The answer depends entirely on what "rarely makes mistakes" means.

The break-even accuracy is 50.05%: at exactly that point, both choices have an expected value of $500,500. If the prediction is correct more than 50.05% of the time, I would take just the mystery box (e.g. at 50.06% accuracy, EV = 0.5006 Γ— 1,000,000 = $500,600).

If the prediction is correct less than 50.05% of the time, I would take both (e.g. at 50.04% accuracy, EV = 1000 + (1 βˆ’ 0.5004) Γ— 1,000,000 = $500,600).

Since "rarely makes mistakes" implies an accuracy far above 50.05%, I would definitely take the mystery box.

[–] Arrkk@lemmy.world 2 points 6 hours ago

An angle I don't see people taking is to reframe the problem with amounts that are much more relatable. The mystery box holds a thousand times more money than the open box, so scale everything down:

The open box has 1 cent in it, and the mystery box might have $10. What do you do?

Y'all are telling me you'd rather take a penny with a tiny chance at $10, rather than taking $10 with a tiny chance of getting zero?

[–] arthur@lemmy.zip 3 points 7 hours ago (1 children)

As the values are already settled, I take both boxes.

[–] AdmiralSnackbar@sopuli.xyz 1 points 7 hours ago* (last edited 7 hours ago)

This. I’ll take a guaranteed $1,000 with a chance at a million every single time.

[–] psoul@lemmy.world 9 points 14 hours ago (1 children)

It means that the people running the experiment have $1,001,000 to give away, for free.

What if I rob them first?

What if I convince them to unionize and they redistribute all the money fairly among the workers and force management to not conduct shitty social experiments on people?

[–] Hadriscus@jlai.lu 2 points 8 hours ago

this guy thinks outside the box

[–] davel@lemmy.ml 11 points 18 hours ago (1 children)

Mmmm, this sounds like an idealist hypothetical problem that in reality can’t exist, so to engage with it is to engage with nonsense.

The predictor rarely makes mistakes because… just because. It’s axiomatic. The predictor runs on the magic of unsupported assertion.

[–] Objection@lemmy.ml 3 points 18 hours ago (1 children)

Some version of it could exist. Not with the big numbers and not with the high degree of certainty in the problem, but you could have, say, somebody who's on average 70% accurate at reading people and the boxes are $1 and $10.

It is somewhat idealist in that it's a contrived scenario, but it's really just idle curiosity on my part. Maybe it could reflect something about people's thought processes, or maybe it's just people interpreting the question differently.
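The scaled-down version works out the same way as the original; a quick check of the expected values (my own sketch, not part of the comment):

```python
accuracy = 0.70  # how often the reader correctly predicts the choice

# One box: the $10 is there only when the prediction was right.
one_box_ev = accuracy * 10
# Two boxes: the $1 is guaranteed; the $10 appears only on a misprediction.
two_box_ev = 1 + (1 - accuracy) * 10

print(one_box_ev, two_box_ev)  # ~7.0 vs ~4.0: one-boxing still pays more
```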

[–] davel@lemmy.ml 2 points 18 hours ago (1 children)

Even if it were to exist in the short run, it wouldn’t be stable. The predictor must be predicting somehow, which eventually could be at least partially sussed out, and future decisions would change as a result. Unless the predictor runs on literal magic, it would eventually no longer fit its own definition.

[–] Arrkk@lemmy.world 3 points 6 hours ago

You can flip the problem around and have it be mathematically the same. The predictor has some knowable accuracy, you can run the experiment many times to determine what it is. Let's also replace the predictor with an Oracle, guaranteed 100% always correct, and we'll manually impose some error by doing the opposite of its prediction with some probability. This is fully indistinguishable from our original predictor.

Now, instead of the predictor making a prediction, let's choose our box first, then decide what to put in the mystery box afterwards, with some probability of being "wrong" (not putting the money in for the 1 box taker, or putting the money in for the 2 box taker). This is identical to having an Oracle, we know exactly what boxes will be taken, but there is some error in the system.

Now we ask: should you take one box or two? Obviously it depends on what the probability is. There's no more "fooling" the predictor. So you do the EV calculation and find that if the accuracy is above the break-even point (just over 50%, specifically 50.05% with these amounts), you should always take one box.
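The flipped experiment can be sketched with a small Monte Carlo simulation (my own code, with an assumed 10% error rate; all names are illustrative):

```python
import random

# The player commits to one or two boxes first; the mystery box is filled
# afterwards by an oracle that is deliberately made "wrong" with probability err.
def payout(take_both: bool, err: float, rng: random.Random) -> int:
    wrong = rng.random() < err
    if take_both:
        # Predicted two boxes -> mystery box left empty, unless the error fires.
        return 1_000 + (1_000_000 if wrong else 0)
    # Predicted one box -> mystery box filled, unless the error fires.
    return 0 if wrong else 1_000_000

def average_payout(take_both: bool, err: float, trials: int = 200_000) -> float:
    rng = random.Random(0)  # fixed seed for reproducibility
    return sum(payout(take_both, err, rng) for _ in range(trials)) / trials

# With a 10% error rate: one-boxing averages ~$900k, two-boxing ~$101k.
print(average_payout(False, 0.10), average_payout(True, 0.10))
```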

[–] chicken@lemmy.dbzer0.com 5 points 16 hours ago

A rule of thumb I think is good for most sorts of investment is, what choice can you feel good about making whether or not it works out? I can handle not getting 1k, but I would feel like a real chump missing out on an easy 1m without giving my best effort. If I pick just the mystery box and win, I feel like that win is deserved. If I pick just the mystery box and I walk away with nothing, then at least I don't have to live with the shame of being a 2-boxer, which is more valuable than $1k. If I pick both boxes, I most likely get a little bit of money and a lifetime of bitter regrets, or in the less likely case get 1.001 million dollars and a sense of having barely avoided disaster and not really "deserving" it. Choosing only the mystery box is the clear choice because it is the choice I am more able to handle having made, on an emotional level.

[–] ryrybang@lemmy.world 18 points 21 hours ago* (last edited 21 hours ago)

Fuck those boxes and the game. Steal the computer. Any computer that can predict individual human behavior with 99% accuracy would be worth billions. If such a thing existed and could be controlled, it'd be a total waste to have it running grad school human lab experiments. That's actual god-tier power.

[–] AstroLightz@lemmy.world 3 points 15 hours ago

I'll take the guaranteed $1000 and not the mystery box, so the prediction is always wrong :)

[–] Sickos@hexbear.net 3 points 16 hours ago (1 children)

I am, admittedly, confused by the premise (or it's too hypothetical to warrant reasoning about), but I am interested in how there is ever a possible downside to taking both?

It's $1000 for one box, $x for the other box, or $1000 + $x for two boxen. $1000 + $x > $1000, because the hungry alligator eats the bigger number

[–] Ryanmiller70@lemmy.zip 2 points 15 hours ago (1 children)

That's what has me confused. I thought I was misreading something cause I couldn't see a downside to not taking both boxes. If the box is empty, you still have the $1000 and if it's not then you get even more money.

[–] Objection@lemmy.ml 1 points 9 hours ago

The point is to reveal the different frames of analysis people use to make the decision.

This thought process, "The decision's already been made, either way it's always a free $1000," is one way of looking at it. But another way of looking at it is, "Those who choose one box tend to walk away with more money, so the evidence shows that taking one box is the better approach." These approaches sort of "talk past each other," because they're looking at completely different parts of the problem in order to draw their conclusions, and those different parts indicate very opposing conclusions.

[–] fizzle@quokk.au 4 points 18 hours ago

It obviously depends on the computer's mysterious ability to predict what I'm going to do.

"once the prediction has been made, your choice no longer influences the outcome."

This statement doesn't make sense. The computer would predict that you would think that.

Take out my gun, pistol whip the researcher, and steal the $1,000,000.

[–] ada@lemmy.blahaj.zone 10 points 22 hours ago

Assuming I knew that my behaviour was being modelled and this model would influence the outcome, I'd remove myself from the decision making process and flip a coin.

[–] felsiq@piefed.zip 10 points 22 hours ago

I think the numbers are a little off for this to be tempting. If I'm getting $1,000,000, then a K is a rounding error, and I see no reason to make the mil any less likely for it. If I wanted that extra grand, throwing 10% of the mil into a short GIC would be how I'd get it, for a risk-free $1,001,000.

[–] searabbit@piefed.social 5 points 21 hours ago

This feels like the poison scene from the princess bride, so I'll approach it with that level of intellectual derangement.

Which means the obvious first step is to recognize that the house is a cheater who wants you to stay poor so your choice doesn't matter. There is poison in both cups and I will lose either way. Money no longer influences my decision.

Next, I flip a coin ten times and note my reaction to the choices. That's my gut instinct and obviously what the model predicted unless it's either not smart enough to know my gut or smart enough to predict my double bluff, therefore useless.

Next, I decide which variables are most likely to influence the prediction (gender, age, education level, big 5 personality score) and realize this is the adult marshmallow test. I obviously think I'm smart and want the model to know that, so it obviously predicted that I would take one box because I'm a good little goodie two shoes who delays instant gratification for the potential bigger payoff. Therefore I choose two boxes because the model would never expect someone as smart as I to make such a dumb greedy move. Surely, I have outsmarted the supercomputer with my quadruple bluff and have won.

And then I remember I am dumb and the model knows that, because in my excitement, I forgot that the house is a cheater who always wins (and there was likely never any money in the mystery box because researchers never get that kind of funding). I am forced to believe that the model accurately perceived me to be a greedy idiot who took two boxes against my better judgement, shattering my ego.

But hey, I at least got $1k out of it.

[–] bravesentry@feddit.org 7 points 22 hours ago

two-boxers are why we can't have nice things. and they themselves can't either.

I already opened the mystery box before you finished explaining.

[–] red_tomato@lemmy.world 6 points 22 hours ago

Just the mystery box. If the computer rarely guesses wrong, then I’m $1,000,000 richer.

[–] DagwoodIII@piefed.social 6 points 23 hours ago (1 children)

I'm playing with the house's money.

If I get nothing, I'm no worse off than I was before.

Besides, the mystery box is a mystery, and I love a mystery.

[–] Nemo@slrpnk.net 3 points 23 hours ago

my favorite flavor of Dum-Dum

[–] owenfromcanada@lemmy.ca 5 points 22 hours ago

I'm the kind of person who would ask for their definition of "rarely". How many 9s are we talking? If it's at least three nines, I'm one-boxing it.

[–] Keshara@piefed.world 5 points 23 hours ago* (last edited 23 hours ago)

Link to the Veritasium video if anyone is wondering:

https://www.youtube.com/watch?v=Ol18JoeXlVI

I'm on the 'just grab the mystery box' side πŸ™‚

[–] Collatz_problem@hexbear.net 2 points 20 hours ago

Do a coin toss to choose.

[–] taiyang@lemmy.world 1 points 18 hours ago

So, if I'm understanding right, the computer said there's a great chance at $1 million in the mystery box if I'm the type to pick it and only it, which I am.

But I pick it for this bad reason: my friends and I like to make fun of my troll luck. I'd take it and somehow get $0, despite several people after me getting $1 million, including two-boxers. We'd laugh, "can you believe this shit?", and know it just wasn't in the stars, just like that time I broke my ankle and all the automated walkways at the airport were broken. It's that kind of thing (and yes, that happened). My infamous troll luck strikes again!

Or it's all superstition and I get a million bucks, since I'm pretty sure my infamous luck is just confirmation bias. Either way, fun had by all.

[–] Dyskolos@lemmy.zip 2 points 21 hours ago

I don't need the 1000, so the million gamble would be my choice.

If I really needed 1k right now, that would be the choice instead. Gambling is stupid.

[–] thesohoriots@lemmy.world 2 points 21 hours ago

β€œYou took the box! Let's see what's in the box! Nothing! Absolutely nothing! STUPID! You so STU-PIIIIIIIIIIID!”

[–] ryven@lemmy.dbzer0.com 1 points 18 hours ago

I don't need $1M. I'll take just the $1000, if the other box has a million in it they can put it towards the next experiment.

[–] Jerb322@lemmy.world 1 points 19 hours ago

Weird, I just saw a video about this. I'm a second-box guy. Feels like taking both boxes is trying to "cheat" or "trick" the computer. Or maybe a greed thing. I'm not really sure. But I'd probably go for just the second box.

[–] monovergent@lemmy.ml 1 points 19 hours ago* (last edited 19 hours ago)

One box. I might be unlucky and lose out on $1000 in that other box, but I wouldn't be too bothered. On the other hand, if I were to grab both and get $1000, the thought of what if I took just one box and got a million dollars would gnaw at me for the rest of my life.

The decision would change dramatically if the box with less money were closer to a million, though.

[–] RobotToaster@mander.xyz 2 points 22 hours ago

I only know about this because of zero escape lol.

One box, I'm not messing with anything powerful enough to predict the future.

[–] kibiz0r@midwest.social 1 points 20 hours ago

I'd just walk away entirely.

[–] z3rOR0ne@lemmy.ml 0 points 16 hours ago

I'll take the safe grand and not bother with gambling.

[–] tomi000@lemmy.world 1 points 23 hours ago

When I watched the video I immediately knew I was a 2-boxer. "It's just $1000 more, the mystery box doesn't magically change." Of course I know it's better to be the kind of person who only takes the mystery box, because for me it will be empty. But saying "I would take only one" feels like cheating, like the mystery box would still be empty because the supercomputer knew I was only pretending.