Last year Adam Chalmers did his Philosophy honours thesis on decision theory.
Humans are bad at knowing what to do. My field looks at this problem: “say you want a thing, and there are many ways you could try to get it. Which way should you try?” My field answers this question with number-thinking.
For a long time we had a best idea. For each way, have one number for how easy it is, and a second number for the chance it will give you the thing you want. Add them together many times and you get a final number for each way. The way with the biggest final number is the best way to get what you want.
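The “best idea” above is what number-thinkers call expected value: for each way, “add one number to itself many times” (that is, multiply) the chance by how good the thing is, then pick the biggest. Here is a minimal sketch; the options and all the numbers are made up for illustration.

```python
# A tiny sketch of the "best idea" (expected value).
# For each way, multiply the chance it works by how good the
# outcome is, then pick the way with the biggest final number.
# All options and numbers here are made up for illustration.

options = {
    "study hard": (0.8, 100),            # (chance it works, how good it is)
    "buy a lottery ticket": (0.000001, 1_000_000),
    "do nothing": (1.0, 0),
}

def expected_value(chance, value):
    # "Adding a number to itself many times" is multiplication.
    return chance * value

best = max(options, key=lambda name: expected_value(*options[name]))
print(best)  # "study hard": 0.8 * 100 = 80 beats 1.0 and 0
```

Here “study hard” wins because 0.8 × 100 = 80 is bigger than the lottery ticket’s 0.000001 × 1,000,000 = 1 and doing nothing’s 0.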
But someone found a problem with this idea. It sounds strange, but people take it seriously. What if someone pretends to be God and asks you to give her lots of hundreds of money, or else she will send you to hell? You might say, “no thank you, I don’t believe you are God and don’t want to give you my money.” But the God-pretender could say, “I understand why you don’t believe me. But even if there’s only a tiny chance I’m God and will send you to hell, hell goes on for so long that it doesn’t matter how much money I ask: you should still want to give me some to avoid it.”
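The God-pretender’s number-thinking can be sketched like this. All the numbers below are made up; the point is only that a tiny chance times a huge badness beats any normal amount of money.

```python
# Why the God-pretender "wins" under the best idea: if hell is
# bad enough, even a tiny chance of it outweighs any money she
# could ask for. All numbers here are made up for illustration.

chance_she_is_god = 1e-12    # you really, really don't believe her
badness_of_hell = 1e30       # hell goes on for a very long time
money_asked = 1_000

# Expected badness if you refuse: tiny chance times huge badness.
expected_loss_if_refuse = chance_she_is_god * badness_of_hell  # 1e18
expected_loss_if_pay = money_asked                             # 1_000

# The best idea says pay, however much she asks, because the
# refuse number dwarfs any realistic money_asked.
print(expected_loss_if_refuse > expected_loss_if_pay)  # True
```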
If you use the numbers from the best idea, you pay the God-pretender. My work shows you shouldn’t do that, because she can come back and keep asking you for more money, or to do bad things.
This shows we need a new best idea, one which adds the numbers a little bit differently, so we don’t all go around doing whatever any God-pretender says. Humans know not to do this because we don’t always do what numbers tell us to. But computers do listen to numbers, and we use computers to pick ways more and more. My study suggests a different way. It has its own problems, but at least our computers won’t give all our money to people who pretend to be God.