A more concrete reframing attempt
The Earth is about to be hit by an asteroid that will destroy the whole planet. However, everyone can be evacuated: humanity has built two ships, Red and Blue, to escape to a colony on Mars.
The Blue ship's AI has a design flaw that will prevent the ship from leaving Earth if fewer than 50% of people board it, and there is no time to fix it.
Both ships have enough space for everyone in the world. Everyone can choose which ship to take to Mars, and everyone has received the same information in their own language. Which ship do you choose?
My reframing here is not meant as a "gotcha" for the blue team; I just want to discuss how reframing the question this way affects our decision making, since we have been experimenting with reframings lately. I wanted something slightly more concrete than a button for this scenario.
Do we have an ethical responsibility to board the Blue ship, just in case some other people have chosen it in this scenario? Or does this reframing make us feel less responsible for their fate? To me, blaming the passengers on the Red ship seems a bit less reasonable in this framing.
The idea is that in the default scenario, everyone must register as a passenger on either the Blue or the Red ship. Registration happens online, and children and others would be registered by their guardians.
As a bonus scenario: suppose there are *already* millions of people on the Blue ship, and they can't change their choice anymore. Would that change your mind? If so, how many people would it take?