If we all keep doubling, then no one dies! Let's not think too hard about the logistics after a few doublings.
Not long before the global population is tied to the track and there’s no one left to pull the switch. The trolley problem has suddenly become an inescapable extinction event.
Are we still talking about the trolley problem?
Aren’t we always?
Let's look at it this way: at most 34 people ever have to make this decision. After that, everyone is lying on the track.
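A quick back-of-the-envelope check of the "34 people" claim, assuming person n faces 2^(n-1) people on the track and a round world population of 8 billion (both assumptions, not stated in the thread):

```python
# Person n faces 2**(n-1) people on the track; doubling stops once
# that count meets or exceeds the world population.
POPULATION = 8_000_000_000  # assumed round figure

n = 1
while 2 ** (n - 1) < POPULATION:
    n += 1

print(n)  # first person whose track already holds everyone -> 34
```

Since 2^32 ≈ 4.3 billion and 2^33 ≈ 8.6 billion, the 34th person is the first who can no longer double.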
But nobody’s there to pull the switch to run everyone over, so the train barrels on into mathematical impossibility. Does the scenario create people ex nihilo to continue on? Does the simulation crash? We won’t know until we find out.
It causes a segfault and dumps core, which then becomes the second Earth.
This includes OP (original puller).
This also seems like a lesson in procrastination.
So you’re saying the prime nonmover exists both at the beginning and the end?
Takes a puff and squints eyes.
That’s deep bro.
I think you just invented capitalism.
Well, that's the thing about infinity: as n → ∞, it becomes ever more likely that someone pulls the lever, regardless of any one person's morality. That's the dilemma: do you pull the lever, killing one person and ending the experiment, or do you double it and give someone else the opportunity to pull the lever and kill 2^n people? In the end it will happen, assuming an infinite number of people face the same choice you do.
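The limit argument can be sketched with a toy model (my assumption, not the thread's): if each person independently pulls the lever with some fixed probability p > 0, the chance that nobody among the first n people pulls is (1 - p)^n, which vanishes as n grows.

```python
# Toy model: each person pulls independently with probability p > 0.
# P(nobody among the first n pulls) = (1 - p)**n -> 0 as n -> infinity.
p = 0.01  # hypothetical per-person chance of pulling

for n in (10, 100, 1000):
    nobody_pulls = (1 - p) ** n
    print(n, nobody_pulls)
```

Even a tiny p sends the survival-of-the-experiment probability to zero, so with infinitely many participants, someone eventually pulls.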
And risk someone deciding they want to kill 8 billion people?