All of vroman's Comments + Replies

I read and understood the Least Convenient Possible World post. Given that, let me rephrase your scenario slightly:

If every winner of a certain lottery receives $X * 300 million, a ticket costs $X, the odds of winning are 1 in 250 million, you can only buy one ticket, and $X is an amount of money you would be uncomfortable losing, would you buy that ticket?

Answer: no. If the ticket price crosses a certain threshold, then I become risk averse. If it were $1 or some other relatively inconsequential amount of money, then I would be rationally compelled to buy the nearly-sure-loss ticket.
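To make that threshold policy concrete, here is a minimal sketch of the arithmetic; the wealth figure and the "inconsequential" fraction below are illustrative assumptions, not part of the scenario:

```python
# Lottery from the rephrased scenario: ticket costs X, pays X * 300 million,
# odds 1 in 250 million.

def expected_value(ticket_price, payout_multiple=300e6, odds=1 / 250e6):
    """Expected dollar profit of one ticket: a win pays price * multiple."""
    return odds * (ticket_price * payout_multiple) - ticket_price

# Every ticket is +EV in dollars, whatever the price:
# (1/250e6) * (300e6 * X) - X = 0.2 * X
print(expected_value(1))        # ~0.2
print(expected_value(100_000))  # ~20000.0

# The comment's policy: buy only when losing the stake is inconsequential,
# i.e. below some small fraction of wealth (both numbers assumed here).
def buy_ticket(ticket_price, wealth, inconsequential_fraction=0.001):
    return (expected_value(ticket_price) > 0
            and ticket_price <= inconsequential_fraction * wealth)

print(buy_ticket(1, wealth=50_000))      # True: $1 is a trivial stake
print(buy_ticket(5_000, wealth=50_000))  # False: 10% of wealth on 1-in-250e6 odds
```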

SecondWind
If you'd be rationally compelled to buy one low-cost ticket, then after you've bought the ticket you should be rationally compelled to buy a ticket. And then rationally compelled to buy another. Sure, at each step you're approaching the possibility with one fewer dollar, but by your phrasing, the number of dollars you have does not influence your decision to buy a ticket (unless you're broke enough that $1 is no longer a relatively inconsequential amount of money). This method seems to require an injunction against iteration.
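A rough way to put a number on this worry (the $50,000 bankroll is an assumed figure, not from the comment): buying $1 tickets until you either win once or go broke is positive-EV, yet almost certainly loses everything.

```python
odds = 1 / 250e6   # chance each $1 ticket wins the $300 million payout
bankroll = 50_000  # assumed dollars available to burn on tickets

# Chance of ever winning before the money runs out:
p_win_ever = 1 - (1 - odds) ** bankroll
print(p_win_ever)  # ~0.0002, i.e. roughly 1 in 5000

# Each ticket is +$0.20 in expectation, so the iterated strategy is +EV
# overall (~ 0.0002 * 300e6 - 50_000, about +$10,000), but with ~99.98%
# probability the concrete outcome is losing the entire bankroll $1 at
# a time -- hence the apparent need for an injunction against iteration.
```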

*kill traveler to save patients problem

assuming that

-the above solutions (patient roulette) were not viable

-upon receiving their new organs, the patients would be restored to full functionality, as good or better utility generators than the traveler

then I would kill the traveler. However, if the traveler successfully defended himself and turned the tables on me, I would use my dying breath to happily congratulate his self-preservation instinct and wish him no further problems on the remainder of his journey. And of course I'd have left instructions w ...

@Doug S

I defeat your version of Pascal's Wager by asserting that there is no rational lottery operator who would go forward with a business plan of straight-up losing $50 million. Thus the probability of your scenario, as with the Christian god, is zero.

Humanity is doomed in this scenario. The Lotuseaters are smarter, and the gap is widening. There's no chance humans can militarily defeat them now or at any point in the future. As galactic colonization continues exponentially, the two species will eventually meet again, perhaps in the far future, but the Lotusfolk will be relatively even stronger at that point. The only way humans can compete is by developing an even faster strong AI, which carries a large chance of ending humanity on its own.
So the choices are:
-accept the Lotusfolk offer now
-blow up the starline, continue exp ...