If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
A short story titled "The end of meaning"
It is propaganda for my work on improving autonomy. I'm not sure it is actually useful in that regard, but it was fun to write, and other people here might get a kick out of it.
Tamara blinked her eyes open. The fact that she could blink, had eyes, and was not in eternal torment filled her with elation. They'd done it! Against all the odds, the singularity had gone well. They'd defeated death, suffering, pain and torment with a single stroke. It was the start of a new age for mankind, one ruled not by a cruel nature but by a benevolent AI.
Tamara was a bit giddy about the possibilities. She could go paragliding in Jupiter's clouds, watch supernovae explode and finally finish reading Infinite Jest. But what should she do first? Being a good rationalist, Tamara decided to look at the expected utility of each action. No possible action she could take would reduce anyone's suffering or increase their happiness, because by definition the AI was already maximising those with its superintelligence and human-aligned utility maximisation. She would have to look inside herself for which actions to take.
She had long been a believer in self-perfection and self-improvement. There were many ways she might self-improve: would she improve her piano playing, become an astronomy expert, or plumb the depths of her own brain so that she could safely improve her inner algorithms? Try as she might, she couldn't decide between these options. Any of these changes to herself looked as valuable as any other. None of them would improve her lot in life. She should let the AI decide what she should experience to maximise her eudaimonia.
blip
Tamara struggled awake. That was some nightmare she'd had about the singularity. Luckily it hadn't happened yet; she could still fix things and make the most meaningful contribution in the human race's history by stopping death, suffering and pain.
As she went about her day's business solving decision theory problems, she was niggled by a possibility. What if the singularity had already happened and she was just in a simulation? It would make sense that the greatest feeling for people would be solving the world's greatest problems. If the AI was trying to maximise Tamara's utility, ver might put her in the situation where she could be most agenty and useful: just before the singularity. There would have to be enough pain and suffering in the world to motivate Tamara to fix it, and enough in her life to make it feel consistent. If so, none of her actions here were meaningful; she was not actually saving humanity.
She should probably continue to try and save humanity, because of indexical uncertainty.
Although, if she had this thought, her life would be plagued by doubts about whether it was meaningful or not, so she was probably not in a simulation, as her utility was not being maximised. Probably...
Another thought gripped her: what if she couldn't solve the meaningfulness problem from her nightmare? She would be trapped in a loop.
blip
A nightmare within a nightmare; that was the first time that had happened to Tamara in a very long time. Luckily she had solved the meaningfulness problem long ago, or else those thoughts and worries would have plagued her. We just need to keep humans as capable agents and work on intelligence augmentation. It might seem like a longer shot than a singleton AI, requiring people to work together to build a better world, but humans would have a meaningful existence. They would be able to solve their own problems, make their own decisions about what to do based on their goals, and help other people; they would still be agents of their own destiny.
Serves her right for making self-improvement a foremost terminal value even when she knows it's going to be rendered irrelevant; meanwhile, the loop I'm stuck in is the first six hours spent in my catgirl volcano lair.