Since this is a crazy ideas thread, I'll tag on the following thought. If you believe that, should we be able to make ems in the future, we ought to include them in our moral calculus, should we also be careful not to imagine people in bad situations? By doing so, we may be running a very low-level simulation of that person in our own minds, one that may or may not have some consciousness. If you don't believe that is the case now, how does it scale as we start augmenting our minds with ever more powerful computer interfaces? Is there ever a point where it becomes immoral just to think of something?
Is there ever a point where it becomes immoral just to think of something?
God kind of ran into the same problem. "What if... The Universe? Oh, whoops, intelligent life, can't just forget about that now, can I? What a mess... I guess I'd better plan some amazing future utility for those poor guys to balance all that shit out... Does it have to be an infinite future? With their little meat bodies, how is that going to work? Man, I am never going to think about things again. Hey, that's a catchy word for intelligent meat agents."
So, in short, if we ev...
This thread is intended to provide a space for 'crazy' ideas: ideas that spontaneously come to mind (and feel great), ideas you have long wanted to share but never found the place or time for, and ideas you think should be obvious and simple - yet nobody ever mentions them.
This thread itself is such an idea. Or rather, it is the tangent of such an idea, which I post below as a seed for this thread.
Rules for this thread:
If this should become a regular thread, I suggest the following: