My usual attitude is that our brains are not unified coherent structures, our minds still less so, and that just because I want X doesn't mean I don't also want Y where Y is incompatible with X.
So the search for some single thing in my brain that I can maximize in order to obtain full satisfaction of everything I want is basically doomed to failure, and the search for something analogous in my mind still more so, and the idea that the former might also be the latter strikes me as pure fantasy.
So I approach these sorts of thought experiments from two different perspectives. The first is "do I live in a world where this is possible?" to which my answer is "probably not." The second is "supposing I'm wrong, and this is possible, is it good?"
That's harder to answer, but if I take seriously the idea that everything I value turns out to be entirely about states of my brain that can be jointly maximized via good enough wireheading, then sure, in that world good enough wireheading is a fine thing and I endorse it.
Okay, I have a "stupid" question. Why is the longer binary sequence that represents a hypothesis less likely to be the 'true' data generator? I read the part below, but I don't get the example; can someone explain it in a different way?
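(A sketch of the weighting the question is about, assuming the standard Solomonoff-style setup where each hypothesis is a binary program and a program of length n gets prior weight 2^-n. The function name and example bit strings below are illustrative, not from the original text.)

```python
# Under the 2^-length prior, every extra bit in a program halves its
# prior probability -- so a longer encoding of a hypothesis starts out
# exponentially less likely before any data is seen.

def prior(program_bits: str) -> float:
    """Prior probability of a binary program under the 2^-length weighting."""
    return 2.0 ** -len(program_bits)

short_hypothesis = "0101"          # 4-bit program (illustrative)
long_hypothesis = "010110011010"   # 12-bit program (illustrative)

print(prior(short_hypothesis))   # 0.0625
print(prior(long_hypothesis))    # 0.000244140625

# The 4-bit program is 2^(12-4) = 256 times more probable a priori:
print(prior(short_hypothesis) / prior(long_hypothesis))  # 256.0
```

One intuition for the weighting: if you fed uniformly random bits to the machine, the chance of producing any particular n-bit program by accident is exactly 2^-n, so shorter programs are the ones random noise stumbles onto more often.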