Sounds pretty cool, definitely going to try it out some.
Oh, and by the way, you wrote "Inpsect" instead of "Inspect" at the end of page 27.
Working links on yudkowsky.net and acceleratingfuture.com:
Transhumanism as Simplified Humanism
The Meaning That Immortality Gives to Life
There are no "correct" or "incorrect" definitions, though, are there?
Well... Definitions that map badly onto the underlying reality are inconvenient at best and actively misleading at worst.
Besides, definitions do not exist in a vacuum. They can be evaluated by their fitness to a purpose which means that if you specify a context you can speak of correct and incorrect definitions.
That's true, though I think "optimal" would be a better word for that than "correct".
Maybe my definition of "supernatural" isn't the correct definition, but I often think of the word as describing certain things which we do not (currently) understand. And if we do eventually come to understand them, then we will need to augment our understanding of the natural laws... assuming this "supernatural" stuff actually exists.
I suppose a programmer could defy the laws he made for his virtual world when he intervenes from outside the system... But earthly programmers obey the natural physical laws when they mess with the hardware, which also runs based on those same laws. I understand this is what you mean by "constrained by natural laws".
There are no "correct" or "incorrect" definitions, though, are there? Definitions are subjective, it's only important that participants of a discussion can agree on one.
I took it. I was surprised how far off I was with Europe.
I know this is over a year old, but I still feel like this is worth pointing out:
If you can get the positive likelihood ratio as the meaning of a positive result, then you can get the negative likelihood ratio as the meaning of a negative result just by reworking the problem.
You weren't using the likelihood ratio, which is a single value (8.33... in this case). You were using the two numbers that produce the likelihood ratio.
But the same likelihood ratio would also occur if you had 8% and 0.96%, and then the "negative likelihood ratio" would be about 0.93 instead of 0.22.
You simply need three numbers. Two won't suffice.
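To make the point concrete, here is a minimal sketch (the function name is my own) computing both likelihood ratios from a test's sensitivity and false-positive rate. The pairs 80%/9.6% and 8%/0.96% are the two from the discussion above: they share the same positive likelihood ratio of 8.33... but give different negative likelihood ratios, so the positive ratio alone can't determine the negative one.

```python
def likelihood_ratios(sensitivity, false_positive_rate):
    """Return (LR+, LR-) for a binary test.

    LR+ = P(positive | condition) / P(positive | no condition)
    LR- = P(negative | condition) / P(negative | no condition)
    """
    lr_pos = sensitivity / false_positive_rate
    lr_neg = (1 - sensitivity) / (1 - false_positive_rate)
    return lr_pos, lr_neg

# Same LR+ (8.33...), different LR-:
print(likelihood_ratios(0.80, 0.096))   # LR- is about 0.22
print(likelihood_ratios(0.08, 0.0096))  # LR- is about 0.93
```

Two numbers that merely share a ratio pin down LR+ but not LR-; you need the sensitivity and false-positive rate themselves, i.e. three independent quantities in all.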
Why wait until someone wants the money? Shouldn't the AI try to send $5 to everyone, with a note attached reading "Here is a tribute; please don't kill a huge number of people", whether they ask for it or not?