Comments

I guess I didn't make myself at all clear on that point: I subscribe to both of the above!

Another way to avoid the paradox is to care about other people's satisfaction (it's more complicated than that, but that's not the point) from their point of view, which encompasses their frame of reference.

Another way, perhaps, is to restate "implementing improvements as soon as possible" as "maximizing total goodness in (the future of) the universe." In particular, if an improvement could only be implemented once, but it would be twice as effective tomorrow instead of today, do it tomorrow.

The probability distribution part is better, though I still don't see how software that uses randomness doesn't fall under that (likewise: compression, image recognition, signal processing, and decision-making algorithms).

Any software that uses randomness requires you to meet a probability distribution over its inputs, namely that the "random" input actually needs to be random. I assume that you're not claiming this breaks modularity, since you advocate the use of randomness in algorithms. Why not?
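
To make that concrete, here's a minimal sketch of my own (not anything from the post being discussed): a randomized quicksort whose expected-time guarantee is exactly a distributional requirement on its "random" input.

```python
import random

def randomized_quicksort(xs, rng=None):
    """Expected O(n log n) only if the pivot choices are genuinely uniform.

    The distributional requirement is part of the interface: feed it a
    biased "random" source and the code still runs, but the performance
    guarantee quietly disappears (e.g. worst-case O(n^2) on sorted input).
    """
    rng = rng or random.Random()
    if len(xs) <= 1:
        return list(xs)
    pivot = xs[rng.randrange(len(xs))]
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return randomized_quicksort(less, rng) + equal + randomized_quicksort(greater, rng)
```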

(idle bemusement)

Does an optimal superintelligence feel regret? It knows it couldn't have made a better choice given its past information about the environment. How is regret useful in that case?

So you're differentiating between properties, where the probability of [0 1 2 3] is 1-ɛ while >3 is ɛ, and probability distributions, where the probability of 0 is 0.01, of 1 is 0.003, etc.? Got it. The only algorithms I can think of that require the latter are those that require uniformly random input. I don't think those violate modularity, though, as any part of the program that interfaces with that module must provide independently random input (which would be the straightforward way to meet that requirement with an arbitrary distribution).
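
A small sketch of what I mean by the requirement living at the interface (a hypothetical example of my own: a Monte Carlo module whose contract is "i.i.d. uniform samples"):

```python
import random

def estimate_pi(uniform_points):
    """Monte Carlo estimate of pi.

    Interface contract: `uniform_points` are i.i.d. draws from the unit
    square. The requirement is stated at the module boundary, so the
    module stays modular; the caller just has to honor the contract.
    """
    inside = sum(1 for x, y in uniform_points if x * x + y * y <= 1.0)
    return 4.0 * inside / len(uniform_points)

# The straightforward way for the calling code to meet the requirement:
rng = random.Random(42)
points = [(rng.random(), rng.random()) for _ in range(100_000)]
print(estimate_pi(points))  # roughly 3.14
```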

There's a difference between requiring and being optimized for, though, and there are lots of algorithms that are optimized for particular inputs. Sorting algorithms are an excellent example: if most of your lists are already almost sorted, there are algorithms that are cheaper on average but might take a long time on a few rare orderings.
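
A minimal sketch of that sorting example, assuming insertion sort as the "cheap on nearly-sorted input" algorithm:

```python
import timeit

def insertion_sort(xs):
    """Close to O(n) on nearly-sorted input, but a rare ordering such as
    a reversed list forces O(n^2) comparisons and swaps."""
    xs = list(xs)
    for i in range(1, len(xs)):
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return xs

n = 5_000
nearly_sorted = list(range(n))
nearly_sorted[10], nearly_sorted[-10] = nearly_sorted[-10], nearly_sorted[10]
reversed_list = list(range(n, 0, -1))

print(timeit.timeit(lambda: insertion_sort(nearly_sorted), number=3))  # fast
print(timeit.timeit(lambda: insertion_sort(reversed_list), number=3))  # far slower
```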

Requiring that the inputs to a piece of software follow some probability distribution is the opposite of being modular.

What? There is very little software that doesn't require its inputs to follow some probability distribution. When provided with input that doesn't match that (often very narrow) distribution, programs will throw it away, give up, or have problems.
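
For instance (a toy example of my own), even a trivial input parser encodes a narrow distribution over acceptable inputs and gives up on anything outside it:

```python
def parse_age(field: str) -> int:
    """Accepts only the narrow distribution of well-formed inputs:
    a decimal string in a plausible range. Everything else is rejected."""
    try:
        age = int(field)
    except ValueError:
        raise ValueError(f"not a number: {field!r}")
    if not 0 <= age <= 150:
        raise ValueError(f"implausible age: {age}")
    return age

print(parse_age("42"))     # input from the expected distribution
try:
    parse_age("forty")     # outside the distribution: the program gives up
except ValueError as err:
    print("rejected:", err)
```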

You seem to have put a lot more thought into your other points, could you expand upon this a little more?

The king was proposing that Orin bet 1kc, of which they currently have only 800c, in order to receive 20kc (which is twenty-five times their net worth). The 200c debt was what Orin would be reduced to if they were wrong.
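
Spelling out the arithmetic, using only the numbers from the story:

```python
net_worth = 800      # what Orin actually has, in c
stake = 1_000        # the 1kc bet the king proposed
payout = 20_000      # the 20kc prize

print(payout / net_worth)   # 25.0 -> twenty-five times Orin's net worth
print(net_worth - stake)    # -200 -> the 200c debt if Orin is wrong
```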

In such cases I'll say, "Oh! Interesting... how does that work exactly?" It seems to work out alright, and I would guess that other methods of asking for more information without implying that their statement is false are equally effective.

An addendum to [1]: Social Security tax in the US is capped, with the cutoff around $105k of individual income, so there may be a local dip in the overall percentage there, where the increase in income tax doesn't make up for the 11% that goes to Social Security below that point.
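
A rough sketch of that dip, using only the figures in this comment (an 11% rate and a roughly $105k cap; these are assumptions for illustration, not official rates):

```python
# Figures taken from this comment, not official rates: an ~11% payroll
# contribution and a cap near $105,000 of individual income.
CAP = 105_000
PAYROLL_RATE = 0.11

def payroll_tax(income):
    """Payroll tax owed on `income`, with the cap applied."""
    return PAYROLL_RATE * min(income, CAP)

# Marginal payroll rate just below vs. just above the cap:
step = 1_000
below = payroll_tax(CAP) - payroll_tax(CAP - step)   # ~110 -> 11% marginal
above = payroll_tax(CAP + step) - payroll_tax(CAP)   # 0    -> 0% marginal
print(below / step, above / step)
```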
