solipsist

Which (possibly all) of the VNM axioms do you think are not appropriate as part of a formulation of rational behavior?

I think the Peano natural numbers are a reasonable model for the number of steins I own (with the possible exception that if my steins fill up the universe, a successor number of steins might not exist). But I don't think the Peano axioms are a good model for how much beer I drink. It is not the case that every quantity of beer can be expressed as a successor of 0 beers, so beer does not satisfy the axiom of induction.
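
A minimal illustration in Python (the Zero/Succ encoding and the stein and beer quantities are my own illustrative choices, not anything from the axioms themselves): every count of steins is some finite stack of successors applied to zero, but no successor expression evaluates to half a litre of beer.

    # Peano-style naturals: a count is either Zero or Succ(another count).
    class Zero:
        def to_int(self):
            return 0

    class Succ:
        def __init__(self, pred):
            self.pred = pred
        def to_int(self):
            return 1 + self.pred.to_int()

    def count(n):
        """Build the Peano numeral for a non-negative integer n."""
        num = Zero()
        for _ in range(n):
            num = Succ(num)
        return num

    steins = count(7)        # 7 steins: Succ applied 7 times to Zero
    print(steins.to_int())   # 7

    beer_litres = 0.5        # no finite stack Succ(...Succ(Zero)...) equals 0.5,
                             # so induction over successors never reaches it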

I think the ZFC axioms are a poor model of impressionist paintings. For example, it is not the case that for every pair of impressionist paintings x and y, there exists an impressionist painting that contains both x and y. Therefore impressionist paintings violate the axiom of pairing.

Can you by chance pin down your disagreement to a particular axiom? You're modus tollensing where I expected you would modus ponens.

I didn't follow everything, but does this attempt to address self-fulfilling prophecies? Assume the oracle has a good track record and releases its information publicly. If I ask it "What are the chances Russia and the US will engage in nuclear war in the next 6 months?", answers of "0.001" and "0.8" are probably both accurate.

What sorts of output strings are you missing?

Calculating Kolmogorov complexities is hard because it is hard to differentiate between programs that run a long time and halt and programs that run a long time and never halt.

If God gave you a 1.01 MB text file and told you "This program computes BB(1000000)", then you could easily write a program to find the Kolmogorov complexity of any string shorter than 1 MB.

    from collections import defaultdict

    kolmogorov_map = defaultdict(lambda: float('inf'))  # default: no known program for this output

    for p in all_strings_of_length_less_than(1000000):
        o, halted = run(p, max_steps=BB_1000000)  # run p for at most BB(1000000) steps; o is its output
        if halted and kolmogorov_map[o] > len(p):
            kolmogorov_map[o] = len(p)  # found a smaller program that outputs o
        # else: p never halts and has no output
Replace BB(1000000) with a smaller number, say A(Graham's number, Graham's number), and this calculator works for all programs which halt in less than A(Graham's number, Graham's number) steps. That includes pretty much every program I care about! For instance, it includes every program which could run under known physics in the age of the universe.
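
For what it's worth, once the loop above finishes, reading off a (bounded) complexity is just a dictionary lookup; the string below is an illustrative example, not output from any real run:

    # float('inf') means no program under 1000000 characters produced
    # the string within the step bound.
    s = "hello world"          # illustrative output string
    print(kolmogorov_map[s])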

Eh, don't take it personally. I'm guessing commenters are implicitly taking the title question as a challenge and are pouncing to poke holes in your argument. I thought your essay was well written and thought-provoking. Keep posting!

Don't know, not the original author. What do you think the chances are that an email on the third page of your inbox will ever get a reply? Inbox purgatory seems to me like a way to give up on something without having to admit it to yourself.

If my inbox has more than 40 or 50 items in it I feel demoralized and find it harder to work through newer items, so the easiest way for me to stay at steady-state is to keep my inbox at zero or close to it.

Counterpoint: I've kept to an empty inbox for many years, but know people with ever-growing inboxes whom I consider more organized and responsive. I've never declared email bankruptcy during my professional life, so I don't know what the consequences would be.

And nothing in here says anything about how to deal with that situation.

I read the advice as:

If you still have unresolved emails from 2015 in your inbox, then keeping emails in your inbox isn't causing them to get resolved. Accept that, get a clean slate, and move on.

Make a folder called "old inbox" and put all your old emails there. Now you have an empty inbox! The costs of putting your old emails out of sight are less than the benefits of keeping an empty inbox going forward.

HLS students of any skin color have high IQs as measured by standardized tests. The school's 25th percentile LSAT score is 170, which is the 97.5th percentile among college graduates who take the LSAT. 44% of HLS students are people of color.

The book to read is Reasons and Persons by Derek Parfit.

If you love your simulation as you love yourself, they will love you as they love themselves (and if you don't, they won't). You can choose to have enemies or allies with your own actions.

You and a thousand simulations of you play a game where pressing a button gives the presser $500 but takes $1 from each of the other players. Do you press the button?
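
For concreteness, a back-of-the-envelope payoff calculation, under the assumption (mine, not stated above) that all 1001 players decide identically and the dollar transfers simply add up:

    players = 1001            # you plus 1000 simulations
    gain_per_press = 500      # the presser gains $500
    loss_per_press = 1        # every other player loses $1 per press

    # If all 1001 players press, each gains $500 once and loses $1
    # for each of the other 1000 presses.
    payoff_all_press = gain_per_press - loss_per_press * (players - 1)   # 500 - 1000 = -500
    payoff_none_press = 0
    print(payoff_all_press, payoff_none_press)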
