Let p_n be the probability that I say n. Then the probability I escape on exactly the nth round is at most p_n/2, since the coin has to come up on the correct side and then I have to say n. In fact the probability is normally less than that, since there is a possibility that I have already escaped. So the probability I escape is at most the sum over n of p_n/2. Since the p_n form a probability distribution they sum to 1, so this is at most 1/2. And I escape with probability strictly less than 1/2 if any two p_n are nonzero. So the optimal strategies are precisely to always say the same number, and this can be any number.

I got the same answer, with essentially the same reasoning.

Assuming that each guess is a draw from the same probability distribution over positive integers, the expected number of correct guesses is 0.5 if I keep guessing forever (rather than leaving after 1 correct guess), regardless of what distribution I choose.

So the probability of getting at least one correct guess (which is the win condition) is capped at 0.5. And the only way to hit that maximum is by removing all the scenarios where I guess correctly more than once, so that all of the expected value comes from the scenarios where I guess correctly exactly once.
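Both claims can be checked with a quick Monte Carlo sketch. The model below is my own illustrative reading of the setup (each round n, a fresh i.i.d. guess is drawn from the chosen distribution, and a fair coin must also land the right way for a "hit"): the mean number of hits comes out to 0.5 for any distribution, but the chance of at least one hit only reaches 0.5 when all the probability sits on a single number.

```python
import random

def stats(p, trials=200_000):
    """p: dict mapping number -> probability of guessing it each round.
    Each round n, a fresh i.i.d. guess is drawn from p, and a fair coin
    must also land the right way (prob 1/2) for a 'hit' on that round.
    Returns (mean hits per trial, fraction of trials with >= 1 hit)."""
    nums = sorted(p)
    weights = [p[n] for n in nums]
    total_hits = wins = 0
    for _ in range(trials):
        hits = 0
        for n in nums:  # rounds outside the support can never produce a hit
            guess = random.choices(nums, weights)[0]
            if guess == n and random.random() < 0.5:
                hits += 1
        total_hits += hits
        wins += hits > 0
    return total_hits / trials, wins / trials
```

With everything on one number, `stats({7: 1.0})` returns roughly (0.5, 0.5); with a 50/50 split, `stats({1: 0.5, 2: 0.5})` still returns mean hits ≈ 0.5 but a win rate of only about 1 − (1 − 1/4)² = 0.4375, because some of the expected value is wasted on double hits.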

I'm looking for an anecdote about sunk costs. Two executives were discussing some bad business situation, one of them asks "look, suppose the board were to fire us and bring new execs in. What would those guys do?" "Get us out of the X business" "Then what's to stop us from leaving the room, coming back in, and doing exactly that?"

...but all my google-fu can't turn up the original source. Does it sound familiar to anyone here?

Grove says he and Moore were in his cubicle, "sitting around ... looking out the window, very sad." Then Grove asked Moore a question.

"What would happen if somebody took us over, got rid of us — what would the new guy do?" he said.

"Get out of the memory business," Moore answered.

Grove agreed. And he suggested that they be the ones to get Intel out of the memory business.

Three Amazons are the right answer. AFAIK, the biggest river there is approximately as large as the biggest river on the island of Crete. Which may be beautiful, but is quite lousy in cubic meters per second.

Where and how some people see three Amazons on Antarctica is a mystery to me. The amount of ice falling directly into the sea is quite pathetic as well.

But mostly, I love how arithmetic reigns supreme above all the sciences.

Wikipedia is another nice source of info. It claims that, during the past 20,000 years, the fastest increase in sea level was around 5 meters per century.

(The page on sea level rise mentions 3 meltwater pulses; clicking through it looks like Meltwater Pulse 1A is the one that researchers are the most confident about.)
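As a back-of-the-envelope check (my own arithmetic, using round figures of ~209,000 m³/s for the Amazon's mean discharge and ~3.6 × 10^8 km² for the ocean's surface area, neither of which comes from the comments above): three Amazons' worth of meltwater corresponds to roughly 5 meters of sea level rise per century, i.e., about the Meltwater Pulse 1A rate, not anything Antarctica is doing today.

```python
# Back-of-the-envelope: what would "three Amazons" of meltwater do to sea level?
# Round figures below are assumptions, not measurements from this thread.
AMAZON_DISCHARGE = 209_000   # m^3/s, approximate mean discharge of the Amazon
OCEAN_AREA = 3.6e14          # m^2, approximate surface area of the world ocean
SECONDS_PER_YEAR = 3.156e7

inflow = 3 * AMAZON_DISCHARGE * SECONDS_PER_YEAR  # m^3 of meltwater per year
rise_per_year = inflow / OCEAN_AREA               # meters of sea level per year
rise_per_century = 100 * rise_per_year
print(f"{rise_per_century:.1f} m per century")
```

This lands at about 5.5 m per century, which is why "three Amazons" only makes sense as a description of a deglaciation-scale meltwater pulse.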

This is a mean vs median or Mediocristan vs Extremistan issue. Most people cannot do lone wolf, but if you can do lone wolf, you will probably be much more successful than the average person.

Think of it like this. Say you wanted to become a great writer. You could go to university and plod through a major in English literature. That will reliably give you a middling good skill at writing. Or you could drop out and spend all your time reading sci-fi novels, watching anime, and writing fan fiction. Now most people who do that will end up terrible writers. But when someone like Eliezer does it, the results are spectacular.

Furthermore, because of the Power Law and the "Average is Over" idea, most of the impact will come from the standout successes.

Taking classes is a relatively Mediocristan-style way to work with others, but there are other ways that get you Extremistan-style upside.

One way is to find a close collaborator or two. Amos Tversky and Daniel Kahneman had an extremely close collaboration, doing most of their thinking in conversation as they were developing the field of heuristics and biases research (as described in *The Undoing Project*). It's standard startup advice to have more than one founder so that you'll have someone "to brainstorm with, to talk you out of stupid decisions, and to cheer you up when things go wrong." Etc.

Another way is to have a group of several people who are all heavily into the same thing. If the documentary *Dogtown and Z-Boys* is accurate, many innovations in skateboarding came from a group of teenage skateboarders who hung out together in Southern California in the 1970s. People who are trying to understand the state of the art in an intellectual field often put together a reading group to discuss the latest essays in that field and spin off their own ideas (e.g., John Stuart Mill talks about studying political economy and syllogistic logic in this way, which led to new ideas & publications). Etc.

As I said, I want a richer way to talk about probabilities, more complex than taking them as simple scalars. Do you think it's a bad idea?

That's right, I think it's a bad idea: it sounds like what you actually want is a richer way to talk about your beliefs about Coin 2, but you can do that using *standard* probability theory, without needing to invent a new field of math from scratch.

Suppose you think Coin 2 is biased and lands heads some unknown fraction *r* of the time. Your uncertainty about the parameter *r* will be represented by a probability distribution: say it's normally distributed with a mean of 0.5 and a standard deviation of 0.1. The point is, the probability of *r* having a particular value is a *different question* from the probability of getting heads on your first toss of Coin 2, which is still 0.5. You'd have to ask a different question than "What is the probability of heads on the first flip?" if you want the answer to distinguish the two coins. For example, the probability of getting exactly *k* heads in *n* flips is C(*n*, *k*)(0.5)^*k*(0.5)^(*n*−*k*) for Coin 1, but (I think?) ∫₀¹ (1/√(0.02π))*e*^−((*p*−0.5)^2/0.02) C(*n*, *k*)(*p*)^*k*(*p*)^(*n*−*k*) *dp* for Coin 2.
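That integral is easy to sanity-check numerically. Here is a sketch (my own code, using a simple midpoint rule and ignoring the normal prior's negligible mass outside [0, 1]) comparing the two coins:

```python
from math import comb, exp, pi, sqrt

def p_binom(n, k, p):
    """Probability of exactly k heads in n flips of a coin with bias p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_coin2(n, k, mu=0.5, sigma=0.1, steps=10_000):
    """Same probability for Coin 2: average the binomial over a normal
    prior on the bias (midpoint rule on [0, 1]; the prior's tiny mass
    outside [0, 1] is ignored)."""
    total = 0.0
    for i in range(steps):
        p = (i + 0.5) / steps
        density = exp(-(p - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
        total += density * p_binom(n, k, p) / steps
    return total
```

For a single flip the two coins agree: `p_binom(1, 1, 0.5)` and `p_coin2(1, 1)` are both 0.5. But for 10 heads in 10 flips, Coin 1 gives about 0.00098 while Coin 2 gives about 0.0039, roughly four times as likely, because the uncertainty over *r* fattens the tails.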

Does St. Bayes frown upon it?

St. Cox probably does.

Suppose you think Coin 2 is biased and lands heads some unknown fraction r of the time. Your uncertainty about the parameter r will be represented by a probability distribution: say it's normally distributed with a mean of 0.5 and a standard deviation of 0.1. The point is, the probability of r having a particular value is a different question from the probability of getting heads on your first toss of Coin 2, which is still 0.5.

A standard approach is to use the beta distribution to represent your uncertainty over the value of r.
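A minimal sketch of how the beta approach works (illustrative numbers of my own, not from the comment): with a Beta(a, b) prior on r, each observed head increments a and each tail increments b, and the predictive probability of heads on the next flip is just the posterior mean a/(a + b).

```python
# Beta-distribution bookkeeping for uncertainty about a coin's bias r.
# Prior Beta(a, b); conjugacy means updating is just counting flips.
def update(a, b, heads, tails):
    """Posterior Beta parameters after observing the given flip counts."""
    return a + heads, b + tails

def predictive_heads(a, b):
    """Probability of heads on the next flip, averaging over uncertainty in r."""
    return a / (a + b)
```

Starting from a uniform prior Beta(1, 1), seeing 7 heads and 3 tails gives Beta(8, 4), so the next flip is heads with probability 8/12 ≈ 0.667; a much more confident prior like Beta(50, 50) moves far less on the same data (57/110 ≈ 0.518). That difference in responsiveness is exactly the "richer" information a bare scalar of 0.5 leaves out.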

I'd like to see some changes to the CFAR-related questions; I've sent a PM with details.

On #2, I've seen it claimed -- but have no idea how good the science behind it is -- that better than visualizing positive or negative outcomes alone is doing *both* and paying attention to the contrast. "If I do X, then the result will look like Y. If I don't do X, the result will look like Z. Wow, Y is much better than Z: better get on with doing X".

The keyword for that research is *mental contrasting*. It was previously discussed on LW here.

My impression is that the quality of the science is relatively good, compared to other psychology research that was done in 2000-2012. But as far as I know it has not yet been tested with the improved research methods that have come out of the replication crisis (e.g., I don't know of any large sample size, preregistered studies of mental contrasting).

Double Crux was largely a re-invention of Street Epistemology.

You can find people to practice it with at the Street Epistemology Facebook group. They're having role-play sessions, making how-to videos, etc.

streetepistemology.com

(This is Dan from CFAR)

This is the first I've heard of Street Epistemology, or Boghossian's book *A Manual for Creating Atheists* where it was apparently introduced. A key difference between it and Double Crux:

From their guide, it looks like Street Epistemology is intended to be an asymmetric game. Only player A knows about Street Epistemology, player A chooses to start the conversation about a topic where player A is confident that they are right and player B is wrong, and the conversation is about the reasons for player B's beliefs. Player A attempts to change player B's mind by improving player B's epistemology. Player A needn't talk about their own beliefs; there is a short subsection in the guide which addresses this topic, beginning "If asked about your own beliefs you should be prepared to answer." The guide describes Street Epistemology as being "most useful for extraordinary claims, such as miracles and supernatural phenomena."

Double Crux is intended to be a symmetric game, where both players know what kind of conversation they're getting into and both players put their beliefs (and the reasons for their beliefs) on the table in an attempt to improve their models. The object of the game (as its name suggests) is to find a crux that is shared by both players, where either of them would change their mind about the original disagreement if they changed their mind about the crucial point. I previously described Double Crux as being most useful for tricky, important-to-you questions where "digging into your own thinking and the other person's thinking on the topic is one of the more promising options available for making progress towards figuring out something that you care about."

Do you know how many people who participated in the CFAR Focusing workshop got Focusing enough to feel a felt shift?

About 60%.

More specifically: At the February workshop, 65% of participants filled out the optional data collection handout at the end of the hour-long Focusing class. Of the participants who filled it out, 60% circled 6 or higher in response to the question *Did you experience a "felt shift"?* (0 = not at all, 10 = yes, definitely).

(This is Dan from CFAR.)


Does anyone have any tips or strategies for making better social skills habitual? I'm trying to be more friendly, compliment people, avoid outright criticism, and talk more about other people than myself. I can do these things for a while, but I don't feel them becoming habitual as I would like. Being friendly to people I do not know well is particularly hard; when I'm tired I want to escape interaction with everyone except close friends and family.

This is not an easy-to-implement tip, but my suggestion is to try to get into a mental space where the social things that you're trying to do are easy / come naturally / are the things that you want to do in the moment.

A person who is naturally friendly, non-critical, and interested in hearing about you probably did not get that way just by practicing each of those behaviors as habits; they have some deeper motivation/perspective/emotion/something that those behaviors naturally follow from. Try to get in touch with that deeper thing.

One thing that helps with this is noticing when you've had the experience of being in a mental space where the things come more naturally (even if only briefly, or only marginally more naturally). Then you can try to get back into that mental space, and take it further.

Another thing that can help is putting yourself in different social situations, including ones that you're liable to get swept up in (that is, ones that are likely to put you in a different mental space from where you usually are). That can be a quicker way to get some experience being in different modes. Reading books (and watching videos, etc.) can also help, especially if you do things like these as you read them.