I'm sure this has a name, but I can't remember it. So I have given it a new name. The Mountaineer's Fallacy.

The Mountaineer's Fallacy is the suggestion that climbing Mount Everest is a good start on a trip to the moon.

In one sense, you are making progress on a key metric: distance from the moon. But in another, more accurate sense, you are wasting a lot of time and effort on something that has no chance of helping you get to the moon.

11 comments:

Mountaineer's Fallacy Fallacy: working ineffectually on the real problem when the right move is to work on adjacent problems, which might be pointless in themselves, to chip away at the edges until the key components of solving the real problem become visible and tractable.

But yeah the OG MF is common.

Can you give an example of this happening in the real world? I don't quite see what it applies to.

Narrow AI and General AI may be the sort of thing the author has in mind.

Has anyone pointed out to Chollet yet that scaling up bottle rockets is a great way to go to space?


I love this.

"In one sense, you are making progress on a key metric: distance from the moon."

This seems like it'd be less than 1%. (Or at the very least, less than 10%.)
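For concreteness, a quick back-of-the-envelope check (a minimal sketch; the round figures for Everest's height and the Earth-Moon distance are my own assumptions, not from the post):

```python
# Rough arithmetic: what fraction of the Earth-Moon distance does
# climbing Everest actually cover? (Assumed round figures.)
EVEREST_HEIGHT_KM = 8.8            # summit elevation above sea level
EARTH_MOON_DISTANCE_KM = 384_400   # mean Earth-Moon distance

fraction = EVEREST_HEIGHT_KM / EARTH_MOON_DISTANCE_KM
print(f"{fraction:.6%}")  # ~0.002289% -- far below even the 1% guess
```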

Kind of an extreme version of getting stuck at a local maximum?

Aiming for a local maximum. (The best place to build a rocket launch site probably isn't Mount Everest, though I could be wrong about that.)

Aside from the logistical issues of getting the rocket up there, the top of Everest is actually a great place to launch from: slightly less gravity, much less air resistance, and reasonably close to the Equator (27°, about the same latitude as Florida).
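To put a rough number on the latitude point: the eastward speed a launch site gets for free from Earth's rotation is 2*pi*R*cos(latitude) per sidereal day. A minimal sketch, assuming a mean Earth radius of 6,371 km and Cape Canaveral at about 28.5° N (both figures are my assumptions, not from the comment):

```python
import math

# Eastward surface speed from Earth's rotation at a given latitude.
R_EARTH_M = 6_371_000      # mean Earth radius, metres (assumed)
SIDEREAL_DAY_S = 86_164    # one full rotation, seconds

def rotational_boost(lat_deg: float) -> float:
    """Free eastward velocity (m/s) from Earth's spin at this latitude."""
    circumference = 2 * math.pi * R_EARTH_M * math.cos(math.radians(lat_deg))
    return circumference / SIDEREAL_DAY_S

print(f"Everest (~27 N):          {rotational_boost(27.0):.0f} m/s")  # ~414 m/s
print(f"Cape Canaveral (~28.5 N): {rotational_boost(28.5):.0f} m/s")  # ~408 m/s
```

So on the rotation-boost front the two sites really are comparable; the difference is only a few metres per second.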

The idea that climbing Everest has something to do with a trip to the moon is in many ways antithetical to the spirit of mountaineering, so I don't think this name choice is a fortunate one.