
Comment author: katydee 26 May 2017 02:29:19AM 3 points [-]

With respect to power dynamics points one and two, there is another person known to the community who is perhaps more qualified and is already running something similar in several respects - Geoff Anders of Leverage Research. So I don't think this is precisely the only group attempting this sort of thing, though I still find it novel and interesting.

(disclaimer: I was at the test weekend for this house and am likely to participate)

Comment author: Vaniver 19 March 2017 03:13:44AM 4 points [-]

I believe this is what happened with Godric's Hollow--a four-unit building turned, one by one, into a four-unit rationalist building.

Comment author: katydee 24 March 2017 01:31:27AM 1 point [-]

Something like this also happened with Event Horizon, though the metamorphosis is not yet complete...

Comment author: katydee 31 January 2017 02:24:50AM 2 points [-]

Once every few days or so.

Comment author: Connor_Flexman 26 January 2017 02:49:51AM 1 point [-]

As you say, the inner circle certainly may have reason to do non-obvious things. But while withholding information from people can occasionally be politically helpful, it usually seems best for the company to have the employees on the same page and working toward a goal they see reason for. Because of this, I would usually assume that seemingly poor decisions in upper management are the result of actual incompetence or a deceitful actor in the information flow on the way down.

Comment author: katydee 26 January 2017 07:49:01AM *  1 point [-]

Broadly agreed - this is one of the main reasons I consider internal transparency to be so important in building effective organizations. In some cases, secrets must exist - but when they do, their existence should itself be common knowledge unless even that must be secret.

In other words, it is usually best to tell your teammates the true reason for something, and failing that you should ideally be able to tell them that you can't tell them. Giving fake reasons is poisonous.

Comment author: JenniferRM 25 January 2017 07:42:50AM 1 point [-]

Elon Musk is sort of obsessed with thinking about things "from first principles" rather than "by analogy". Arguably, this is a generic solution to the paradigm selection problem.

Comment author: katydee 25 January 2017 08:45:14PM 0 points [-]

In some cases it can be - and I will discuss this further in a later post. However, there are many situations where the problems you're encountering are cleanly solved by existing paradigms, and looking at things from first principles leads only to reinventing the wheel. For instance, the appropriate paradigm for running a McDonald's franchise is extremely well understood, and there is little need (or room) for innovation in such a context.

Strategic Thinking: Paradigm Selection

8 katydee 24 January 2017 06:24AM

Perhaps the most important concept in strategy is operating within the right paradigm. It is extremely important to orient toward the right style or doctrine before you begin implementing your plan - that could be "no known doctrine, we'll have to improvise," but if so, you need to know that! If you choose the wrong basic procedure or style, you will end up refining a plan or method that ultimately can't get you where you want to go, and you will likely find it difficult to escape.

This is one of the Big Deep Concepts that seem to crop up all over the place. A few examples:

  • In software development, one form of this error is known as "premature optimization," where you focus on optimizing existing processes before you consider whether those processes are really what the final version of your system needs. If those processes end up getting cut, you've wasted a bunch of time; if you end up avoiding "wasting work" by keeping these processes, the sunk cost fallacy may have blocked you from implementing a superior architecture.
  • In the military, a common mistake of this type leads to "fighting the last war" - the tendency of military planners and weapons designers to create strategies and weapon systems that would be optimal for fighting a repeat of the previous big war, only to find that paradigm shifts have rendered these methods obsolete. For instance, many tanks used early in World War II had been designed based on the trench warfare conditions of World War I and proved extremely ineffective in the more mobile style of warfare that actually developed. 
  • In competitive gaming, this explains what David Sirlin calls "scrubs" - players who play by their own made-up rules rather than the true ones, and thus find themselves unprepared to play against people without the same constraints. It isn't that the scrub is a fundamentally bad or incompetent player - it's just that they've chosen the wrong paradigm, one that greatly limits their ability when they come into contact with the real world.
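The "premature optimization" failure mode above can be made concrete with a small, hypothetical Python sketch (the function names and scenario are illustrative, not from the original post): a straightforward first implementation is usually the right starting paradigm, while a hand-tuned version written before profiling shows any need just adds maintenance burden - and sunk cost - if the design later changes.

```python
from collections import Counter

def word_counts_simple(text):
    """Straightforward first version: clear, short, easy to change
    if the surrounding paradigm shifts (e.g. to streaming input)."""
    return Counter(text.lower().split())

def word_counts_optimized(text):
    """Hypothetical hand-tuned variant, written before any profiling
    showed a bottleneck. It caches the dict.get lookup to shave off
    attribute resolution - a micro-optimization that produces the same
    result with more code to maintain."""
    counts = {}
    get = counts.get  # cache the bound method to avoid repeated lookup
    for word in text.lower().split():
        counts[word] = get(word, 0) + 1
    return counts

sample = "the quick brown fox jumps over the lazy dog the fox"
# Both versions agree; the extra effort in the second bought nothing yet.
assert dict(word_counts_simple(sample)) == word_counts_optimized(sample)
```

The design lesson matches the bullet: write the simple version first, profile, and only then optimize the parts that survive contact with the real requirements.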

This same limitation is present in almost every field that I have seen, and considering it is critical. Before you begin investing heavily in a project, you should ask yourself whether this is really the right paradigm to accomplish your goals. Overinvesting in the wrong paradigm has a doubly pernicious effect - not only are your immediate efforts not as effective as they could be, but it also renders you especially vulnerable to the sunk cost fallacy. Keep in mind that even those who are aware of the sunk cost fallacy are not immune to it!

Therefore, when making big decisions, don't just jump into the first paradigm that presents itself, or even the one that seems to make the most sense on initial reflection. Instead, really, truly consider whether this approach is the best one to get you what you want. Look at the goal that you're aiming for, and consider whether there are other ways to achieve it that might be more effective, less expensive, or both.

Here are some sample situations that can be considered paradigm-selection problems:

  • Do you really need to go and get a CS degree in order to become a computer programmer, or will a bootcamp get you started faster and cheaper?
  • Does your organization's restructuring plan really hit the core problems, or is it merely addressing the most obvious surface-level issues?
  • Will aircraft carrier-centric naval tactics be effective in a future large-scale conventional war, or is the aircraft carrier the modern equivalent of the battleship in World War II?

I don't necessarily know the answers to all these questions - note that only one is even framed as a clear choice between two options, and there are obviously other options available even in that case - but I do know that they're questions worth asking! When it comes time to make big decisions, evaluating what paradigms are available and whether the one you've chosen is the right one for the job can be critical.

[Link] Did social desirability effects mask Trump's true support?

4 katydee 10 November 2016 09:56AM
Comment author: The_Lion 24 January 2016 05:41:56PM 18 points [-]

"Gay pride" was, I take it, the granddaddy of them all.

I'm pretty sure that was "black pride". All the successful black people you mentioned are basically dancing bears.

Gay pride is somewhat less so, i.e., they do have Alan Turing and a few artists, but still not very impressive. Heck, you had to pad out the list with "Tim Cook, CEO of the world's most successful company", even though it is pretty clearly not his efforts that led to this state of affairs.

This has much the same ultimately pathetic feel of going to places like Turkey and seeing every mathematics institution named after Cahit Arf, the one decent mathematician the country has ever produced, even though he was a mediocre mathematician by world standards.

Comment author: katydee 25 January 2016 09:56:54PM *  6 points [-]

This is one of the worst comments I've seen on LessWrong and I think the fact that this is being upvoted is disgraceful. (Note: this reply refers to a comment that has since been deleted.)

In response to Why CFAR's Mission?
Comment author: coyotespike 01 January 2016 03:53:16PM 7 points [-]

This is an excellent post, which I'll return to in future. I particularly like the note about the convergence between Superforecasting, Feynman, Munger, LW-style rationality, and CFAR - here's a long list of Munger quotations (collected by someone else) which exemplifies some of this convergence. http://25iq.com/quotations/charlie-munger/

Comment author: katydee 01 January 2016 07:13:16PM 0 points [-]

Excellent link.

Comment author: katydee 07 September 2015 08:33:31PM 3 points [-]

This post seems better suited for the Discussion section.
