Comment author: CBHacking 20 March 2017 08:48:17AM *  1 point [-]

Moderately true of Seattle as well (two group houses, plus some people living as housemates or whatever but not explicitly a Rationalist Group House). I'm not sure if our community is big enough for something like this, but I love the idea; it would be a point in favor of moving to the bay area if there were one there (that I had a chance to move into) but not one here.

Comment author: CBHacking 20 March 2017 08:40:45AM 2 points [-]

Hell, it's not even just the bay area; Seattle has two explicitly-rationalist-group-houses and plenty of other people who live in more "normal" situations but with other rationalists (I found my current flatmate, when my old one moved out, through the community). Certainly the bay area rationalist community is large and this sort of living situation is far from universal, but I've certainly heard of several even though I've never actually visited any.

Comment author: username2 20 March 2017 07:26:43AM 1 point [-]

Nope, not in my social circle. I know quite a few self-described rationalists. Most are not in shared living setups, although 2 are. Those two have regular roommate situations where the roommates were not selected for being rationalist.

Frankly there are all sorts of alarm bells going off about the idea of seeking out shared living situations where everyone is from the same rationalist community. Smells of cultism... I on the other hand highly value interacting with people of different backgrounds and base belief systems.

Comment author: Artaxerxes 20 March 2017 05:31:18AM 1 point [-]

The segment on superintelligence starts at 45:00; it's a rerun of a podcast from 2 years ago. Musk says it's a concern. Bill Nye, commenting on Musk's comments afterwards, says that we would just unplug it and is dismissive. Neil is similarly skeptical and half-heartedly plays devil's advocate but clearly agrees with Nye.

In response to comment by math54 on Am I Really an X?
Comment author: gjm 20 March 2017 03:59:38AM 0 points [-]

Humanity did reasonably well for millennia without the need for such a word.

And also without (to pick a few kinda-random examples) "topologist", "neoreactionary", "rationalist". Should we throw those words away too?

Comment author: jaibot 20 March 2017 03:57:42AM 2 points [-]

Seattle is thinking about putting together a Community Center. This would basically be a house that we collectively rent out, with maybe one or two people living there at below-market rates to take care of the space and do upkeep. Here's a post on Tumblr outlining the thinking so far by one of the people spearheading the effort: http://fermatas-theorem.tumblr.com/post/158612649028/what-if-seattle-earationality-got-a-community

Comment author: komponisto 20 March 2017 03:44:00AM 1 point [-]

The first link is MM saying what EY would later say in No Safe Defense, Not Even Science.

Comment author: 27chaos 20 March 2017 03:36:32AM 0 points [-]

Disasters and miracles follow similar rules. Charles Babbage, in his Ninth Bridgewater Treatise of 1837, considered the nature of miracles (which, as a computer scientist, he viewed as pre-determined but rarely-called subroutines) and urged us "to look upon miracles not as deviations from the laws assigned by the Almighty for the government of matter and of mind; but as the exact fulfilment of much more extensive laws than those we suppose to exist." It's that question of characteristic scale.

George Dyson, comment on Taleb's "The Fourth Quadrant".

Comment author: jsteinhardt 20 March 2017 02:31:47AM 3 points [-]

Any attempt to enforce rationalists moving in is illegal.

Is this really true? Based on my experience (not any legal experience, just seeing what people generally do that is considered fine) I think in the Bay Area the following are all okay:

  • Only listing a house to your friends / social circle.
  • Interviewing people who want to live with you and deciding based on how much you like them.

The following are not okay:

  • Having a rule against pets that doesn't have an exception for seeing-eye dogs.
  • Explicitly deciding not to take someone as a house-mate only on the basis of some protected trait like race, etc. (but gender seems to be fine?).
Comment author: gworley 20 March 2017 02:19:54AM 0 points [-]

Basically agree, and it's nearly the same point I was trying to get at, though with less supposing that utility functions are definitely the right thing. I'd leave open more possibility that we're wrong about utility functions always being the best subclass of preference relations; but even if we're wrong about that, our solutions must at least work for utility functions, since they are a smaller set of all the possible ways something could decide.

Comment author: mayleaf 20 March 2017 01:29:29AM *  2 points [-]

Yeah, a lot of Bay Area rationalistsphere people currently live in group houses. I have the impression that this is true of NYC rationalistsphere as well, but less true in other cities.

And yeah, I suspect that a lot of confusion arises from eliding "people who read LessWrong and other rationalist blogs and identify as rationalists" and "a specific social circle of Bay Area and NYC inhabitants who know each other IRL (even if they originally met through online rationalist communities)." The latter group does in fact tend to live in group housing; I have very little idea about the former.

Comment author: RomeoStevens 20 March 2017 01:27:30AM *  4 points [-]

Additionally: if only one member seems enthusiastic about thinking/planning/enforcing this kind of stuff, that is a very bad sign. In such a situation, when that person burns out, the community slowly dies.

Comment author: mayleaf 20 March 2017 01:19:50AM *  3 points [-]

As another Bay Arean rationalist, I can confirm that a large part of my rationalist social circle lives in group houses in the Berkeley/Oakland area. I'm a bit surprised that you haven't encountered this as well?

Generally the group houses are 3-5 rationalists in their twenties or early thirties living together -- sharing common spaces, but having private bedrooms (or bedrooms shared only with a romantic partner.)

I suspect that the prevalence of group housing is in part due to Bay Area rent being really high (making it more attractive to share an apartment/house as opposed to rent a one-bedroom on one's own). I also have the vague impression that current 20-30-year-olds in the US are more likely to live in group housing than has been true in previous generations? (Many of my non-rationalist friends in this age group also live in group houses.)

Comment author: mayleaf 20 March 2017 01:11:43AM 0 points [-]

I am interested.

Comment author: Tem42 20 March 2017 01:02:54AM 3 points [-]

Very interested, but not willing to move more than 2-3 hours away; am nowhere near CA.

Comment author: OldManNick 19 March 2017 11:32:20PM *  0 points [-]

```
html2text $URL | pandoc -t markdown
```

where html2text and pandoc can be found on GitHub.

Comment author: Good_Burning_Plastic 19 March 2017 09:16:26PM *  0 points [-]

I believe he posted on OB when EY was posting there.

Yes but it's not like there was a lot of love lost between MM and EY (or RH).

Comment author: math50 19 March 2017 09:08:06PM 1 point [-]

Which keep getting deleted by the mods for no good reason.

Comment author: TheAncientGeek 19 March 2017 08:47:14PM 0 points [-]

2% of LWers called themselves neoreactionary,

That's compatible with a lot of neoreactionaries being lesswrongers.

To the best of my knowledge, Moldbug didn't post on LW.

I believe he posted on OB when EY was posting there.

Comment author: Alicorn 19 March 2017 07:55:52PM 1 point [-]

I considered adding [citation needed] after that sentence but thought it was probably pretty obvious. I guess not everybody goes to rationalist group house parties all the time.

Comment author: malcolmocean 19 March 2017 07:39:52PM 2 points [-]

Am very interested in this in various ways, whether as a participant or as a consultant to the challenge of how to effectively live together (something I've been studying extensively for the last few years). Not actually currently able to move to the states easily, so there's that.

Comment author: Benquo 19 March 2017 06:54:31PM *  2 points [-]

Good ideas do not need lots of lies told about them in order to gain public acceptance. I was first made aware of this during an accounting class. We were discussing the subject of accounting for stock options at technology companies. There was a live debate on this subject at the time. One side (mainly technology companies and their lobbyists) held that stock option grants should not be treated as an expense on public policy grounds; treating them as an expense would discourage companies from granting them, and stock options were a vital compensation tool that incentivised performance, rewarded dynamism and innovation and created vast amounts of value for America and the world. The other side (mainly people like Warren Buffett) held that stock options looked awfully like a massive blag carried out by management at the expense of shareholders, and that the proper place to record such blags was the P&L account.

Our lecturer, in summing up the debate, made the not unreasonable point that if stock options really were a fantastic tool which unleashed the creative power in every employee, everyone would want to expense as many of them as possible, the better to boast about how innovative, empowered and fantastic they were. Since the tech companies’ point of view appeared to be that if they were ever forced to account honestly for their option grants, they would quickly stop making them, this offered decent prima facie evidence that they weren’t, really, all that fantastic. [...]

Fibbers’ forecasts are worthless. Case after miserable case after bloody case we went through, I tell you, all of which had this moral. Not only that people who want a project will tend to make inaccurate projections about the possible outcomes of that project, but about the futility of attempts to “shade” downward a fundamentally dishonest set of predictions. If you have doubts about the integrity of a forecaster, you can’t use their forecasts at all. Not even as a “starting point”. By the way, I would just love to get hold of a few of the quantitative numbers from documents prepared to support the war and give them a quick run through Benford’s Law.

Application to Iraq. This was how I decided that it was worth staking a bit of credibility on the strong claim that absolutely no material WMD capacity would be found, rather than “some” or “some but not enough to justify a war” or even “some derisory but not immaterial capacity, like a few mobile biological weapons labs”. My reasoning was that Powell, Bush, Straw, etc, were clearly making false claims and therefore ought to be discounted completely, and that there were actually very few people who knew a bit about Iraq but were not fatally compromised in this manner who were making the WMD claim. [...]

The Vital Importance of Audit. Emphasised over and over again. Brealey and Myers has a section on this, in which they remind callow students that like backing-up one’s computer files, this is a lesson that everyone seems to have to learn the hard way. Basically, it’s been shown time and again and again; companies which do not audit completed projects in order to see how accurate the original projections were, tend to get exactly the forecasts and projects that they deserve. Companies which have a culture where there are no consequences for making dishonest forecasts, get the projects they deserve. Companies which allocate blank cheques to management teams with a proven record of failure and mendacity, get what they deserve.

[...] The raspberry road that led to Abu Ghraib was paved with bland assumptions that people who had repeatedly proved their untrustworthiness, could be trusted. There is much made by people who long for the days of their fourth form debating society about the fallacy of “argumentum ad hominem”. There is, as I have mentioned in the past, no fancy Latin term for the fallacy of “giving known liars the benefit of the doubt”, but it is in my view a much greater source of avoidable error in the world. Audit is meant to protect us from this, which is why audit is so important.

Comment author: Dustin 19 March 2017 05:52:10PM 0 points [-]

I'm skeptical that meetups are representative of rationalists in general.

Comment author: Dustin 19 March 2017 05:44:31PM 1 point [-]

Rationalists like to live in group houses.

Do they? This seems like a pretty strong claim to make.

Comment author: Lumifer 19 March 2017 04:19:05PM 1 point [-]

When given a quantitative argument

Numbers are not particularly magical and being quantitative doesn't imply the argument is more likely to be correct. After all, "there are lies, damn lies, and statistics".

Comment author: Douglas_Knight 19 March 2017 04:13:24PM *  0 points [-]

Is that the relevant feature? If we disagree about that, then this phrase is not itself a clear opinion. The original version was extreme, which is how I always understood it.

What do you mean by "clear"? As I understand it, the point is to be clear about some thing and not clear about others — in particular, not to clearly state all disclaimers.

Added: the modern origin gives an explanation that might fit the word zealous. He emphasizes the energy and commitment necessary to test the claims.

Comment author: ozymandias 19 March 2017 03:12:06PM 3 points [-]

I think an accurate qualitative argument is better than a sourceless quantitative argument.

Comment author: The_Jaded_One 19 March 2017 11:42:06AM 2 points [-]

We had this problem at work quite a few times. Bosses are reluctant to let me do something which will make things run more smoothly; they want new features instead.

Then when things break they're like "What! Why is it broken again?!"

Comment author: jkadlubo 19 March 2017 10:09:50AM 3 points [-]

Interested. Already have 2 kids. Live in Poland and would like to stay in Europe.

Comment author: WhySpace 19 March 2017 07:55:26AM *  1 point [-]

I don't see any reason why AI has to act coherently. If it prefers A to B, B to C, and C to A, it might not care. You could program it to have that utility function.*

If not, maybe the A-liking aspects will reprogram B and C out of its utility function, or maybe not. What happens would depend entirely on the details of how it was programmed.

Maybe it would spend all the universe's energy turning our future light cone from C to B, then from B to A, and also from A to C. Maybe it would do this all at once, if it was programmed to follow one "goal" before proceeding to the next. Or maybe different parts of the universe would be in different stages, all at the same time. Think of it like a light-cone blender on puree.

Our default preferences seem about that coherent, but we're able to walk and talk, so clearly it's possible. It explains a lot of the madness and incoherence of the way the world is structured, certainly. Luckily, we seem to value coherence, or at least are willing to give up on having our cake and eating it too when it becomes clear that we can't have it both ways. It's possible a subtly incoherent AGI would operate at cross purposes with itself for a long time before discovering and correcting its utility function, if it valued coherence.

However, MIRI is trying to design a sane AGI, not explore all the possible ways an AI can be insane. Economists like to simplify human motives into idealized rational agents, because those are much, much simpler to reason about. The same is true for MIRI, I think.

I've given this sort of thing a little thought, and have an Evernote note I can turn into a LW post, if there is interest.

* I use the term "utility function" broadly here. I guess "programming" would be more correct, but even an A>B>C>A AI bears some rough resemblance to a utility function, even if it isn't coherent.

Comment author: Alicorn 19 March 2017 06:44:43AM 3 points [-]

I think this was helped along substantially by personal acquaintance with and HPMOR fandom of the landlord, which seems hard to replicate on purpose.

Comment author: Convolution 19 March 2017 06:43:46AM 1 point [-]

Thanks a lot. I was nervous about posting here.

Comment author: bogus 19 March 2017 05:40:08AM *  2 points [-]

"LessWrong urged its community members to think like machines rather than humans. Contributors were encouraged to strip away self-censorship, concern for one’s social standing, concern for other people’s feelings, and any other inhibitors to rational thought. It’s not hard to see how a group of heretical, piety-destroying thinkers emerged from this environment — nor how their rational approach might clash with the feelings-first mentality of much contemporary journalism and even academic writing."

Yeah, that seems backwards to me. Contemporary mainstream politics, influenced by centralized institutional arrangements like journalism or academia (what NRx call 'The Cathedral'), is much closer to a general idea of "Rationalism in Politics" (to use Michael Oakeshott's term) than anything from the NRx camp. Of course one could argue that these institutions aren't being very rational after all, but more to the point, their overall stance is one that values the results of formalized, logical (and thus, 'rational') deliberation and of ambitious "social engineering" efforts - as opposed to, say, preserving or reviving those enduring traditions that have "stood the test of time" and thus proven some kind of inherent worth or sustainability.

Comment author: username2 19 March 2017 05:38:44AM 0 points [-]

I was being quite serious. When given a quantitative argument, you responded with a grab bag of abstract objections not backed by data but vaguely supporting your original viewpoint. A natural human response designed to keep one from changing their mind, generally called rationalization. I encourage becoming aware of when this is happening and using that awareness to improve your model of the world.

Comment author: cata 19 March 2017 04:33:50AM 2 points [-]

I am really interested in this and would be likely to want to move into such a place if it existed anywhere in the Bay Area in the next few years.

Comment author: UmamiSalami 19 March 2017 04:30:34AM 0 points [-]

In fact, it seems to me that the less intelligent an organism is, the easier its behavior can be approximated with a model that has a utility function!

Only because those organisms have fewer behaviors in general. If you put a human in an environment where their options and sensory inputs were as simple as those experienced by apes and cats, they would probably look like an equally simple utility maximizer.

Comment author: Vaniver 19 March 2017 03:53:26AM 9 points [-]

Quoting myself from Facebook:

I think identifying the neoreactionaries with LessWrong is mostly incorrect. I know NRxers who found each other on LW and made side blogs, but that's because I know many more LWers than NRxers.

In the 2014 survey, only 2% of LWers called themselves neoreactionary, and I think that's dropped as they've mostly moved off LW to other explicitly neoreactionary sites that they set up. LW had a ban on discussing politics that meant there weren't any serious debates of NRx ideas. To the best of my knowledge, Moldbug didn't post on LW. It probably is the case that debiasing pushed some people in the alt-right direction, but it's still silly to claim it's the normal result.

Comment author: Vaniver 19 March 2017 03:20:03AM 3 points [-]

I'm interested, and have been thinking for a while of how to structure it and where to put it / what properties to focus on (in Berkeley, at least). I think there's a pretty strong chance we can build a rationalist village or two (or three or...).

Comment author: Vaniver 19 March 2017 03:13:44AM 4 points [-]

I believe this is what happened with Godric's Hollow--a four unit building turned, one by one, into a four unit rationalist building.

Comment author: drethelin 19 March 2017 02:52:34AM 0 points [-]

This is why we need downvotes.

Comment author: username2 19 March 2017 02:05:30AM 0 points [-]

Sounds a lot more like rationalization than rationalism.

Comment author: gjm 19 March 2017 01:38:41AM 8 points [-]

Note also that "4% of adults are sexually attracted to children" is a very different statement from "4% of adults are likely to molest children if left alone with them".

(I suspect rather more than 4% of adults are sexually attracted to Angelina Jolie[1], but that doesn't mean they'd molest her if left alone in a room with her.)

[1] Chosen by putting "famous actress" into Google and picking the first name it gave me. If she isn't your type -- she isn't particularly mine, as it happens -- feel free to imagine I chose a different name.

Comment author: Elo 19 March 2017 01:37:17AM 0 points [-]

Can you post a summary of the link?

Comment author: gjm 19 March 2017 01:31:34AM 0 points [-]

Some people, for whatever reason, find themselves with a strong conviction that they should be, or that they really are, of a different gender from the one their body-type seems to indicate. Those people are called "trans", short for "transgender", for kinda obvious reasons. Most people don't. Those people are called "cis", because traditionally when an opposite of "trans" is needed "cis" is it. (Cisalpine. Cismontane. Cis fats.)

Do please explain what in the foregoing paragraph depends on "feminism" in any sense that anyone could object to, or requires "irrationalism".

Comment author: SolveIt 19 March 2017 01:23:41AM 2 points [-]

I am interested!

Comment author: gwern 19 March 2017 01:06:36AM 0 points [-]

That said, excerpting some of the praise of myself seems harmless enough, so as part of a sort of experiment in whether adding social proof (such as in some tweets) will help my Patreon profile, I've excerpted a number of them to https://www.patreon.com/gwern

Comment author: Lumifer 19 March 2017 01:03:36AM 2 points [-]

It will select for rich people.

Comment author: Lumifer 19 March 2017 01:01:13AM 0 points [-]

Oh, I always choose to giggle.

Comment author: Lumifer 19 March 2017 01:00:36AM 0 points [-]

That kid's head merging with his ass?

Don't forget the non-newtonian fluid.

He had that under control

That's 'cause the Powerpuff Girls weren't there.

Comment author: Mitchell_Porter 19 March 2017 12:12:14AM *  0 points [-]

Give it all to Fabian Tassano.

Comment author: Douglas_Knight 18 March 2017 10:45:51PM 2 points [-]

No problem! FWIW is a disclaimer that I'm just a completist and not saying anything in particular!

Comment author: morganism 18 March 2017 10:28:51PM 0 points [-]

Native GPU programming with CUDAnative.jl

http://julialang.org/blog/2017/03/cudanative

"You can now write your CUDA kernels in Julia, albeit with some restrictions, making it possible to use Julia’s high-level language features to write high-performance GPU code."

"The programming support we’re demonstrating here today consists of the low-level building blocks, sitting at the same abstraction level of CUDA C. You should be interested if you know (or want to learn) how to program a parallel accelerator like a GPU, while dealing with tricky performance characteristics and communication semantics."

Comment author: James_Miller 18 March 2017 10:25:23PM 7 points [-]

Because of Trump's surprise victory, hundreds of books are destined to be written on the alt-right, and any future scholar of the subject will certainly read the linked article, so here would be a good place to correct the record.

Comment author: lifelonglearner 18 March 2017 10:22:44PM 1 point [-]

That's interesting. I was pegging Attractors as physical actions, but I think the analogy can be loosely applied to mental concepts too (as I think you're doing here.)

I think that regularities can be strategically used as you suggest to create additional "anchors" to helpful habits in the real world. (EX: Having a running timer probably shouldn't actually affect my running habits in the normative sense, yet having a timer when running really makes it feel more official / Formal.)

Comment author: morganism 18 March 2017 10:22:23PM 1 point [-]

"A competition from the Global Challenges Foundation, founded in 2012 by the Szombatfalvy, is calling for solutions to the world's most pressing problems, like conflict, climate change and extreme poverty."

https://globalchallenges.org/en

Comment author: evand 18 March 2017 10:22:17PM 1 point [-]

On the legality of selecting your buyers: What if you simply had an HOA (or equivalent) with high dues that did rationalist-y things with the dues? Is that legal, and do you think it would provide a relevant selection effect?

Comment author: James_Miller 18 March 2017 10:21:41PM 1 point [-]

You are right. Sorry I missed it. Given how prominent Milo and the alt-right are, however, I do think the link deserves a top level post.

Comment author: Douglas_Knight 18 March 2017 10:02:48PM 3 points [-]

FWIW, this was posted at the time.

Comment author: drethelin 18 March 2017 09:49:45PM 1 point [-]

most single family households have a lot fewer than 10-20 people in them.

Comment author: drethelin 18 March 2017 09:44:28PM *  4 points [-]

I think the statistics you quote are exaggerated in order to terrify. When I tried to look up "4% of adults are sexually attracted to children," for example, I found nothing. Similarly, the news is often full of stranger danger fears because terror is what gets attention and therefore revenue and funding. And as others have said, they also include stuff like 18 year olds having sex with 17 year olds, which some people may find unacceptable but I don't.

Comment author: Zack_M_Davis 18 March 2017 09:26:19PM 5 points [-]

I agree! Indeed, your comment is a response to the something different that I wrote down! If I cared more about correcting this particular historical error, I would do more research and write something more down in a place that would get more views than this Less Wrong Discussion thread. Unfortunately, I'm kind of busy, so the grandparent is all that I bothered with!

Comment author: Elo 18 March 2017 09:20:47PM 12 points [-]

History is written by the people who write it down. If you want to change history, write something different down.

Comment author: RomeoStevens 18 March 2017 09:19:29PM 1 point [-]

Common limiting beliefs can be seen as particularly strong attractors along certain dimensions of mindspace that coping mechanisms use to winnow the affordance space down to something manageable without too much cognitive overhead. Regularities in behaviors that don't serve a direct purpose could also be seen as spandrels from our training data clustering things that don't necessarily map directly to causality. I.e., you can get animals to do all sorts of wacky things with clicker training, which then persist even if you start rewarding only a subset of the actions, if the animal has no obvious way of unbundling the actions.

Comment author: Zack_M_Davis 18 March 2017 09:00:05PM 11 points [-]

But, but, this is not historically accurate! I'm sure there's a much greater overlap between Less Wrong readers and Unqualified Reservations readers than you would expect between an arbitrary pairing of blogs, but the explanation for that has to look something like "Yudkowsky and Moldbug both attract a certain type of contrarian nerd, and so you get some links from one community to the other from the few contrarian nerds that are part of both." The causality doesn't flow from us!

Comment author: James_Miller 18 March 2017 07:31:27PM 6 points [-]

"Elsewhere on the internet, another fearsomely intelligent group of thinkers prepared to assault the secular religions of the establishment: the neoreactionaries, also known as #NRx."

"Neoreactionaries appeared quite by accident, growing from debates on LessWrong.com, a community blog set up by Silicon Valley machine intelligence researcher Eliezer Yudkowsky. The purpose of the blog was to explore ways to apply the latest research on cognitive science to overcome human bias, including bias in political thought and philosophy."

"LessWrong urged its community members to think like machines rather than humans. Contributors were encouraged to strip away self-censorship, concern for one’s social standing, concern for other people’s feelings, and any other inhibitors to rational thought. It’s not hard to see how a group of heretical, piety-destroying thinkers emerged from this environment — nor how their rational approach might clash with the feelings-first mentality of much contemporary journalism and even academic writing."

This article currently has 32,760 Facebook shares.

Comment author: Jonathan_Lee 18 March 2017 07:29:23PM 4 points [-]

Even if it's the case that the statistics are as suggested, it would seem that a highly effective strategy is to ensure that there are multiple adults around all the time. I'll accept your numbers arguendo (though I think they're relevantly wrong).

If there's a 4% chance that one adult is an abuser, there's a 1/625 chance that two independent ones are, and one might reasonably assume that the other 96% of adults are unlikely to let abuse slide if they see any evidence of it. The failure modes are then things like abusers being able to greenbeard well enough that multiple abusers identify each other and then proceed to be all the adults in a given situation. Which is pretty conjunctive as failures go, especially in a world where you insist that you know all the adults personally from before you started a baugruppe rather than letting Bob (and his 5 friends who are new to you) all join.

You also mention "selection for predators", but that seems to run against the (admittedly folk) wisdom that children at risk of abuse are those that are isolated and vulnerable. Daycare centres are not the central tendency of abuse; quiet attics are.

Comment author: Rubix 18 March 2017 07:22:01PM 3 points [-]

I grew up in a hippie commune and I recommend this!

Comment author: Rubix 18 March 2017 07:18:55PM 1 point [-]

Endorsed.

Comment author: tristanm 18 March 2017 06:44:44PM 0 points [-]

People buy lottery tickets because no one can accurately "feel" or intuit incredibly small probabilities. We (by definition) experience very few or no events with those probabilities, so we have nothing on which to build that intuition. Thus we approximate negligible but non-zero probabilities as small but non-negligible. And that "feeling" is worth the price of the lottery ticket for some people. Some people learn to calibrate their intuitions over time so negligible probabilities "feel" like zero, and so they don't buy lottery tickets. The problem is less about utility functions and more about accurate processing of small probabilities.

Comment author: Viliam 18 March 2017 06:35:37PM *  0 points [-]

More crazy than this?

Lorenz demonstrated how incubator-hatched geese would imprint on the first suitable moving stimulus they saw within what he called a "critical period" between 13–16 hours shortly after hatching.

Comment author: Viliam 18 March 2017 06:28:56PM 0 points [-]

I am actually trying (completely unsuccessfully, but there is always hope) to get rid of my Less Wrong addiction. That would be a step backwards.

Comment author: Viliam 18 March 2017 06:20:45PM *  12 points [-]

I want to urge people to not dismiss this without a thought. And it's not just about children.

There are already a few sexual predators hanging around with the rationalist community. I can't say names, because it is typically a "they said, they said" situation, and these types usually have a lot of practice at threatening legal consequences for "slander". (But if you know someone who used to be around and suddenly lost all interest at coming to your meetups, it might make sense to ask them discreetly whether they had a bad experience with someone specifically.)

I personally often don't care much about the statistics for general population, because we are obviously not average. Problem is, "not average" doesn't in itself show the direction. For general intelligence, we are obviously smarter, and that generally correlates with lower (detected?) crime. On the other hand, we also seem to score quite high for unusual sexual behavior in general.

As long as each family has a door they can close (and everything necessary to survive the day is inside), living in a community doesn't seem worse than simply living with neighbors. But there are good reasons why neurotypical people require a long time before they start trusting someone; so "they are a member of the same community" should never be used as a replacement for "I have a lot of personal experience from gradually deepening interaction with this specific person". In other words, just because someone says "hi, I also like Less Wrong" doesn't mean I would invite them to my home and leave them alone with my child. Some nerdy people may need to be explicitly reminded of this.

Comment author: Douglas_Knight 18 March 2017 06:17:52PM 0 points [-]

Arguably, doing childbirth the "unnatural" way can mess up with your or your baby's instincts, because they were evolved expecting certain circumstances. Some instincts depend on timing. I am not sure if this is the situation here, but some people prefer to play it safe.

That seems totally crazy to me. Claims about transfer of bacteria (both good and bad) are much more plausible differences with a C-section.

Comment author: Douglas_Knight 18 March 2017 06:15:38PM 0 points [-]

Without claiming that it is directly relevant to the question, let me quote Atul Gawande, from "The Score: How childbirth went industrial"

Yes, he said, many studies did show fabulous results for forceps. But they only showed how well forceps deliveries could go in the hands of highly experienced obstetricians at large hospitals....Forceps deliveries are very difficult to teach—much more difficult than a C-section

Gawande is great. Collect them all. Also, Lewis Thomas.

Comment author: Asymmetric 18 March 2017 06:07:37PM 2 points [-]

Am interested!

Comment author: Douglas_Knight 18 March 2017 06:06:47PM 0 points [-]

We need more mods in Europe. You used to be a moderator, so why not reclaim that power?

Comment author: Douglas_Knight 18 March 2017 06:03:29PM 0 points [-]

I thought Cochran was a science-jock who couldn't imagine being gay

Whereas you can imagine it so easily that you didn't bother to look at the real world, just as you can imagine Cochran so easily you didn't bother to look at him

how seriously I should take the argument that there has to be (in gwern's words) a "mechanism... to offset the huge fitness penalty"

90% of biologists don't believe in evolution, either, but progress comes from those who do.

Comment author: Viliam 18 March 2017 05:49:49PM *  14 points [-]

I have recently read Creating a Life Together: Practical Tools to Grow Ecovillages and Intentional Communities, which is a book containing experience and advice for people wanting to build a community. The book is about ecological communities, which may differ in some aspects from the rationalist ones, but I believe most things are valid generally.

Some points I remember:

Do not overestimate people's commitment, no matter what they say. When the moment comes to actually put down large amounts of money, don't be surprised if most of them suddenly change their minds.

Do your research in advance -- how much the project will cost, what kinds of documents and permissions you will need, and whether your plan is actually legal. (Ask people already living in similar communities. Actually, visit them for a few days, to get a near-mode experience. All of you.)

Good fences make good neighbors. Whatever were your original agreements, expect people to change their minds later and to remember something different than you do. Then you will need a paper record.

For any kind of group decision, you need very precise rules for (1) who is and who isn't a member, how to become one and how to stop being one; and (2) what happens in case of prolonged disagreement. Outside view says that "we will do everything by consensus" is magical thinking predictably leading to a disaster.

It helps to identify your vision, and describe it in "vision documents" as clearly as possible. You might be surprised that people who previously seemed to agree with the non-specific version will suddenly find things they object to. (Better to find this out now than after you have all moved.) Also, this will be helpful in the future for explaining your community to potential new members.

It is a bad idea to introduce power imbalances, such as "the rich members volunteer to initially pay for the poorer ones" or "someone can lend their unused private building to the community", because that can make later intra-group negotiations really unpleasant (e.g. when you have a vote about something the rich members have a strong opinion about, and they end up in the minority). If there is a need to lend money between members, do it completely officially, so that the fact that "X owes money to Y" cannot be used as leverage against X.

It is probably a good idea to take some lessons on communication skills together. You need to be able to talk about sensitive topics where you disagree, without it making you feel disconnected. But you also need to hold each other accountable for things you agreed upon.

Filter people for emotional maturity. Seriously. Some people can cause insane amounts of unnecessary drama. And that applies not just to founders; you should also agree on some selection process for new members in the future. Also, newcomers should become provisional members first, participate in the community life and contribute some work, before they become full members. (Good interview questions: how have you supported yourself financially in the past? describe your long-term relationships, school and work experience.)

There are different options: You can buy or rent several houses or flats for individual members or families, and then one extra place which will be common. Or you can buy a piece of land with several houses. Or a piece of land without houses, and build them. Or you could buy an office building or an abandoned factory, and then rebuild it. Members can own their places; or you could together create a legal entity that owns everything, and all members rent from that entity. That entity may be able to take out a loan.

Have a debate about what is your position on:

  • preferred distance from: schools, shops, nature, traffic nodes, other important places
  • lifestyle: vegetarianism / veganism, families, pets, sexual behavior, drug use...
  • financial issues: will everyone contribute equally, do member rights depend on their respective contributions, which property or expenses specifically are shared
  • politics: what about religion, is it okay if members are politically active, is it okay to publicly support politically active people

Have monthly meetings, with an agenda and a facilitator; only members can vote; take notes and archive them. (The logs will be useful to show to new members in the future.) Specifically, keep records of agreed-upon tasks and dates. To make sure everyone has at least a small stake even before you buy anything, require a symbolic financial contribution from the members (remember, only members can vote), e.g. $100 for joining and $10 as a monthly fee. Keep financial records.

How to create positive emotional bonds: talk about your life; cook and eat together. (Once in a while.)

In response to comment by gjm on Am I Really an X?
Comment author: SnowSage4444 18 March 2017 05:34:09PM 1 point [-]

cis

What is Feminism doing here? What are we, the Rationalists, or the Irrationalists?

Comment author: SnowSage4444 18 March 2017 05:32:59PM 0 points [-]

http://archiveofourown.org/works/10343850/chapters/22880244

Chapter two is up. Would you like to read it, or would you like to giggle about your Blossom, Bubbles, and Buttercup fantasies?
