Comment author: lsparrish 14 July 2012 11:47:50PM 13 points [-]

PZ's comment about the implausibility of speeding up an emulated brain was a real head-scratcher to me, and Andrew G calls him on it in the comments. Apparently (judging from his further comments) what he really meant was that you would also have to simulate or emulate a suitable environment, physiology, and endocrine system; otherwise the brain would go insane.

Of course, we already knew that...

Comment author: C9AEA3E1 15 July 2012 03:06:10AM *  2 points [-]

Seems similar enough to "Every part of your brain assumes that all the other surrounding parts work a certain way. The present brain is the Environment of Evolutionary Adaptedness for every individual piece of the present brain.

Start modifying the pieces in ways that seem like "good ideas"—making the frontal cortex larger, for example—and you start operating outside the ancestral box of parameter ranges. And then everything goes to hell.

So you'll forgive me if I am somewhat annoyed with people who run around saying, "I'd like to be a hundred times as smart!" as if it were as simple as scaling up a hundred times instead of requiring a whole new cognitive architecture."

Eliezer Yudkowsky, Growing Up is Hard

Comment author: [deleted] 01 May 2012 03:55:57AM 6 points [-]

It could be quite valuable to translate that material.

Comment author: C9AEA3E1 01 May 2012 09:17:39PM 1 point [-]

Can someone recommend good Russian learning material? Preferably something that could be found online (books count).

Comment author: Bart119 28 April 2012 02:32:30AM 1 point [-]

Thank you so much for the reply! Simply tracing down the 'berserker hypothesis' and 'great filter' puts me in touch with thinking on this subject that I was not aware of.

What I thought might be novel about what I wrote included the idea that independent evolution of traits was evidence that life should progress to intelligence a great deal of the time.

When we look at the "great filter" possibilities, I am surprised that so many people consider our society's self-destruction such a likely candidate. Intuitively, if there are thousands of societies, one would expect high variability in social and political structures and outcomes. The next idea I read, that "no rational civilization would launch von Neumann probes", seems extremely unlikely for that same reason. Where there would be far less variability is in the mundane energy and engineering constraints on launching self-replicating spacecraft in a robust fashion. Problems there could easily stop every single one of our thousand candidate civilizations cold, with no variability.
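The variability argument above can be made quantitative. As a minimal sketch (the civilization count and per-civilization probability below are illustrative assumptions, not figures from the thread): if each civilization self-destructs independently, even a very high per-civilization rate almost never wipes out all of them, whereas a universal physical or engineering barrier applies to every civilization identically.

```python
# Hedged illustration of the "variability" argument about the great filter.
# N and p_destruct are assumed values chosen for illustration only.

N = 1000           # hypothetical number of candidate civilizations
p_destruct = 0.99  # assumed per-civilization probability of self-destruction

# If outcomes vary independently, the chance that ALL N self-destruct
# is p^N, which is tiny even when p is very close to 1.
p_all_destruct = p_destruct ** N
print(f"P(all {N} civilizations self-destruct) = {p_all_destruct:.2e}")

# By contrast, a shared constraint (e.g. robust self-replicating probes
# being infeasible to engineer) has no per-civilization variability:
# if it stops one civilization, it stops all of them.
```

With these numbers, even a 99% self-destruction rate leaves the probability that no civilization survives at roughly 4e-05, which is why high-variability explanations make a poor universal filter.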

Comment author: C9AEA3E1 01 May 2012 12:20:41PM *  2 points [-]

Yes, the current speculations in this field are of wildly varying quality. The argument about convergent evolution is sound.

A minor quibble about convergent evolution, which doesn't change the conclusion much regarding the existence of other intelligent systems out there:

All organisms on Earth share some common points (though there might be shadow biospheres): similar environmental conditions (a rocky planet with a moon, a certain range of temperatures, etc.) and a common biochemical basis (proteins, nucleic acids, water as a solvent, etc.). I'd distinguish convergent evolution within the same system of life on the one hand, and convergent evolution across different systems of life on the other. We have only observed the first. The two likely overlap, but some traits may not be as universal as we'd be led to think.

For instance, eyes may be pretty useful here, but deep in the oceans of a world like Europa, provided life is possible there, they might not be (an instance of the environment conditioning what is likely to evolve).

Comment author: C9AEA3E1 01 May 2012 11:58:54AM 3 points [-]

To the best of my knowledge, there is nothing quite like SIAI or lesswrong in continental western Europe. People aren't as into AI as in the US, and what rationality thinking there is mostly falls under traditional rationality, skepticism, etc.

Atheism can score high in many countries; as a rule of thumb, countries to the north are more atheistic, while those to the south (Spain, Portugal, Italy, etc.) are more religious.

There are a few scattered transhumanist and life-extension organizations, which are loosely starting to cooperate.

The European Commission itself started prioritizing small-scale healthy life extension a year or two ago. This could help focus more people on such questions in the years to come.

Comment author: Bart119 26 April 2012 02:48:22PM 0 points [-]

Hmmmm. Nearly two days and no feedback other than a "-1" net vote. Brainstorming explanations: 1. There is so much wrong with it that no one sees any point in engaging me (or educating me). 2. It is invisible to most people for some reason. 3. Newbies post things out of sync with accepted LW thinking all the time (related to #1). 4. No one's interested in the topic any more. 5. The conclusion is not a place anyone wants to go. 6. The encouragement toward thread necromancy was a small minority view or intended ironically. 7. More broadly, there are customs of LW that I don't understand. 8. Something else.

Comment author: C9AEA3E1 27 April 2012 09:10:26PM 1 point [-]

Likely few people read it; maybe just one person voted, and that's only one, potentially biased, opinion. The score isn't significant.

I don't see anything particularly wrong with your post. Its underlying ideas seem similar to the Fermi paradox and the berserker hypothesis, from which you derive that a great filter lies ahead of us, right?

Comment author: C9AEA3E1 27 April 2012 08:07:25PM 5 points [-]

Our bodies need to perform different roles as we age and mature, and we need different sets of skills depending on our current developmental phase. It would make sense for our brains to change too: for the developmental path of the brain to be planned so that it undergoes changes adapting it to the tasks it will have to tackle across different developmental phases.

It'd make sense for our brain to be fine-tuned for grabbing resources from family when we're kids, so as to grow as fast as possible; then better tuned to search for sexual partners once we mature; and lastly, fine-tuned to take care of our own kids once we have them.

And if there's a mechanism that makes our brain undergo developmental changes along a pre-planned path, then we might also expect that past reproductive age there'd be less and less evolutionary pressure shaping that developmental trajectory.

Nor do I think evolution would have much reason to cleanly engineer a stable end-state after which development simply stops, leaving you with a well-adjusted, perfectly functional body or brain. That may not be a trivial task, after all.