I think the basic problem here is an undissolved question: what is 'intelligence'? Humans, being human, tend to imagine a superintelligence as a highly augmented human intelligence, so the natural assumption is that regardless of the 'level' of intelligence, skills will cluster roughly the way they do in human minds, i.e. having the ability to take over the world implies a high posterior probability of having the ability to understand human goals.

The problem with this assumption is that mind-design space is large (<--understatement), and the prior probability of a superintelligence randomly ending up with ability clusters analogous to human ability clusters is infinitesimal. Granted, the probability of this happening given a superintelligence designed by humans is significantly higher, but still not very high. (I don't actually have enough technical knowledge to estimate this precisely, but just by eyeballing it I'd put it under 5%.)

In fact, autistic people are an example of non-human-standard ability clusters, and even that deviation is tiny on the scale of mind-design space.

As for an elevator pitch of this concept, something like: "just because evolution happened to design our brains to be really good at modeling human goal systems doesn't mean all intelligences are good at it, regardless of how good they might be at destroying the planet".

Looking for advice with something it seems LW can help with.

I'm currently part of a program that trains highly intelligent people to be more effective, particularly with regard to scientific research and effecting change within large systems of people. I'm sorry to be vague, but I can't actually say more than that.

As part of our program, we organize seminars for ourselves on various interesting topics. The upcoming one is on self-improvement, and aims to explore the following questions: Who am I? What are my goals? How do I get there?

Naturally, I'm of the opinion that rationalist thought has a lot to offer on all of those questions. (I also have ulterior motives here, because I think it would be really cool to get some of these people on board with rationalism in general.) I'm having a hard time narrowing down this idea to a lesson plan I can submit to the organizers, so I thought I'd ask for suggestions.

The possible formats I have open for an activity are a lecture, a workshop/discussion in small groups, and some sort of guided introspection/reading activity (for example just giving people a sheet with questions to ponder on it, or a text to reflect on).

I've also come up with several possible topics: How to Actually Change Your Mind (ideas on how to go about condensing it are welcome), practical mind-hacking techniques and/or techniques for self-transparency, or just information on heuristics and biases because I think that's useful in general.

You can also assume the intended audience already know each other pretty well, and are capable of rather more analysis and actual math than average.

Ideas for topics or activities are welcome, particularly ones that include a strong affective experience, since those are generally better at getting people to think about this sort of thing for the first time.

ruelian

It also depends on the jeans. Some jeans are, for some reason, more likely to smell after being worn just once. I have no idea why, but several people I know have corroborated this independently.

ruelian

Map and territory - why is rationality important in the first place?

ruelian

Alright, that works too. We're allowed to think differently. Now I'm curious, could you define your way of thinking more precisely? I'm not quite sure I grok it.

ruelian

So, essentially, there isn't actually any way of getting around the hard work. (I think I already knew that and just decided to go on not acting on it for a while longer.) Oh well, the hard work part is also fun.

ruelian

This appears to be a useful skill that I haven't practiced enough, especially for non-proof-related thinking. I'll get right on that.

ruelian

reads the first essay and bookmarks the page with the rest

Thanks for that, it made for enjoyable and thought-provoking reading.

ruelian

I don't really have good definitions at this point, but in my head the distinction between verbal and nonverbal thinking is a matter of order. When I'm thinking nonverbally, my brain addresses the concepts I'm thinking about and the way they relate to each other, then puts them into words. When I'm thinking verbally, my brain comes up with the relevant word first, then pulls up the concept. It's not binary; I tend to put it on a spectrum, but one that has a definite tipping point. Kinda like a number line: it's ordered and continuous, but at some point you cross zero and switch from positive to negative. Does that even make sense?

ruelian

Right, that makes much more sense now, thanks.

One of my current problems is that I don't understand my brain well enough for nonverbal thinking not to turn into a black box. I think this might be a matter of inexperience, as I only recently managed to develop an intuitive, nonverbal understanding of math concepts, so I'm not always entirely sure what my brain is doing. (Anecdotally, my intuitive understanding of a problem produces good results more often than not, but any time my evidence is anecdotal there's this voice in my head that yells "don't update on that, it's not statistically relevant!")

Does experience with nonverbal reasoning about math actually lend itself to a better understanding of said reasoning, or is that just a cached thought of mine?
