Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: reaver121 20 March 2010 09:45:14AM *  3 points [-]

I agree that it would be a good idea to protect your credibility by signaling either that you are throwing out an idea to be torn apart or that you have thought long and hard about it. However, I would still err on the side of letting an idea out early. There are also downsides to thinking about an idea for too long:

  • you are less likely to find problems in your idea on your own
  • you may become emotionally invested in your idea, so you will have trouble letting go when someone shoots it down
  • you lose thinking time if your idea turns out to be false

A classic example is the perpetual motion machine. Time and time again, people believe it will work. They invest time and money in something that anyone with even a passing familiarity with thermodynamics can tell you is impossible.

I can see only one downside to letting an idea out early (besides hurt credibility), and that's that your future ideas will be taken less seriously. If you acquire a reputation for putting forward a lot of wrong and/or stupid ideas, people will be quicker to just ignore you.

I think the best approach is to scrutinize your idea for basic soundness so that there are no obvious holes. Then the damage is minimal if it turns out to be wrong (obvious within your community, of course; if you dare to suggest perpetual motion to physicists, I give you five seconds before they laugh in your face). Also, with the internet and libraries, it's fairly easy to look up whether your idea has already been thought of and shot down.

Comment author: idlewire 24 March 2010 04:53:09PM 0 points [-]

So the real question is: "How will one's credibility be affected in the environment where the idea is presented?" which most likely depends on one's current credibility.

As of now, I don't have much karma, so putting out poor ideas is more detrimental to this screen name. Eliezer could probably sneak in an entire subtly ludicrous paragraph that might go unnoticed for a while.

He has a history in readers' minds, as well as the karma metric, to make people ignore that flash in the back of their minds that something was off. They are more likely to think it was their own aberrant thinking, or that they had a flawed interpretation of a non-ludicrous idea he was trying to convey.

So I guess it just depends on how solid you think your idea and reputation are when deciding when to release an idea to a particular audience.

Comment author: taw 20 March 2010 01:39:28AM 5 points [-]

Time spent thinking about something correlates far too much with how much a person likes an idea. People who believe strongly in the Holy Trinity have spent far more time on average thinking about it than people who don't.

So, you will end up very strongly biased if you follow your own advice (but then, I haven't thought about it much at all).

Comment author: idlewire 24 March 2010 04:25:50PM *  0 points [-]

This brings up the quality of thought that is spent on a subject. Someone with a strong ability to self-criticize can find flaws more effectively and reach better conclusions more quickly. Those who contemplate ideas with wrong but unshakeable (or rather, invisible) assumptions will stew in poor circles until death. The idea of a comforting or powerful deity, unfortunately, sticks hard when indoctrinated early and consistently.

Comment author: Kevin 21 March 2010 03:36:17AM 2 points [-]

I noticed that the variance in the amount of time different people spend thinking through new ideas before they speak is quite high.

This is one of the basic characteristics of introversion vs. extroversion where extroverts tend to have less of a filter on their thoughts before speaking.

Comment author: idlewire 24 March 2010 04:15:40PM *  1 point [-]

While I'd have a difficult time pegging myself as either introvert or extrovert, I notice that when I'm with a comfortable crowd, ideas will fall out of my mouth with so little processing that many sentences end with "... wait, never mind, scratch that." I'll use my close acquaintances as easy parallel processing, or to quickly look at ideas from obvious viewpoints that I tend to overlook.

When I'm in an unfamiliar group or setting, I'll often spend so long revising what I want to say that the conversation moves on and I've hardly said a word for 20 minutes.

Comment author: idlewire 03 August 2009 03:55:07PM 4 points [-]

This reminds me of an idea I had after first learning about the singularity. I assumed that once we are uploaded into a computer, a large percentage of our memories could be recovered in detail, digitized, reconstructed, and categorized, and then you would have the opportunity to let other people view your life history (assuming that minds in a singularity are past silly notions of privacy and embarrassment or whatever).

That means all those 'in your head' comments that you make when having conversations might be up for review, or to be laughed at. Every now and then I make comments in my head that are intended for a transhuman audience watching a reconstruction of my life.

The idea actually has roots in my attempt to understand a heaven that existed outside of time, back when I was a believer. If heaven was not bound by time and I 'met the requirements', I was already up there looking down at a time-line version of my experience on earth. I knew for sure I'd be interested in my own life so I'd talk to the (hopefully existing) me in heaven.

On another note, I've been wanting to write a sci-fi story in which a person slowly discovers they are an artificial intelligence led to believe they're human, being raised on a virtual earth. The idea is that they are designed to empathize with humanity in order to create a Friendly AI. The person starts gaining either superpowers or super-cognition as the simulators become convinced the AI will use its power for good over evil. Maybe even have some evil AIs from the same experiment to fight. If anyone wants to steal this idea, go for it.

Comment author: idlewire 20 July 2009 07:35:36PM 6 points [-]

Assuming I understood this correctly, you're saying a true AI might find our morality as arbitrary as we would consider pebble heap sizes, say bugger the lot of us, and turn us into biomass for its nano-furnace.

In response to A Priori
Comment author: idlewire 17 July 2009 03:45:23PM 3 points [-]

Could you not argue Occam's Razor from the conjunction fallacy? The more components that are required to be true, the less likely they are all simultaneously true. Propositions with fewer components are therefore more likely, or does that not follow?
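The probabilistic core of this point can be made concrete: for any events, P(A and B) ≤ P(A), so each additional component can only lower (never raise) the joint probability. A minimal sketch, with illustrative probabilities and an independence assumption for simplicity:

```python
from functools import reduce

def joint_probability(component_probs):
    """Probability that all independent components hold simultaneously.

    Multiplying by each additional probability (each <= 1) can only
    shrink the product, which is the conjunction-fallacy point.
    """
    return reduce(lambda acc, p: acc * p, component_probs, 1.0)

# A hypothesis with two components vs. the same hypothesis plus one more:
two_parts = joint_probability([0.8, 0.8])          # 0.64
three_parts = joint_probability([0.8, 0.8, 0.8])   # 0.512

# Adding a component never increases the joint probability.
assert three_parts <= two_parts
```

This is only the quantitative half of Occam's Razor; it says nothing about how to weigh explanatory power against component count.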

Comment author: idlewire 15 July 2009 07:24:26PM 1 point [-]

(Deuteronomy 13:7-11)

Talk about a successful meme strategy! No wonder we still have this religion today. It killed off its competitors.

Comment author: idlewire 15 July 2009 04:33:02PM 3 points [-]

As scary as anosognosia sounds, we could be blocking out alien brain slugs for all we know.

Comment author: idlewire 14 July 2009 09:28:14PM 0 points [-]

We're really only just now able to identify these risks and start proposing theoretical solutions. Our ability to recognize and realistically respond to these threats is catching up. I think saying that we lack good self-preservation mechanisms is a little unfair.

Comment author: Ben_Jones 15 September 2008 08:28:08AM 0 points [-]

a relatively dim person born into affluence in the USA has a much better time of it than a smart person born into poverty in the Congo.

Taboo 'better'. I wouldn't swap one IQ point for all the silver spoons in the world.

Comment author: idlewire 14 July 2009 09:10:27PM 20 points [-]

You wouldn't give up one IQ point for, say, 10 million dollars? It would be a painful decision, but I'm convinced I could have a much better effect on the world with a massive financial head start, at only the slightest detriment to my intelligence. A large enough sum of money would let me stop working and study and research for the rest of my life, probably leaving me more intelligent in the long run. Right now, I have to waste my time with a superior level of intelligence just to pay for food, shelter, and student loans.
