
ChristianKl comments on Open thread, Oct. 13 - Oct. 19, 2014 - Less Wrong Discussion

5 Post author: MrMind 13 October 2014 08:17AM


Comments (355)


Comment author: ChristianKl 13 October 2014 09:23:00PM 1 point

In fiction, you have to make it up, but you can't make it something implausible.

But any real scenario will seem implausible. That's what the idea of the singularity is about. If you believe you can predict in any sense what the world will look like afterwards, "singularity" is a very poor term to use.

Comment author: Luke_A_Somers 13 October 2014 10:08:21PM 4 points

I think it is a poor term.

Still, it can only mean 'a whole lot less predictable than usual', not 'MAX ENTROPY ALL THE TIME'. Physics will still apply. Given that people survived and have lives worth writing stories about, we are at least within 'critical failure' distance of friendliness in AI. That narrows things very considerably.

A lot of the unpredictability of the singularity arises from the lack of a proof of friendliness. Once you've cleared that (or nearly), the range of possibilities isn't singular in nature.

Comment author: DataPacRat 13 October 2014 10:02:37PM 1 point

If there's nothing I can write that wouldn't break your willing suspension of disbelief about events after an intelligence explosion, then there's nothing I can write to fix that, and nothing you can suggest to add to my story's background; both our time might be spent more productively (by our own standards) if we focus on our respective projects.

Comment author: ChristianKl 13 October 2014 10:07:56PM -1 points

If you have a world where you can predict events after an intelligence explosion, then that intelligence explosion by definition isn't a singularity event.

Comment author: DataPacRat 13 October 2014 10:22:34PM 2 points

There are several working definitions of the term 'singularity'. If the definition you use means that you think a story involving that word is inherently implausible, then one possibility would be to assume that wherever you see me write that term, I instead wrote, say, "That weird event where all the weird stuff happened that seemed a lot like what some of those skiffy authors used to call the 'Singularity'", or "Blamfoozle", or anything else which preserves most of my intended meaning without forcing you to get caught up in this particular aspect of this particular word.