
ChristianKl comments on Open thread, Oct. 13 - Oct. 19, 2014 - Less Wrong Discussion

5 Post author: MrMind 13 October 2014 08:17AM


Comments (355)


Comment author: ChristianKl 13 October 2014 03:05:46PM -2 points

The whole point of the concept of singularity is that we don't know what will happen afterwards.

Comment author: Luke_A_Somers 13 October 2014 03:51:47PM 6 points

Some things, however, are less plausible than others.

In fiction, you have to make it up, but you can't make it something implausible.

Comment author: ChristianKl 13 October 2014 09:23:00PM 1 point

In fiction, you have to make it up, but you can't make it something implausible.

But any real scenario will seem implausible. That's what the idea of the singularity is about. If you believe that you can predict in any sense what the world will look like afterwards, "singularity" is a very poor term to use.

Comment author: Luke_A_Somers 13 October 2014 10:08:21PM *  4 points

I think it is a poor term.

Still, it can only mean 'a whole lot less predictable than usual', not 'MAX ENTROPY ALL THE TIME'. Physics will still apply. Given that people survived and have lives worth writing stories about, we are at least within 'critical failure' distance of friendliness in AI. That narrows things very considerably.

A lot of the unpredictability of the singularity arises from a lack of proof of friendliness. Once you've cleared that (or nearly), the range of possibilities isn't singular in nature.

Comment author: DataPacRat 13 October 2014 10:02:37PM 1 point

If there's nothing I can write that wouldn't break your Willing Suspension of Disbelief about events after an intelligence explosion, then there's nothing I can write that would satisfy you, and nothing you can suggest to add to my story's background; and both of us might spend our time more productively (by our own standards) by focusing on our respective projects.

Comment author: ChristianKl 13 October 2014 10:07:56PM -1 points

If you have a world where you can predict events after an intelligence explosion, then that intelligence explosion by definition isn't a singularity event.

Comment author: DataPacRat 13 October 2014 10:22:34PM 2 points

There are several working definitions for the term 'singularity'. If the definition you use means that you think a story involving that word is inherently implausible, then one possibility would be to assume that where you see me write that term, I instead write, say, "That weird event where all the weird stuff happened that seemed a lot like what some of those skiffy authors used to call the 'Singularity'", or "Blamfoozle", or anything else which preserves most of my intended meaning without forcing you to get caught up in this particular aspect of this particular word.

Comment author: James_Miller 13 October 2014 04:42:36PM 3 points

With high probability we do, unfortunately.

Comment author: ChristianKl 13 October 2014 09:22:08PM 1 point

With high probability there won't be any humans afterwards, but that doesn't tell you what the world would look like.

Comment author: James_Miller 13 October 2014 10:45:10PM 2 points

Disagree, since over 99% of what I care about would be the same across all post-singularity states that lack lifeforms I care about. Analogously, if I knew that tomorrow I would be killed and have some randomly selected number written on my chest I would believe that today I knew everything important about my personal future.

Comment author: ChristianKl 14 October 2014 02:12:22PM 1 point

If you want to tell a story about that world, then you need to know something about what the world looks like besides "there are no humans".

Comment author: gjm 13 October 2014 03:49:57PM 3 points

We also don't know what will have happened by 200 years from now (singularity or no singularity), but that is no obstacle to writing science fiction set 200 years in the future.