tilek10

AI-Caused Extinction Ingredients

Below is what I see as required for AI-caused extinction to happen in the next few tens of years (years 2024-2050 or so). In parentheses is my very approximate probability estimate as of 2024-07-25, assuming all previous steps have happened.

  1. AI technologies continue to develop at approximately current speed or faster (80%)
  2. AI reaches a level where it can cause an extinction (90%)
  3. The AI that can cause an extinction does not have enough alignment mechanisms in place (90%)
  4. The AI executes an unaligned scenario (low, maybe less than 10%)
  5. Other AIs and humans are not able to notice and stop the unaligned scenario in time (50-50ish)
  6. Once the scenario is executed, humanity is never able to roll it back (50-50ish)
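The chain above can be multiplied out as a rough sketch. The point values for steps 4-6 are my reading of the stated ranges ("less than 10%" taken as 0.10, "50-50ish" as 0.50), not exact figures:

```python
# Conditional probabilities for each step, given all previous steps happened.
# Steps 4-6 use assumed point values for the stated ranges.
steps = {
    "development continues at current speed": 0.80,
    "reaches extinction-capable level": 0.90,
    "insufficient alignment mechanisms": 0.90,
    "executes an unaligned scenario": 0.10,
    "not noticed and stopped in time": 0.50,
    "never rolled back": 0.50,
}

p_extinction = 1.0
for name, p in steps.items():
    p_extinction *= p

print(f"overall: {p_extinction:.4f}")  # roughly 0.016, i.e. ~1.6%
```

Under these assumptions the chain multiplies out to about 1.6%, dominated by the low probability assigned to step 4.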
tilek10

"AI will never be smarter than my dad."


I believe a ranked comparison of intelligence between two artificial or biological agents can only be done subjectively, with someone deciding what they value.

Additionally, I think there is no agreement on whether the definition of "intelligence" should include knowledge. For example, can you consider an AI "smart" if it doesn't know anything about humans?

On the other hand, I value my dad's knowledge of my childhood and his model of my behavior across tens of years very highly. Thus, I will never agree that AI is smarter than my dad; I will only agree that AI is better at certain cognitive skills while my dad is better at certain other cognitive skills, even if some of those skills only require a simple memory lookup.

Whether a certain relatively general AI will be better than my dad at learning a random set of cognitive tasks is a different question. If it is, I will admit that it is better on certain, or maybe all, known generality benchmarks, but only I can decide which cognitive skills I value for myself.

tilek10

I see two related fundamental problems with the modern discourse around AI.

1) As with most words, there is no agreed upon definition on the term "intelligence".

2) Intelligence is often used in a ranked comparison as a single dimension, e.g. "AI smarter than a human".

When people use the word "intelligence", they often assume it should include various analytical, problem-solving, and learning skills. What is less clear is whether it includes creative skills, communication skills, emotional intelligence, etc.

I think that because people like simplifying concepts and ranking people against each other, the term "intelligence" started to be used as a single dimension: "your son is the smartest child in class", "my boyfriend is much smarter than hers", and of course "we will soon reach AI that is smarter than a human".

I believe this type of thinking led to the development and popularization of the IQ score at some point. The IQ score now seems to be mostly absent from the discourse between thought leaders.

I believe that in place of the term "intelligence", a less wrong and more precise & useful term for the discourse would be "cognitive skills". It much better represents that the concept involves multiple subjective dimensions rather than a single objective dimension.

This way we can evaluate various AIs more clearly by their various cognitive skills and not fall into the false sense that "intelligence" is a single clear dimension.
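The multi-dimensional point can be sketched concretely. All skill profiles and weightings below are made-up illustrations, not measurements: two agents scored on the same skill dimensions come out ranked differently depending on which dimensions the evaluator values.

```python
# Hypothetical skill profiles on three cognitive-skill dimensions
# (all numbers are invented for illustration).
ai  = {"memory_lookup": 0.99, "math": 0.95, "social_modeling": 0.30}
dad = {"memory_lookup": 0.60, "math": 0.50, "social_modeling": 0.95}

def score(agent, weights):
    # Weighted sum over the skill dimensions -- the weights encode
    # what a particular evaluator happens to value.
    return sum(agent[k] * weights[k] for k in agent)

# Evaluator who values recall and math
w1 = {"memory_lookup": 0.5, "math": 0.4, "social_modeling": 0.1}
# Evaluator who values modeling a person across decades
w2 = {"memory_lookup": 0.1, "math": 0.1, "social_modeling": 0.8}

print(score(ai, w1) > score(dad, w1))  # True: AI ranks higher under w1
print(score(ai, w2) > score(dad, w2))  # False: dad ranks higher under w2
```

The same two agents, the same skill scores, and two opposite "smarter than" verdicts: the ranking lives in the weights, which is exactly why a single intelligence dimension hides a subjective choice.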