LESSWRONG
Intelligence Explosion
• Applied to The Evolution of Humans Was Net-Negative for Human Values by Zack_M_Davis, 1mo ago
• Applied to What is the nature of humans general intelligence and it's implications for AGI? by Will_Pearson, 1mo ago
• Applied to Carl Shulman On Dwarkesh Podcast June 2023 by Moonicker, 3mo ago
• Applied to A thought experiment for comparing "biological" vs "digital" intelligence increase/explosion by Super AGI, 3mo ago
• Applied to AGI will be made of heterogeneous components, Transformer and Selective SSM blocks will be among them by Roman Leventov, 4mo ago
• Applied to LLMs May Find It Hard to FOOM by RogerDearnaley, 6mo ago
• Applied to A Simple Theory Of Consciousness by SherlockHolmes, 9mo ago
• Applied to How Smart Are Humans? by Joar Skalse, 10mo ago
• Applied to Do not miss the cutoff for immortality! There is a probability that you will live forever as an immortal superintelligent being and you can increase your odds by convincing others to make achieving the technological singularity as quickly and safely as possible the collective goal/project of all of humanity, Similar to "Fable of the Dragon-Tyrant." by Oliver--Klozoff, 10mo ago
• Applied to Carl Shulman on The Lunar Society (7 hour, two-part podcast) by ESRogs, 10mo ago
• Applied to What is Intelligence? by IsaacRosedale, 1y ago
• Applied to A basic mathematical structure of intelligence by Golol, 1y ago
• Applied to A method for empirical back-testing of AI's ability to self-improve by Michael Tontchev, 1y ago
• Applied to Why I'm Sceptical of Foom by DragonGod, 1y ago
• Applied to Power-Seeking AI and Existential Risk by Antonio Franca, 2y ago
• Applied to Towards a Formalisation of Returns on Cognitive Reinvestment (Part 1) by DragonGod, 2y ago
• Applied to The Hard Intelligence Hypothesis and Its Bearing on Succession Induced Foom by DragonGod, 2y ago