Eliezer Yudkowsky's Shortform
by Eliezer Yudkowsky · 1st Apr 2023 · AI Alignment Forum · 1 min read
This is a special post for quick takes by Eliezer Yudkowsky. Only they can create top-level comments. Comments here also appear on the Quick Takes page and All Posts page.