All of Artir's Comments + Replies

Artir10

The asteroid case - it wouldn't be inevitable; it's just the knowledge that there are people out there substantially more motivated than me (and better positioned) to deal with it. For some activities where I'm really good (like... writing blogposts), and where I expect my actions to make more of an impact relative to what others would be doing, I could end up writing a blogpost about 'what you guys should do' and emailing it to some other relevant people.

 

Also, you can edit your post accordingly to reflect my update!

1Grant Demaree
Updated! Excuse the delay
Artir10

See here: https://www.lesswrong.com/posts/RsDwRmHGvf6GqaQkE/why-so-little-ai-risk-on-rationalist-adjacent-blogs?commentId=rumGEbYYnHBcRxx6c

Artir50

Hi, I'm the author of Nintil.com. As of today I think the endorsement I gave to Yarvin's argument was too strong, and I have just amended the post to make that clear. I added the following:

 

[Edit 2022-06-14]: I think some overall points in Yarvin's essay are valid (the world is indeed uncertain and there are diminishing returns to intelligence), but AGIs would still have the advantage of speed and parallelism (imagine the entirety of Google, but with no need for meetings, and where workweeks run at 100x speed). Even in the absence of superior inte...

1Grant Demaree
Many thanks for the update… and if it's true that you could write the very best primer, that sounds like a high-value activity. I don't understand the asteroid analogy, though. Does this assume the impact is inevitable? If so, I agree with taking no action. But in any other case, doing everything you can to prevent it seems like the single most important way to spend your days