If there are other resources doing something similar, please link them in the comments so I can use the information to improve the guide (with a reference). Thanks!
TIME MOVED TO 12 PM
Sorry to change it at the last minute, but I now have plans later on Saturday that I can't move, so I'm shifting the meetup to an earlier time.
What are the transaction costs if you need to do three transactions? (Rough fee math sketched below.)
1. Get the refund bonuses from the producer
2. Get the pledges from funders
3. Return the pledges + bonus if it doesn't work out
Also, will PayPal even allow this kind of money transfer?
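For concreteness, here's a minimal sketch of the failure-case math, assuming PayPal's standard 2.9% + $0.30 per-transaction fee (the rate, the $100 pledge, and the $5 bonus are all illustrative assumptions, not quotes from PayPal or from the actual contract):

```python
# Rough round-trip fee estimate for the three transactions above,
# assuming PayPal's standard 2.9% + $0.30 per transaction
# (an assumption -- check current PayPal pricing).

FEE_RATE = 0.029   # assumed percentage fee per transaction
FEE_FIXED = 0.30   # assumed fixed fee per transaction (USD)

def paypal_fee(amount: float) -> float:
    """Fee charged on a single transaction of `amount` dollars."""
    return amount * FEE_RATE + FEE_FIXED

pledge = 100.00  # hypothetical pledge from one funder
bonus = 5.00     # hypothetical refund bonus from the producer

# 1. Producer sends the refund bonus.
# 2. Funder sends the pledge.
# 3. Project fails: pledge + bonus is returned to the funder.
fees = paypal_fee(bonus) + paypal_fee(pledge) + paypal_fee(pledge + bonus)

print(f"Total fees across the three transactions: ${fees:.2f}")
# With these numbers: 0.445 + 3.20 + 3.345 = ~$6.99, i.e. the fees
# alone eat more than the $5 refund bonus in the failure case.
```

If those numbers are anywhere near right, the round-trip fees can exceed the refund bonus itself, which is why I'm asking.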
Congrats on getting funded way above your threshold!
Fair enough regarding Twitter
Curious what your thoughts are on my comment below
I'm talking about doing a good enough job to avoid takes like these: https://twitter.com/AI_effect_/status/1641982295841046528
50k views on the tweet. This one tweet probably matters more than all of the Reddit comments put together.
I don't find this argument convincing. I don't think Sam did a great job either, but that's also because he has to be super coy about his company/plans/progress/techniques, etc.
The Jordan Peterson comment was making fun of Lex and a positive comment for Sam.
Besides, I can think Sam did kind of badly and Eliezer did kind of badly, but still expect Eliezer to do much better!
I'm curious to know your rating of how you think Eliezer did compared to what you'd expect is possible with 80 hours of prep time, including the help of close friends/co-workers.
I would rate his episode at around a 4/10
Why didn't he have a pre-prepared, well-thought-out list of convincing arguments, intuition pumps, stories, analogies, etc. that would be easy for a semi-informed listener to engage with? He was clearly grasping for them on the spot.
Why didn't he have quotes from top respected AI people saying things like "I don't think we have a solution for superintelligence" or "AI alignment is a serious problem"?
Why did he not have written notes? Seriously... why did he not prepare notes? (He could have paid someone who knows his arguments really well to prepare notes for him.)
How many hours would you guess Eliezer prepared for this particular interview? (Maybe you know the true answer; I'm curious.)
How many friends/co-workers did Eliezer ask for help in designing great conversation topics, responses, quotes, references, etc.?
This was a 3-hour-long episode consumed by millions of people. He had the mind share of roughly 6 million hours of human cognition (~2 million listeners × 3 hours), and this is what he came up with? Do you rate his performance higher than a 4/10?
I expect Rob Miles, Connor Leahy, or Michaël Trazzi would have done enough preparation and had a better approach, and could have done an 8+/10 job. What do you think of those 3? Or even Paul Christiano.
In my opinion, Eliezer should spend whatever points he has with Lex to get one of those four on a future episode.
The easiest point to make here is Yud's horrible performance on Lex's pod. It felt like he did no prep, and he brought no notes/outlines/quotes??? Literally why?
Millions of educated viewers and he doesn't prepare... That doesn't seem very rational to me. It doesn't seem like systematically winning to me.
Yud saw the risk of AGI way earlier than almost everyone and has thought a lot about it since then. He has some great takes and some mediocre takes, but all of that doesn't automatically make him a great public spokesperson!!!
He did not come off as convincing, helpful, kind, interesting, well-reasoned, humble, very smart, etc.
To me, he came off as somewhat out of touch, arrogant, weird, anxious, scared, etc. (to the average person that has never heard of Yud before the Lex pod)
Toby and Elon did today what I was literally suggesting: https://twitter.com/tobyordoxford/status/1627414519784910849
@starship006, @Zack_M_Davis, @lc, @Nate Showell do you all disagree with Toby's tweet?
Should the EA and Rationality movements not signal-boost Toby's tweet?
Elon further signal-boosts Toby's post.
hmm...
"It is not prosocial to maximize personal flourishing under these circumstances."
I don't think this guide is at all trying to maximize personal flourishing at the cost of the communal.
It's actually very easy, quick, and cheap to follow the suggestions to improve your personal welfare. If society were going to go through a bumpy patch, I would want more reasonable, prepared, and thoughtful people to help steer humanity through and make it to the other side.
None of the ideas I suggested would hurt communal well-being, either.
I feel like it's a bit harsh to say "people shouldn't care about the most likely ways they could personally die, so I will downvote this post to make sure fewer people understand their main sources of risk."