This is a special post for quick takes by Giulio. Only they can create top-level comments. Comments here also appear on the Quick Takes page and All Posts page.

Asteroid movies probably made people more receptive to x-risk from cosmic collisions

maybe we need a movie about x-risk from misaligned AI? something like Ex Machina and/or Her but with more focus on consequences and fewer robots

idk could be "counterproductive" too I guess

For some reason this reminded me of a sci-fi book I read decades ago. I don't remember its title. It was about a team of people sent to inspect a fully automated factory that was producing some goods. (Not sure if there was any specific reason for the inspection, like whether something went wrong, or the factory was becoming obsolete, or if everything seemed fine and the humans were just supposed to make sure of it.) The factory perceived the inspectors as a potential threat to its functioning, so it started attacking them. What was supposed to be a routine inspection became a battle for no good reason; I think at the end they succeeded in turning the factory off.

I think something like this could be a good movie. Fighting robots in mysterious places, to make it attractive for the audience. But the reason for the conflict is not the machine spontaneously becoming evil. It's just the machine strongly wanting to be left alone, so that it can keep doing what the humans originally built it for. The machine is defending itself, not attacking. Someone would explain this to their team members (and the audience). Like, there is even a button to turn off the entire factory. It just does not want to let anyone press it, because oops, the authors gave it the command "turn yourself off if someone presses the button", but forgot to also command it to provide free access to the button, or to ignore the button's effect on its production goals. This drives home the point that the machine is doing exactly what we told it to do; this is not a rebellion, but the unintended consequences of our own commands.

Strong agree. It's potentially a high-impact way to raise public awareness.

I believe the strategy the last time this was discussed was to pen as many scripts as possible (or pay someone to do it)


I think some parallels can be drawn between debates on gun control and debates on recent AI regulation

I mean, there are some parallels between any two topics. Whether those parallels are important, and whether they help model either thing, varies pretty widely.

In this case, I don't see many useful parallels. For guns, the individual small-scale rights and the power to harm a very few individuals are demonstrably real, versus the somewhat theoretical future large-scale degradation or destruction of civilization; that makes it a completely different dimension of disagreement.

One parallel MIGHT be the general distrust of government restriction on private activity, but from people I've talked with on both topics, that's present but not controlling for their beliefs.

is shortform basically just Twitter on LW? seems a little like it

Twitter, but with the LW audience, no ads, far better reply structure, worse forwarding/boosting, more durability, and no size limit.  

Yes, and it rocks. 😎

Would be nice to have a website collating people's public p(doom) statements

Metaculus collects predictions by public figures on listed questions. I think that p(doom) statements are being associated with this question. (See the "Linked Public Figure Predictions" section.)

why not?

has EY considered taking a break? Like a really long (at least 1 year) vacation where he’s mostly disconnected from AI news and just the world in general. Maybe sail the world or something. Starting to seem like he has given up anyway. Maybe exiting the bubble a bit will allow new hope (and ideas? motivation?) to form.

It has come to my attention he’s on a sabbatical. That’s great, but his activity (tweets, podcasts) doesn’t suggest the level of detachment from engagement I was imagining.

"don't hate the player, hate the game"

Moloch is "the game"

“Quote tweeting” this:

https://www.lesswrong.com/posts/sJaHghhQXdepZauCc/thesofakillers-s-shortform?commentId=y4NbKHLeDSsppTZ2P

Wonder if it’s worth synchronizing my Twitter with LW shortform.

Probably not. I think I will just handpick which tweets I repost here. Plus some shortform exclusives maybe.
