bugsbycarlin

The truly interesting thing here is that I would agree unequivocally with you if you were talking about any other kind of 'cult of the apocalypse'.

 

This has Arrested Development energy ^_^ https://pbs.twimg.com/media/FUHfiS7X0AAe-XD.jpg 

 

The personal consequences are there. They're staring you in the face with every job in translation, customer service, design, transportation, and logistics that gets automated in such a way that there is no value you can possibly add to it

...

2-3 years ago I was on track to become a pretty good illustrator, and that would have been a career I would have loved to pursue. When I saw the progress AI was making in that area - and I was honest with myself about this quite a bit earlier than other people, who are still going through the bargaining stages now - I was disoriented and terrified in a way quite different from the 'game' of worrying about some abstract, far-away threat.

This is the thing to worry about. There are real negative consequences to machine learning today, sitting inside the real negative consequences of software's dominance, and we can't escape the flat fact that a life of work is going away for most people. The death cult vibe is the wild leap. It does not follow that AI is going to magically gain the power to gain the power to gain the power to kill humanity faster than we can stop disasters.


 

A helpful tool on the way to landing and getting sober is exercise. Exercise is essentially a displacement, like any of the other addictions, but it has the unique and useful feature that it processes out your chemicals, leaving you with fewer stress chemicals in circulation and a refractory period before your body can make more.

Almost no matter your physical capabilities, there is something you can go do that makes you sweat and tires you out... and breaks the stress-focus-stress-focus cycle.

 

Edit: btw, this is great stuff, very good for this community to name it and offer a path away.

Related, but addressing a very different side of the AI risk mindset: https://idlewords.com/talks/superintelligence.htm