There are so many theories about how future AIs will take over the world, yet in the real world software is evil because it DOESN'T WORK.
It takes almost a minute to open a damn .txt file on my Windows 10 desktop PC. Things were actually slightly faster in Windows 95. Websites keep loading slower and slower. Bloatware keeps expanding while delivering less functionality. Click and click and click and nothing happens.


Software isn't evil because it evolves beyond the user's goals toward alien goals, but because the programmers' goals are opposed to the user's goals. Microsoft hired every devil in hell to work overtime to make it keep crashing and freezing while unstoppably downloading more malware.
I cannot describe in words how hypersatanically evil the world's programmers are: malevolent, diabolical, demonic (that last one not literally). It's been going on for decades, the constant problem of my daily life.
My Pixma scanner is held hostage by the shitbastards at Canon Corporation because an attached ink cartridge apparently expired. My h2owireless cellphone SIM card stopped working with $7.50 on the account, and they won't respond in any way. There is NEVER NEVER NEVER any useful info on their websites.


I understand people on this site are deeply worried that future software, instead of being able to do almost nothing for the user, will suddenly be able to do everything for itself. 
But everything goes wrong so badly that even things going wrong will go wrong. 
People probably WILL use AIs eventually to invent designer neurotoxins or interacting retroviruses or nano shrapnel or something. 

5 comments:

Seems misleading to me because

  • IME, the premise is untrue. My Windows crashes much more rarely than it used to, my PC boots much faster, most websites I use work more smoothly than they used to, etc.

  • I suspect observations about regular software aren't all that relevant for Machine Learning.

I don't think that your experience with consumer software is at all similar to the engineer/scientist/modeler experience with large language and predictive models. To be clear, those suck too, and require an insane amount of effort and thought to get anything useful done. But they suck in ways that can be fixed over time, and in ways that (seem to) correlate with the underlying complexity of the world.

Complaining about corporate decisions that happen to be implemented in software doesn't quite connect, at least by that pathway. Worrying that consumer software usually seems adversarial to the consumer, and that there may be a similar problem where AI is adversarial to everyone but the "owner" of the AI, is probably justified.

But that's not "software sucks"; it's "software creators are a mix of evil and stupid".

Yes, but it does show a tendency of huge complex networks (operating system userbases, the internet, human civilization) to rapidly converge to a fixed level of crappiness that absolutely won't improve, even as more resources become available.
Of course there could be a sudden transition to a new state with artificial networks larger than the above.

For the past week my Windows 10 box has been almost unusable, as it spent its days wasting kilowatts and processor cycles downloading worse-than-useless malware "updates" with no way to turn them off!

Evil is the most fundamental truth of the world. The Singularity cannot happen soon enough...

I just spent four hours trying to get a new cellphone to work (that others insist I should have), and failed totally.

There is something fantastically wrong with this shitplanet, but it is something completely different from what anyone is willing to talk about.