The exception is that the Big Tech companies (Google, Amazon, Apple, Microsoft, although importantly not Facebook, seriously f*** Facebook) have essentially unlimited cash, and their funding situation changes little (if at all) based on their stock price.
I'm not sure this is true right now. The entire tech industry seems more cash-constrained than usual, given high interest rates, layoffs across the sector, and fears of a coming recession.
Hi, I'm the user who asked this question. Thank you for responding!
I see your point that an AGI would destroy humanity intentionally, whereas engineered bugs would only wipe us out "by accident". But that's conditional on the AGI having "destroy humanity" as a subgoal. Most likely, a typical AGI will have some mundane, neutral-to-benevolent goal like "maximize profit by running this steel factory and selling steel". Maybe the AGI can achieve that by taking over an iron mine somewhere, or by taking over a country (or the world) and enslaving its citizens, or...
[W]iping out humanity is the most expensive of these options and the AGI would likely get itself destroyed while trying to do that[.]
It would be pretty easy and cheap for something much smarter than a human to kill all humans. The classic scenario is:
...A. [...] The notion of a 'superintelligence' is not that it sits around in Goldman Sachs's basement trading stocks for its corporate masters. The concrete illustration I often use is that a superintelligence asks itself what the fastest possible route is to increasing its real-world power, and then [...]
I can't see the original comment, but this response seems really disconnected from what most people mean by "trans rights". In my experience, "trans rights" typically refers to a constellation of rights like:
It does not imply the right to murder other people, as doing so vi...