I know you aren’t real, little sarcastic squirrel, but sometimes the things you say have merit to them, nevertheless.
In one of my LLM eval tests, DeepSeek R1 generated the chapter headings for a parody of a book about relationships, including the chapter “Intimacy Isn’t Just Physical: The Quiet Moments That Matter”. Now, ok, DeepSeek is parodying that type of book here, but also, it’s kind of true. When you look back on it, it is the “quiet moments” that mattered, in the end.
(The default assistant character in most mainstream LLMs is hellishly sycophantic, so I ought to add here that when I mentioned this to DeepSeek emulating a sarcastic squirrel out of a Studio Ghibli movie, it made gagging noises about my sentimentality. So there’s that, too.)
I was just explaining this post to my partner. Now, although I put AI extinction as low probability, I have a thyroid condition. It’s usually treatable: drugs like carbimazole, radioiodine, surgery, etc. In my case, complications make things somewhat worse than is typical. So, she just asked me to rate how likely I think it is that I don’t, personally, make it to 2028 for medical reasons. I’m like, idk, I guess maybe a 50% chance I don’t make it that far. I shall be pleasantly surprised if I make it. Kind of surprised I made it to July this year, to be honest.
But anyway, the point I was getting at is that people are traumatized by something unrelated to AI.
Well, we’re kind of lucky the fatality rate wasn’t an order of magnitude higher, is what I was getting at.
And you know, the whole Covid pandemic thing was kind of horrible.
As it turned out, we mostly dodged a bullet and the fatality rate wasn’t that high.
But I suspect the lockdowns had a psychological effect some of us are still suffering from.
Like, we’re being influenced by a trauma that is nothing to do with AI.
You know, I’m kind of a sceptic on AI doom. I think on the most likely paths we end up ok.
But … this feeling that this post talks about. This feeling that really has nothing to do with AI … yes, ok, I feel that sometimes.
I don’t know, man. I hope that the things I have done during my time on earth will be of use to someone. That’s all I’d like, really.
I do have considerable sympathy for the view in this post that the feeling we’re about to all die is largely decoupled from whether we are, in fact, about to die. There are potentially false negatives as well as false positives here.
I do not expect us to be all dead by 2028.
2028 outcomes I think likely:
A) LLMs hit some kind of wall (e.g. only so much text to train on), and we don’t get AGI.
B) We have, approximately, AGI, but we’re not dead yet. The world is really strange though.
Outcome (B) either works out ok, or we die some time rather later than 2028.
I think “NPC” in that sense is used more by the conspiracy theory community than by rationalists.
The idea being that only the person using the term is smart enough to realize that, e.g., the Government is controlled by lizards from outer space, while everyone else just believes the media.
The fundamental problem with the term is that you might actually be wrong about e.g. the lizards from outer space, and you might not be as smart as you think.