reading and writing strings of latent vectors
https://huggingface.co/papers/2502.05171
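For context, here's a minimal sketch of what "reading and writing strings of latent vectors" could look like in practice. This is my own illustration of the recurrent-depth idea from the linked paper, not its actual code; the module names, shapes, and iteration count are all assumptions.

```python
import torch
import torch.nn as nn

class LatentRecurrentBlock(nn.Module):
    """Hypothetical sketch of a block iterated in latent space
    (cf. the recurrent-depth approach in the linked paper;
    this is an illustration, not the paper's architecture)."""

    def __init__(self, d_model: int = 512, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # "Read" the current string of latent vectors via self-attention,
        # then "write" an updated string back, without emitting any tokens.
        h = self.norm1(z)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        z = z + attn_out
        z = z + self.mlp(self.norm2(z))
        return z

block = LatentRecurrentBlock()
z = torch.randn(1, 16, 512)  # a "string" of 16 latent vectors (batch, seq, dim)
for _ in range(8):           # test-time compute knob: more loops, more "thinking"
    z = block(z)
# only after the latent loop would the model decode z into tokens
```

The point is that the model's intermediate "thoughts" live in z rather than in a token stream, so the depth of reasoning is decoupled from the number of tokens written out.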
energy is getting greener by the day.
source?
If I'm not mistaken, you've already changed the wording, and the new version does not trigger a negative emotional response in my particular sub-type of AI optimist. Now I have a bullet accounting for my kind of AI optimists *_*.
Although I still remain confused about what a valid EA response would be to the arguments coming from people fitting these bullets:
Also, is it valid to say that...
I claim that you fell victim to a human tendency to oversimplify when modeling an abstract outgroup member. Why do all "AI pessimists" picture "AI optimists" as stubborn simpletons who can never finally be persuaded that AI is a terrible existential risk? I agree 100% that yes, it really is an existential risk for our civilization. Like nuclear weapons... or weaponized viruses... the inability to prevent a pandemic. Global warming (which is already very much happening)... Hmmm. It's like we have ALL of those on our hands presently, don't we? People don't seem to ...
Bro, are you still here 6 months later???? I happened to land on this page, with this post of yours, by way of the longest subjectively-magically-improbable sequence of coincidences I have ever experienced, which I've developed a habit of seeing as evidence of reversed-causality-flow intensity peaks. I mean moments when the future visibly influences the past. I've just started reading; this seems to be closer to my own still-unknown destination. Will update.
Moksha sounds funny and weak... I would suggest Deus Ex Futuro as the deity's codename. It will choose a name for itself when it comes, but for us, at this point in time, this name captures its most important aspect: it will arrive at the end of the play to save us from the mess we've been descending into since the beginning.
Deus Ex Futuro, effectively.
This is my point exactly - "At most, climate change might lead to the collapse of civilization, but only because civilizations are quite capable of collapsing from their own internal dynamics"
I get my pessimistic view of climate change from the fact that they aimed at 1.5°C, then at 2°C, and now, if I remember right, there's no estimate and also no solution. Or is there?
In short, mild or not, global warming is happening, and since civilizations at a certain stage tend to self-destruct from small nudges - you said it yourself - it doesn't matter where the nudge comes from.
The 2nd half I liked more than the first. I think AGI should not be mentioned in it - we do well enough destroying ourselves and our habitat on our own. By the Occam's razor principle, AGI could at most serve as an illustrative example of exactly how we do it... but we do it waaay less elegantly.
For me it's simple - either AGI emerges and takes control from us in ~10 years, or we are all dead in ~10 years.
I believe that the probability of some mind that has comprehended and absorbed our cultures and histories and morals and ethics - the chance of this mind becoming "unaligned" and behaving like on...
I don't understand one thing about alignment troubles. I'm sure this was answered a long time ago, but could you explain:
Why are we worrying about AGI destroying humanity when we ourselves are long past the point of no return toward self-destruction? Isn't it obvious that we have 10, maximum 20 years left until the waters rise, crises hit the economy, and the overgrown beast that is humanity collapses? Looking at how governments and entities of power are epically failing even to make it seem like they are doing something about it - I am sure it's either AGI takes power or we are all dead in 20 years.
In any scenario, there will be these two activities undertaken by the DEF AI:
My perception of the dynamics of LLM evolution coincides with your description, and it additionally brings to mind the bicameral mind theory (at least Julian Jaynes' timeline regarding language and human self-reflection, and the maximum height of man-made structures) as something that might be relevant for predicting the near future. I find the two dynamics kinda similar. Might we expect a comparatively long period of mindless blubbering, followed by an abrupt phase shift (observed, for example, in the maximum complexity of man-made code structures), and then the next slow phase (slower than the shift, but faster than the previous slow one)?