All you need is:

A - the ability to change/review/improve/modify its own code

B - someone to press the button (see the sketch below)
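
To make A and B concrete, here is a minimal sketch of the loop they describe, in Python. Everything in it is hypothetical: `query_model` stands in for a call to some code-capable model (here it is just a stub that echoes the code back), and `target_script.py` is a throwaway file. The point is only the shape of the loop, not a working self-improving system.

```python
# Minimal sketch of A + B: a loop that feeds a program's own source to a
# code model and runs whatever comes back. `query_model` is a hypothetical
# placeholder (a stub that returns the code unchanged), so this shows only
# the shape of the loop, not an actual self-improving system.

from pathlib import Path
import subprocess
import sys


def query_model(prompt: str) -> str:
    """Stand-in for a call to a code-capable LLM (hypothetical)."""
    # A real call would return revised source; the stub echoes the original.
    return prompt.split("---CODE---", 1)[1]


def improve_once(target: Path) -> None:
    """One iteration: ask the 'model' to revise the script, then run it."""
    source = target.read_text()
    revised = query_model("Improve this program.\n---CODE---" + source)  # "A"
    target.write_text(revised)  # overwrite the program with its own revision
    subprocess.run([sys.executable, str(target)], check=False)


if __name__ == "__main__":
    script = Path("target_script.py")
    script.write_text("print('hello from the target script')\n")
    for _ in range(3):  # "B" is simply letting this loop run unattended
        improve_once(script)
```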

 

So, P(FOOM) = P(A ∧ B)


"A" has already happened a while ago,
as LLMs can debug/change/review/improve/modify/etc code
in any programming language, and/or human languages

so P(A) is also not only 1, but it already happened

Therefore, 

P(FOOM) = P(A ∧ B) = P(B)
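
Spelling out that step (a standard probability identity, nothing specific to FOOM): since P(A) = 1, the part of B lying outside A has probability zero, so the conjunction collapses to P(B).

```latex
% If P(A) = 1 then P(\neg A) = 0, and (B \wedge \neg A) \subseteq \neg A, so:
\[
P(A \wedge B) \;\le\; P(B)
  \;=\; P(A \wedge B) + P(B \wedge \neg A)
  \;\le\; P(A \wedge B) + P(\neg A)
  \;=\; P(A \wedge B),
\]
\[
\text{hence } P(\mathrm{FOOM}) = P(A \wedge B) = P(B).
\]
```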
 

But the immense competitive advantage that a continuously self-programming/self-improving AI would hand its developers is blatantly self-evident, so I trust there is no need to explain why "B" will inevitably be done.

And even if, at first, it will most likely not be technically possible and/or allowed to keep iterating on itself continuously and/or uninterruptedly, it will inexorably happen eventually, whether we like it or not.

Therefore P(B) is effectively 1 (i.e., 100%).

So although "B" is necessary in practice, and thus not immaterial, in effect it is just a formality.
 

P(FOOM) = P(A ∧ B) = P(B) = 1

So P(FOOM) is effectively 1.
One could even say that we are already in FOOM;
we just don't realize it yet.

And it's just a matter of time before someone curious/lazy/greedy
triggers "B", just to see what happens.

Have fun...