Thinking about responsible gambling: something like up-front long-term commitment should solve a lot of problems. You have to decide right away and lock up the money you're going to spend this month, which separates the decision from the impulse to spend.
I tried to derive it, and it turned out to be easy: BC is the wheel pair, CD is the surface, with the slow medium above. Both wheels roll for the same time, so AC/Vfast = AB/Vslow, and at the critical angle D touches the small circle (the inner wheel is on the verge of getting out of the medium), so ACD is a right triangle. Therefore AC*sin(ACD) = AD (and AD is the same as AB), so sin(ACD) = AB/AC = Vslow/Vfast. Checking Wikipedia, it is the same angle (BC here is the wavefront, so the velocity vector is normal to it). Honestly, I am a bit surprised this analogy works so well.
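The result sin(θ_c) = Vslow/Vfast matches the standard optics formula sin(θ_c) = n_fast/n_slow, since v = c/n. A quick numerical sanity check (water/air indices are textbook values; everything else is just my illustration):

```python
import math

# Critical angle from the wheel-pair picture: sin(theta_c) = v_slow / v_fast.
# In optics v = c / n, so v_slow / v_fast = n_fast / n_slow.
c = 3.0e8                 # speed of light in vacuum, m/s
n_water, n_air = 1.33, 1.00
v_slow = c / n_water      # speed in water (the slow medium)
v_fast = c / n_air        # speed in air (the fast medium)

theta_c = math.degrees(math.asin(v_slow / v_fast))
print(round(theta_c, 1))  # ~48.8 degrees, the textbook critical angle for water -> air
```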
I read about a better analogy a long time ago: use two wheels on an axle instead of a single ball; then refraction comes out naturally. Also, instead of a difference in friction, I think it is better to use a difference in elevation, so things slow down when they go into an area of higher elevation and speed back up going down.
It is defecting against cooperate-bot.
From an ASI's standpoint, humans are a type of rock. Not capable of negotiating.
This experience-based primitivity also means inter-temporal self-identification only goes one way. Since there is no access to subjective experience from the future, I cannot directly identify which person would be my future self. I can only say which past person is me, as I have the memory of experiencing things from their perspective.
While there is a large difference in practice between recalling a past event and anticipating a future event, on a conceptual level there is no meaningful difference. You don't have direct access to past events; memory is just an especially simple and reliable case of inference.
It would be funny if the hurdle presented by tokenization were somehow responsible for LLMs being smarter than expected :) Sounds exactly like the kind of curveball reality likes to throw at us from time to time :)
But to regard these as a series of isolated accidents is, I think, not warranted, given the number of events which all seem to point in a mysteriously similar direction. My own sense is more that there are strange and immense and terrible forces behind the Poverty Equilibrium.
Reminded me of The Hero With A Thousand Chances
Maybe societies with less poverty are less competitive.
e^3 is ~20, so for large n you get a ~95% chance of success by making 3n attempts: if each attempt succeeds with probability 1/n, the chance that all 3n fail is (1 - 1/n)^(3n) ≈ e^(-3) ≈ 1/20.
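A quick numerical check of the approximation (n = 1000 is just an arbitrary "large n"):

```python
import math

n = 1000                             # per-attempt success chance is 1/n
p_fail_all = (1 - 1/n) ** (3 * n)    # probability that all 3n attempts fail
p_success = 1 - p_fail_all

print(round(p_success, 3))           # ~0.95
print(round(1 - math.exp(-3), 3))    # the e^3 ~ 20 approximation: 1 - 1/e^3 ~ 0.95
```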