Some of the memes you referenced do seem "cringe" to me, but people have different senses of humor. I'm not sure what the issue is with someone posting memes they personally find funny.
If you disagree with the point that the memes are making, that's different, but can you give an example of something in one of the memes she posted that you thought was invalid reasoning? You called her content "dark arts tactics" and said:
"It feels like it is trying to convince me of something rather than make me smarter about something. It feels like it is trying to convey feelings at me rather than facts."
but so far you've only explained how it makes you feel, not what message it's conveying.
Huh. I first heard of Greg Egan in the context of Eliezer mentioning him as an SF writer he liked, iirc. Kind of ironic he ended up here.
What's the b word?
I still think it was an interesting concept, but I'm not sure how deserving of praise this is since I never actually got beyond organizing two games.
He said it was him on Joe Rogan's podcast.
you find some pretty ironic things when rereading 17-year-old blog posts, but this one takes the cake.
If you look over all possible worlds, then asking "did the coin come up Heads or Tails" as if there's only one answer is incoherent. If you look over all possible worlds, there's a ~100% chance the coin comes up as Heads in at least one world, and a ~100% chance the coin comes up as Tails in at least one world.
But from the perspective of a particular observer, the question they're trying to answer is a question of indexical uncertainty - out of all the observers in their situation, how many of them are in Heads-worlds, and how many of them are in Tails-worlds? It's true that there are just as many Heads-worlds as Tails-worlds - but 2/3 of observers are in the latter worlds.
Or to put it another way - suppose you put 10 people in one house, and 20 people in another house. A given person should estimate a 1/3 chance that they're in the first house - and the fact that 1 house is half of 2 houses is completely irrelevant. Why should this reasoning be any different just because we're talking about possible universes rather than houses?
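If it helps to make the counting concrete, here's a quick simulation of the two-house setup (my own illustration, just using the numbers from the example):

```python
import random

# Two-house example: 10 people in house A, 20 in house B.
# Pick one of the 30 people uniformly at random and see how often
# they turn out to be in house A.
trials = 100_000
in_house_a = 0
for _ in range(trials):
    person = random.randrange(30)  # people 0-9 live in A, 10-29 live in B
    if person < 10:
        in_house_a += 1

print(in_house_a / trials)  # ~0.333 - "1 house out of 2" never enters into it
```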
I think you're overestimating the intended scope of this post. Eliezer's argument involves multiple claims - A, we'll create ASI; B, it won't terminally value us; C, it will kill us. As such, people have many different arguments against it. This post is about addressing a specific "B doesn't actually imply C" counterargument, so it's not even discussing "B isn't true in the first place" counterarguments.
While you're quite right about numbers on the scale of billions or trillions, I don't think it makes sense in the limit for the prior probability of X people existing in the world to fall faster than X grows in size.
Certain series of large numbers grow larger much faster than they grow in complexity. A program that returns 10^(10^(10^10)) takes fewer bits to specify (relative to most reasonable systems of specifying programs) than a program that returns 32758932523657923658936180532035892630581608956901628906849561908236520958326051861018956109328631298061259863298326379326013327851098368965026592086190862390125670192358031278018273063587236832763053870032004364702101004310417647840155719238569120561329853619283561298215693286953190539832693826325980569123856910536312892639082369382562039635910965389032698312569023865938615338298392306583192365981036198536932862390326919328369856390218365991836501590931685390659103658916392090356835906398269120625190856983206532903618936398561980569325698312650389253839527983752938579283589237325987329382571092301928* - even though 10^(10^(10^10)) is by far the larger number. And it only takes a linear increase in complexity to make it 10^(10^(10^(10^(10^(10^10))))) instead.
*I produced this number via keyboard-mashing; it's not anything special.
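To make the complexity comparison concrete, here's a toy sketch where "bits to specify" is crudely stood in for by the length of the program text (a rough illustration only, not a real complexity measure):

```python
# Crude stand-in for description length: the number of characters in the
# program text. A tower of exponents stays short no matter how huge its value;
# a keyboard-mashed number has no description shorter than its own digits.
simple_but_huge = "10**(10**(10**10))"
print(len(simple_but_huge))   # 18 characters, astronomically large value

# Adding more exponent levels barely lengthens the description:
even_huger = "10**(10**(10**(10**(10**(10**10)))))"
print(len(even_huger))        # 36 characters, unimaginably larger value
```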
Consider the proposition "A superpowered entity capable of creating unlimited numbers of people sampled a random program out of all possible programs, weighted by the complexity of those programs, ran it with its output read as an integer, and then created that many people."
If this happened, the probability that their program outputs at least X would fall much slower than X rises, in the limit. The sum doesn't converge at all; the expected number of people created would be literally infinite.
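As a toy version of why the expectation blows up (my own construction, not part of the original argument): suppose the "program" of length n gets prior weight 2^-n but outputs 2^(2^n) people. Then the n-th term of the expected-population sum is 2^(2^n - n), which grows without bound:

```python
# Toy divergence check (illustrative numbers only). If a length-n program has
# prior weight 2**-n and outputs 2**(2**n) people, the n-th term of the
# expected-population sum is 2**-n * 2**(2**n) = 2**(2**n - n).
# Its log2 grows without bound, so the terms - and the sum - diverge.
for n in range(1, 9):
    log2_term = 2**n - n
    print(f"n={n}: log2 of term = {log2_term}")
```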
So as long as you assign greater than literally zero probability to that proposition - and there's no such thing as zero probability - there must exist some number X such that you assign greater than 1/X probability to X people existing. In fact, there must exist some number X such that you assign greater than 1/X probability to X million people existing, or X billion, or so on.
(btw, I don't think that the sort of SIA-based reasoning here is actually valid - but if it were, then yeah, it would imply that there are infinitely many people.)
wow I wouldn't have expected LessWrongers' long-suppressed sexual instincts to be crypto scams - no, you know what, if anyone got turned on by crypto scams it would probably be us.
(more seriously: the link is broken.)