Would be cool if one of the items was a nugget of "computation fuel" that could be used to allow a robot's register machine to run for extra steps. Or maybe just items whose proximity gives a robot extra computation steps. That way you could illustrate situations involving robots with quantitatively different levels of "intelligence". Could lead to some interesting strategies if you run programming competitions on this too, like worker robots carrying fuel to a mother brain.
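The proximity-based version of that fuel idea could be sketched in a few lines. Everything here is hypothetical (the names `FuelItem`, `Robot`, `steps_this_tick`, and the Chebyshev radius are all made up for illustration): each robot gets a base budget of register-machine steps per tick, plus a bonus from any fuel item close enough to it.

```python
from dataclasses import dataclass

@dataclass
class FuelItem:
    x: int
    y: int
    bonus_steps: int  # extra steps granted while a robot is nearby

@dataclass
class Robot:
    x: int
    y: int
    base_steps: int  # steps its register machine may run per tick

def steps_this_tick(robot, items, radius=2):
    """Base budget plus bonuses from fuel items within `radius` (Chebyshev distance)."""
    bonus = sum(
        item.bonus_steps
        for item in items
        if max(abs(item.x - robot.x), abs(item.y - robot.y)) <= radius
    )
    return robot.base_steps + bonus

# A worker dropping fuel next to a "mother brain" effectively boosts it:
brain = Robot(x=0, y=0, base_steps=100)
items = [FuelItem(x=1, y=1, bonus_steps=50), FuelItem(x=9, y=9, bonus_steps=50)]
# Only the nearby item counts, so the brain gets 150 steps this tick.
```

The worker-carrying-fuel strategy falls out naturally: moving a `FuelItem` into the radius of the mother brain raises its step budget without changing its program.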
Do you have thoughts on whether it's safe for a beginner to lift weights without in-person instruction? From what I hear, even small mistakes in form can cause injury, especially when adding weight as quickly as a beginner will. Is it worth the risk of trying to learn good form from only books and videos? My friend attempted Starting Strength for a month, got a pain in their knee and had to quit, and hasn't been able to get back into it because finding personal instruction is a huge hassle, especially if one isn't willing to pay a lot. Should they try again by themselves and just study those books and videos extra closely?
I can never understand why the idea that replicating systems might just never expand past small islands of clement circumstances (like, say, the surface of the Earth) gets so readily dismissed in these parts.
People in these parts don't necessarily have in mind the spread of biological replicators. Spreading almost any kind of computing machinery would be good enough to count, because it could host simulations of humans or other worthwhile intelligent life.
(Note that the question of whether simulated people are actually conscious is not that relevant to...
Anything that's just a trivial inconvenience definitely won't protect you from the NSA and probably won't even protect you from random internet people looking to ruin your life/reputation for fun.
The general impression I got from reading a lot of what gets posted in the various tulpa communities is that it's, at its core, yet another group of people who gain status within the group by trying to impress each other with how different or special their situation is.
Used to be, when I read stories about "astral projection" I thought people were just imagining stuff really hard and then making up exaggerated stories to impress each other. Then I found out it's basically the same thing as wake-initiated lucid dreaming,...
Please consider sending some Bitcoins to SI at address 1HUrNJfVFwQkbuMXwiPxSQcpyr3ktn1wc9
https://blockchain.info/address/1HUrNJfVFwQkbuMXwiPxSQcpyr3ktn1wc9:
Total Received 343.91998333 BTC
Final Balance 5.55939055 BTC
Thanks, this looks to be a good summary of what I'm not missing :)
In a way every game is a rationality game, because in almost every game you have to discover things, predict things, etc. In another way almost no game is one, because domain-specific strategies and skills win out over general ones.
One idea is based on the claim that general rationality skills matter more when it's a fresh new game that nobody has played yet, since then you have to use your general thinking skills to reason about things in the game and to invent game-specific strategies. So what if there were "mystery game" competitions where th...
You can say that whether it's signaling is determined by the motivations of the person taking the course, or the motivations of the people offering the course, or the motivations of employers hiring graduates of the course. And you can define motivation as the conscious reasons people have in their minds, or as the answer to the question of whether the person would still have taken the course if it was otherwise identical but provided no signaling benefit. And there can be multiple motivations, so you can say that something is signaling if signaling is one...
Clearly, a training course for, say, a truck driver is not signalling, but exactly what it says on the tin.
If there was a glut of trained truck drivers on the market and someone needed to recruit new crane operators, they could choose to recruit only truck drivers because having passed the truck driving course would signal that you can learn to operate heavy machinery reliably, even if nothing you learned in the truck driving course was of any value in operating cranes.
I'm pretty sure he is calling into question the claim that "it was dangerous to question that the existence of God could be proven through reason", which was a very common belief throughout most of the Middle Ages and one held with very little danger, as far as I can tell.
...
This doctrine was supposed (though we don't know if correctly) to hold that although reason dictated truths contrary to faith, people were obliged to believe on faith anyway. It was suppressed.
I agreed with this at first, but actually, no. Belief in the supernatural doesn't require belief in gods, spirits or any non-human agents. You could just believe that humans have some supernatural abilities like reading each other's minds. When trying to explain these abilities, only reductionists will conclude that there's some third-party agent like a simulator setting things up. Non-reductionists will just accept that being able to read minds is part of how this ontologically fundamental mind stuff works.
"applies to itself". Better have only one of that card per deck.
Card:
quined is true
Better have only one per deck.
Actually, because the zombie uploads are capable of all the same reasoning as M_P, they will figure out that they're not conscious and replace themselves with biological humans.
On the other hand, maybe they'll discover that biological humans aren't conscious either, they just say they are for reasons that are causally isomorphic to the reasons for which the uploads initially thought they were conscious, and then they'll set out to find a substrate that really allows for consciousness.
Not polyphasic but

Thanks for the link. I don't really see creepy cult isolation in that discussion, and I think most people wouldn't, but that's just my intuitive judgment.
Really? Links? A lot of stuff here is a bit too culty for my tastes, or just embarrassing, but "cutting family ties with nonrational family members"?? I haven't been following LW closely for a while now so I may have missed it, but that doesn't sound accurate.
Agreed. Of course the thing about means and ends is that you can always frame the situation in two opposing ways:
Way 1: Eating factory farmed meat and not worrying about it in order to better focus on third world donations is the same as making the following means-end tradeoff:
Way 2: Avoiding meat in order to not support factory farming despite the fact that such avoiding causes costs* that lessen the effectiveness of your EA activities is the same as making the following means-end tradeoff:
... (read more)