wedrifid comments on Our Phyg Is Not Exclusive Enough - Less Wrong

25 [deleted] 14 April 2012 09:08PM




Comment author: Vaniver 15 April 2012 06:24:05AM 1 point [-]

Beliefs are only sometimes about anticipation. LessWrong repeatedly makes huge errors when they interpret "belief" in such a naive fashion.

Can you give examples of beliefs that aren't about anticipation?

Comment author: wedrifid 15 April 2012 06:59:54AM *  8 points [-]

Can you give examples of beliefs that aren't about anticipation?

Beliefs about things that are outside our future light cone possibly qualify, to the extent that the beliefs don't relate to things that leave historical footprints. If you'll pardon an extreme and trite case, I would have a belief that the guy who flew the relativistic rocket out of my light cone did not cease to exist as he passed out of that cone and also did not get eaten by a giant space monster ten minutes after. My anticipations are not constrained by beliefs about either of those possibilities.

In both cases my inability to constrain my anticipated experiences speaks to my limited ability to experience, not a limitation of the universe. The same principles of 'belief' apply even though the subject has incidentally fallen out of the scope of what I am able to influence or verify even in principle.
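The causal structure wedrifid is appealing to can be stated precisely: an event can affect an observer only if it lies inside the observer's future light cone, i.e. the light-travel time covers the spatial separation. A minimal sketch of that check, in natural units (c = 1) with hypothetical one-dimensional coordinates:

```python
# Special-relativity check: event B lies in event A's future light cone
# iff B is no earlier than A and light could cross the spatial gap in
# the elapsed time: dt >= |dx| (units chosen so that c = 1).

def in_future_light_cone(a, b):
    """a, b are (t, x) event coordinates.

    Returns True if b is inside (or on) a's future light cone,
    i.e. something at a could causally influence b.
    """
    ta, xa = a
    tb, xb = b
    dt = tb - ta
    dx = abs(xb - xa)
    return dt >= 0 and dt >= dx

# Illustrative events relative to "here and now" at (0, 0):
here_now = (0.0, 0.0)
print(in_future_light_cone(here_now, (10.0, 5.0)))   # timelike-separated: reachable
print(in_future_light_cone(here_now, (10.0, 50.0)))  # spacelike-separated: unreachable
```

Once every remaining event on the rocket passenger's worldline fails this test relative to you, nothing that happens to him can constrain any anticipation of yours, which is exactly the situation the comment describes.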

Comment author: Will_Newsome 15 April 2012 10:47:32AM *  5 points [-]

Beliefs that aren't easily testable also tend to be the kind of beliefs that have a lot of political associations, and thus tend not to act like beliefs as such so much as policies. Also, even falsified beliefs tend to be summarily replaced with new untested/not-intended-to-be-tested beliefs, e.g. "communism is good" with "correctly implemented communism is good", or "whites and blacks have equal average IQ" with "whites and blacks would have equal average IQ if they'd had the same cultural privileges/disadvantages". (Apologies for the necessary political examples. Please don't use this as an opportunity to talk about communism or race.)

Many "beliefs" that aren't politically relevant—which excludes most scientific "knowledge" and much knowledge of yourself, the people you know, what you want to do with your life, et cetera—are better characterized as knowledge, and not beliefs as such. The answers to questions like "do I have one hand, two hands, or three hands?" or "how do I get back to my house from my workplace?" aren't generally beliefs so much as knowledge, and in my opinion "knowledge" is not only epistemologically but cognitively-neurologically a more accurate description, though I don't really know enough about memory encoding to back up that claim (the difference is, however, introspectively apparent). Either way, I still think that given our knowledge of the non-fundamental-ness of Bayes, we shouldn't try too hard to stretch Bayes-ness to fit decision problems or cognitive algorithms that Bayes wasn't meant to describe or solve, even if it's technically possible to do so.

Comment author: Eugine_Nier 15 April 2012 08:04:42PM 1 point [-]

Also, even falsified beliefs tend to be summarily replaced with new untested/not-intended-to-be-tested beliefs, e.g. "communism is good" with "correctly implemented communism is good", or "whites and blacks have equal average IQ" with "whites and blacks would have equal average IQ if they'd had the same cultural privileges/disadvantages".

I believe the common term for that mistake is "no true Scotsman".

Comment author: Vaniver 15 April 2012 05:06:17PM *  1 point [-]

Beliefs about things that are outside our future light cone possibly qualify, to the extent that the beliefs don't relate to things that leave historical footprints. If you'll pardon an extreme and trite case, I would have a belief that the guy who flew the relativistic rocket out of my light cone did not cease to exist as he passed out of that cone and also did not get eaten by a giant space monster ten minutes after. My anticipations are not constrained by beliefs about either of those possibilities.

What do we lose by saying that doesn't count as a belief? Some consistency when we describe how our minds manipulate anticipations (because we don't separate out ones we can measure and ones we can't, but reality does separate those, and our terminology fits reality)? Something else?

Comment author: Eugine_Nier 15 April 2012 08:06:24PM 2 points [-]

So if someone you care about is leaving your future light cone, you wouldn't care if he got horribly tortured as soon as he's outside of it?

Comment author: Vaniver 15 April 2012 08:14:24PM 1 point [-]

I'm not clear on the relevance of caring to beliefs. I would prefer that those I care about not be tortured, but once they're out of my future light cone whatever happens to them is a sunk cost; I don't see what I (or they) get from my preferring or believing things about them.

Comment author: Eugine_Nier 15 April 2012 09:17:13PM 1 point [-]

Yes, but you can affect what happens to them before they leave.

Comment author: Vaniver 15 April 2012 09:37:23PM *  1 point [-]

Before they leave, their torture would be in my future light cone, right?

Comment author: Eugine_Nier 15 April 2012 10:56:26PM 1 point [-]

Oops, I just realized that in my hypothetical scenario, by "someone being tortured outside your light cone" I meant someone being tortured somewhere your two future light cones don't intersect.

Comment author: Vaniver 16 April 2012 01:12:09AM 1 point [-]

Indeed; being outside of my future light cone just means whatever I do has no impact on them. But now not only can I not impact them, but they're also dead to me (as they, or any information they emit, won't exist in my future). I still don't see what impact caring about them has.

Comment author: Eugine_Nier 16 April 2012 02:09:10AM 1 point [-]

Ok, my scenario involves your actions having an effect on them before your two light cones become disjoint.