Pfft comments on Voting is like donating thousands of dollars to charity - Less Wrong

Post author: Academian 05 November 2012 01:02AM, 32 points

Comment author: Pfft 05 November 2012 02:59:04AM, 6 points

If one Virginia voter does an expected (1/3.5 million) × $7 trillion = $2 million of good by voting for candidate X, then there is another Virginia voter who does an expected $2 million of damage by voting for candidate Y. It seems that either

  1. Roughly half of the population is misinformed about which alternative is objectively better. In that case, how do I justify a belief that I have a greater than 50% chance of being right, when everyone else has access to the same information?

  2. There are real differences in values, and by my vote I direct the outcome towards my preference instead of the other Virginia voter's. In that case, sure I want to vote, but should we really call it altruism?
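The arithmetic behind Pfft's figure can be checked directly. This is a toy sketch using the numbers from the example above (a decisive-vote probability of 1 in 3.5 million and $7 trillion at stake); the variable names are illustrative, not from the original post.

```python
# Toy check of the expected-value-of-voting figure from the comment above.
# Both inputs are the example's assumptions, not measured data.
p_decisive = 1 / 3.5e6   # chance that one Virginia vote swings the election
stakes = 7e12            # assumed dollar difference between the two outcomes

expected_value = p_decisive * stakes
print(f"${expected_value:,.0f}")  # prints $2,000,000

# Pfft's symmetry point: under the same assumptions, a voter for the
# other candidate has the same expected magnitude with the opposite sign.
expected_damage = -expected_value
```

The symmetry is exact in this model: whatever expected value one side's voter produces, the opposing voter cancels it, which is what motivates the two alternatives above.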

Comment author: Eugine_Nier 05 November 2012 03:33:25AM, 4 points

"Roughly half of the population is misinformed about which alternative is objectively better. In that case, how do I justify a belief that I have a greater than 50% chance of being right, when everyone else has access to the same information?"

Well, you can replace "which alternative is objectively better" with any other belief on which opinions differ and the same argument applies.

Comment author: ricketson 05 November 2012 05:01:19AM, 0 points

"any other belief"

This invites us to look at why beliefs differ. First we have to acknowledge that we are talking about differences between people with comparable levels of expertise, so this isn't the same as the disagreements that exist between experts and novices.

For elections, I think we can say that people disagree in large part because the situation is incredibly complicated. It is hard to know how government policies will affect human welfare, and it is hard to know how elected officials will shape government policy.

The only interesting factor that I can think of is differences in our scope of altruism -- one voter may feel altruistic towards their city, while another focuses on the nation, and a third focuses on all of humanity.

Comment author: handoflixue 06 November 2012 12:42:50AM, 2 points

"First we have to acknowledge that we are talking about differences between people with comparable levels of expertise"

The assertion that the vast majority of voters have done a sizeable amount of research, rather than simply voting "along party lines" or "like mom always did" or "because dad was overcontrolling and I'm not going to support HIS party" strikes me as the sort of assertion that would require quite a lot of evidence.

One can reasonably conclude that in politics, as with math, the "average person" is ignorant and their opinion is not based on any sort of expertise.

Comment author: ricketson 14 February 2013 06:08:55AM, 0 points

"One can reasonably conclude that in politics, as with math, the "average person" is ignorant and their opinion is not based on any sort of expertise."

Even if you limit the population to those who are well informed, that population is still rather evenly split and so his points still hold.

Comment author: handoflixue 14 February 2013 09:31:12PM, -1 points

"Even if you limit the population to those who are well informed, that population is still rather evenly split"

On some issues, probably. On others, you have the well-informed, educated, cares-about-facts types versus the religious fanatics who want to push their religious agenda, or their personal agenda, or support pork-barrel funding of pet projects, or want to waste extravagant amounts on feel-good charity that accomplishes nothing in the end.

I don't think either political party in the US has a monopoly on educated - it's easier for me to demonize and strawman Republicans since I was raised Democratic. Apologies if my examples thus seem biased in that direction.

So, yes, sometimes, it's clear my opponent has a genuine, reasoned stance. Sometimes, it's equally clear that they don't. It's important to be aware that sometimes the opposing side doesn't have any rational objections because they're wrong.

Comment author: Decius 06 November 2012 05:45:50AM, 2 points

How are you measuring 'objectively better'?

Roughly half the population is paperclip maximizers.

Comment author: Pfft 06 November 2012 06:04:41AM, 2 points

That situation is alternative 2.

Comment author: Manfred 05 November 2012 10:44:38AM, 7 points

"Roughly half of the population is misinformed about which alternative is objectively better. In that case, how do I justify a belief that I have a greater than 50% chance of being right, when everyone else has access to the same information?"

Non-meta calculations, like usual. If someone else thinks the indefinite integral of x^2 is 3x^3, I don't say "well, if we have the same information, I must have a 50% chance of being wrong." Instead, I check the result using boring, ordinary math, and go "nope, looks like it's x^3 / 3."
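Manfred's "boring, ordinary math" check can itself be mechanized: differentiate each candidate antiderivative and see which one gives back x^2. A minimal numeric sketch (the helper function and sample point are illustrative assumptions):

```python
# Check two candidate antiderivatives of f(x) = x**2 by differentiating
# them numerically and comparing against f at a sample point.
def derivative(g, x, h=1e-6):
    """Central-difference approximation of g'(x)."""
    return (g(x + h) - g(x - h)) / (2 * h)

f = lambda x: x**2
claimed = lambda x: 3 * x**3   # the mistaken answer from the example
correct = lambda x: x**3 / 3   # the checked answer

x = 2.0
print(derivative(claimed, x), f(x))  # ~36.0 vs 4.0 -> doesn't match
print(derivative(correct, x), f(x))  # ~4.0 vs 4.0 -> matches
```

The point of the analogy is that the check is object-level: it settles the disagreement without any reasoning about who else believes what.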

"should we really call it altruism?"

Yes.

Comment author: fortyeridania 05 November 2012 12:19:37PM, 2 points

I agree with your approach to solving disagreements about integrals. I do not see how it applies to politics, where disagreements are far more diverse, including factual, moral, and unconscious conflicts.

Comment author: Manfred 05 November 2012 03:00:58PM, 2 points

Well, people do differ in values, but it seems like more often some people are just wrong. Viz: global warming as a factual disagreement.

So what do you do if half the population disagrees with you about a factual issue? (Copy and paste time!) I don't say 'well, if we have the same information, I must have a 50% chance of being wrong.' Instead, I check the result using boring, ordinary scholarship, and go 'nope, looks like there's a mechanism for CO2 to cause the atmosphere to warm up.'

Note that a key part of this process is that if you're wrong, you should notice sometimes - there's no "checking" otherwise, just pretend-checking. So that's a good skill to work on.

Comment author: fortyeridania 05 November 2012 03:19:55PM, 2 points

That some people are "just wrong" is not at issue. Even mistaken people agree that some people are wrong. (They just think it's the right-thinking folks who are in error.)

"I don't say 'well, if we have the same information, I must have a 50% chance of being wrong.'"

Of course you don't. If half the population disagrees with you about an issue, you should interpret that as evidence that you are incorrect. How strong that evidence is depends on how likely they are to possess information you don't, to be misled by things you've prepared yourself for, and so on.
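The point about evidence strength can be made concrete with Bayes' rule. This is a toy sketch; every number in it is an illustrative assumption, not a claim from the thread.

```python
# Toy Bayes update: how much should mass disagreement shift your belief?
def posterior(prior, p_disagree_if_right, p_disagree_if_wrong):
    """P(you're right | half the population disagrees), via Bayes' rule."""
    num = prior * p_disagree_if_right
    den = num + (1 - prior) * p_disagree_if_wrong
    return num / den

# If mass disagreement is almost as likely when you're right as when
# you're wrong (e.g. a polarized value conflict), the update is small:
print(posterior(0.9, 0.5, 0.6))   # ~0.88, barely moved from 0.9

# If people would rarely disagree with a correct view, it's strong evidence:
print(posterior(0.9, 0.1, 0.9))   # 0.5, a large downward update
```

This matches the comment's point: the disagreement itself is evidence, but its weight depends on how the disagreement would arise under each hypothesis.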

Comment author: Manfred 05 November 2012 03:57:22PM, 1 point

Agreed. I guess what makes checking the math work in the integral case is just that the better you are at checking the arguments, the less you have to worry about what other people think.

Comment author: khafra 05 November 2012 05:47:28PM, 1 point

In other words, people who are convinced by this argument are more likely than the average person to be correct about the objectively better candidate it convinces them to vote for?

Comment author: wedrifid 05 November 2012 03:25:44AM, 3 points

"Roughly half of the population is misinformed about which alternative is objectively better. In that case, how do I justify a belief that I have a greater than 50% chance of being right, when everyone else has access to the same information?"

"Voting is irrational unless you are arrogant?"

"There are real differences in values, and by my vote I direct the outcome towards my preference instead of the other Virginia voter's. In that case, sure I want to vote, but should we really call it altruism?"

You can still call it altruism, and it can be helpful to distinguish "selfishness" in the sense usually considered for decision problems from "altruism". The example I like to propose for illustration is the Codependent Prisoner's Dilemma, in which the prisoners are Romeo and Juliet, each obsessed with the other's wellbeing, a fact the jailers exploit when manipulating them. So when Romeo is "selfishly maximising his own preferences" and picks the option that puts him away for 10 years but lets Juliet go free, he is also being "altruistic" towards Juliet while brutally ignoring her preference that she be the one who gets to be the martyr.