The Case for Free Will, or Why LessWrong Must Commit to Self-Determination

-18 Troshen 07 April 2014 12:07PM

 

This is intended to eventually be a Main post and part of sequences on free will and religion; specifically, it will belong to the Free Will sequence.

Please comment on whether or not you think this post is ready for Main.  I intend to move it there eventually.  As with any post at LessWrong, I'm completely open to criticism, but I hope it's directed at improving the quality of the thinking here rather than knee-jerk opposition to my ideas.

------------------------------------------------------

 

The main point of this post is that I intend to convince every rationalist here, and every casual reader, to commit to allowing others to have free will.

First, a bit of background.  I'm a conservative Christian.  Growing up I considered myself a rationalist.  Now that I've known about Less Wrong for several years and have read the sequences, I no longer think I can classify myself that way <grin>.  Nowadays I usually consider myself a pragmatist.  "Being a rationalist" now carries a significant weight in my mind of formal Bayes' Theorem and the like, which I've never had time to fully follow through on and practice.  I also have a little fear that completely committing to being Bayesian would eventually put a huge conflict between my faith and Bayesian reasoning - just a little fear.  I've been reading Less Wrong for years now, and so far those conflicts have all been resolved to my satisfaction.  I also haven't committed simply because the math that gets thrown around here in Bayes' Theorem discussions seems like it would take too much time for me to understand, and I'm already very busy (and, being an engineer and not a math major, a bit intimidated).

The main reason I come here is that this community thinks about thinking, which so few people around me do.  I crave the introspection that happens here, and so I'm drawn back to it - not all that often, but enough to generally stay abreast of what's going on.  (I also have to admit to myself that I come back because you people are very smart, and I want you to think of me as smart too, and have your approval, but I try to keep that in check <grin>)

Now that I've been here (online only - no meetups yet) and learned with you over the years, another reason I stay is the clear success of Evolutionary Psychology in predicting human behavior.  The clearest example I've ever experienced is this:

My children and I love to chase each other around the house.  It drives my wife crazy, especially when it happens right at bedtime.  At some point after I read about evolutionary psychology, this chain of logic dawned on me: natural behaviors that are successful get genetically reinforced over generations -> behaviors that were successful come to feel naturally joyful -> you pass those behaviors on to your children through play, the way lions play-hunt with their cubs -> human parents and children get true joy from chasing each other because their ancestors loved the hunt and were successful at it!

Now THAT was an eye-opener!  It was the answer to a question I'd never known I had, which was this: why do children love to chase, and why do I love to chase them?  Because their ancestors survived that way, and it was passed to them genetically.  I even like to playfully almost-catch-them-and-let-them-escape.  I playfully let them catch me, too.  And we love it.

Religion has no answer to this question.  Religion doesn't even know how to ask this question.  But it flowed naturally out of Evolutionary Psychology just from my knowing that the concept existed!  Powerful!  Now, this post isn't really about religion, so I won't go into why that doesn't break my faith; I'll handle that in other posts.  The reason I'm talking about it now is to get you to recognize that you are a tribal hunter by ancestry, even more fundamentally than you are the descendant of conquerors.  And knowing that Politics Is The Mind Killer, you'll listen to this next part and take it seriously.

Less Wrong rationalists are growing, and being recognized by the religious community - as militant atheists.  It's reported that this is a new thing among atheists, this new desire to spread atheist philosophies as strongly as any religion spreads its beliefs.  I've seen it in a couple of places now, in about the last year.

I have a huge, scary concern for the future of our world.  It's not atheism.  And it's not religion.  I fear future wars.  As a military history enthusiast and a veteran, I've learned a lot about war.  A lot.  And the principle is true that those who don't learn from history are doomed to repeat it.  Knowing that we are tribal animals, I see atheists as one tribe and religionists as another.  Now that I see the growth and success of LW, I see a future pattern emerging in the United States:

Few atheists among overwhelming numbers of Christians -> shrinking Christianity, growing atheism -> atheist tribalism growing well-connected and strong -> natural tribal impulse to not tolerate different voices -> war between atheists and Christians.

Don't try to say this won't happen, and that rationalists will always allow other people to believe differently.  Coherent Extrapolated Volition, Politics is the Mind Killer, and Eliezer's success in creating the LW and rationalist movement say otherwise.  Now, today, the commitment to altruism seems like a solution, but it isn't.  You all here are so very intelligent, and you seriously look down on those of faith.  I see it all over the place.  It's a real blind spot that you can't see because it's inside your mental algorithms.  Altruism is very easily perverted into forcing other people to do things because you know what is best for them.  It's not enough by itself.  It needs something else attached.

Someday there will come a time when new leaders come up through the rationalist movement who don't have Eliezer's commitment to freedom.  And power corrupts even good, compassionate people.  So now I come to my request.

This principle needs to be added to the rationalist movement: a guarantee of free will for others who disagree with you, EVEN IF THEY ARE WRONG.

I know religions have not always had this either.  Be better than the religions you despise.  Recognize that they, too, are tribal animals trying to become civilized tribal animals.

I ask you personally to commit to making free will for all a part of your personal philosophy.  And I ask you to formalize that as part of Less Wrong, the rationalist community, and your evangelical atheism.  Plant the seed now so that it has time to grow.  It is my fear that if you don't, your children's children, and my children's children, will know a brutal war of philosophies unlike any we have ever seen.

 

In a future post I'll cover how religions are the empirically determined solutions to problems that prevented civilization from arising, and how rationalism is the modern, more deliberately planned version.  And why religion is not evil like you think it is.

 

Sincerely,

Troshen

Comment author: elharo 15 May 2013 10:19:20PM *  28 points [-]

Boring munchkin technique #2: invest in tax advantaged index funds with low fees. Specifically, in the following order:

  1. Max out your employer's matching contribution, if available. It is near impossible to beat an immediate 50% or 100% return, even if you have to borrow money in order to take advantage of this.

  2. Pay off credit card debt. Do not keep any high interest loans. Do not keep a revolving balance on credit cards.

  3. Depending on circumstances (e.g., if you lose your job, is moving back in with your parents an option?), have a few months of living expenses available in ready cash.

  4. Put as much money as you can afford into tax advantaged retirement accounts. In the U.S. that means 401K, 403b, IRA, SEP, etc.

  5. Allocate all your investments, except possibly your emergency fund, to low-cost index funds. 1% fees are way too high. Vanguard has some good funds with fees as low as 0.1%.
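
To make the fee point in item 5 concrete, here is a minimal sketch of how expense ratios compound. The 7% gross return, 30-year horizon, and $10,000 starting balance are illustrative assumptions, not numbers from the comment itself:

```python
# Illustrative fee-drag comparison: same gross return, different expense ratios.
# All inputs (7% gross return, 30 years, $10,000 start) are assumptions for illustration.

def final_balance(principal: float, gross_return: float, fee: float, years: int) -> float:
    """Grow `principal` for `years`, deducting the expense ratio from each year's return."""
    return principal * (1 + gross_return - fee) ** years

start = 10_000
low_fee = final_balance(start, 0.07, 0.001, 30)    # 0.1% expense ratio
high_fee = final_balance(start, 0.07, 0.01, 30)    # 1.0% expense ratio

print(f"0.1% fee fund after 30 years: ${low_fee:,.0f}")    # roughly $74,000
print(f"1.0% fee fund after 30 years: ${high_fee:,.0f}")   # roughly $57,000
print(f"Cost of the extra 0.9% in fees: ${low_fee - high_fee:,.0f}")
```

Under those assumptions the 1% fund ends up more than 20% smaller than the 0.1% fund after 30 years, which is the sense in which 1% fees are way too high.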

I could say more, but that's the basics. Do that and you'll probably be in the 90th percentile or higher of successful investors. If folks are interested in hearing more, let me know; and I'll whip up a post on rational financial planning. If there's a lot of interest, it might even be worth a sequence.

Comment author: Troshen 03 July 2013 09:54:30PM 1 point [-]

I would also be interested in hearing more about your take on financial planning.

Comment author: Troshen 13 April 2013 06:57:37PM 1 point [-]

I'm not sure if it's better, but here's one that works well. Similar to the phrase, "Physician, heal thyself!" another way to say rationalists should win is to say, "Rationalist, improve thyself!"

If you aren't actually improving yourself and the world around you, then you aren't using the tools of rationality correctly. And it follows that to improve the world around you, you first have to be in a position to do so by doing the same to yourself.

Comment author: Kenoubi 28 February 2013 01:56:34PM 2 points [-]

Yeah, I was gonna say pretty much exactly this. It may be the advice most likely to lead to general dating success over the long term, but it really doesn't help me deal with my situation right now. (Though I certainly didn't downvote it.)

Comment author: Troshen 06 March 2013 03:26:16PM 0 points [-]

If it seemed like I meant he should ditch her and move on, I apologize.

My main point was basically "what would future me say to past me when I felt that way?" And that most definitely is "Don't worry, it'll work out." Because that's the best advice of all.

It'll either work out positively, in which case you'll have a good relationship, or it'll work out negatively, in which case you'll have some good memories and some hard lessons learned for next time. And if you can think of those letdowns as one more layer of thicker skin to help you not worry about them, you'll be better off.

I know it doesn't seem that way now, but there really is nothing to worry about.

Comment author: Troshen 28 February 2013 12:12:57AM 2 points [-]

From personal experience the best advice is to date a lot and get hurt a lot and build up a thick enough skin to where you don't care anymore about the rejections.

Worrying about the rejection will only make rejection more likely.

Act as if you are a confident person; then other people will think you are confident, and you'll become more confident. While, of course, actually trying to do things to become more capable too, since that improves your confidence as well.

The other ideas here are good techniques too, but what I found is that the point when I had been burned enough to stop caring about rejection was when I suddenly became successful at dating. The main thing that had changed was not worrying about it.

Comment author: Izeinwinter 25 February 2013 10:55:33PM *  0 points [-]

I have given some thought to this specific problem - not just asteroids, but the fact that any spaceship is potentially a weapon, and, as working conditions go, extended isolation does not have the best of records on the mental stability front.

Likely solutions: Full automation and one-time-pad locked command and control - this renders it a weapon as well controlled as nuclear arsenals, except with longer lead times on any strike, so even safer from a MAD perspective (... and no fully private actor ever gets to run them). Or, if full automation is not workable, a good deal of effort expended on maintaining crew sanity - psych/political officers, called something nice, fluffy, and utterly anodyne to make people forget just how much authority they have, backed up with a remote-controlled self-destruct. Again, a one-time-pad com lock. It's not going to be a libertarian free-for-all as industries go, more a case of "Extremely well paid, to make up for the conditions and the sword that will take your head if you crack under the pressure." Good story potential in that, though.
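
For readers unfamiliar with the term, here is a minimal sketch of what a "one-time-pad locked" command channel means in practice. Everything here (the function names, the pad size, the example command) is a hypothetical illustration of the general idea, not a detail of the proposal above:

```python
import os

# Pre-shared random pad, loaded onto the ship before launch.  Each byte of the pad
# is used exactly once, which is what makes the scheme unbreakable by computation
# as long as the pad itself stays secret.
pad = os.urandom(1024)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

def issue_command(plaintext: bytes, offset: int) -> bytes:
    """Ground control encrypts with the next unused slice of the pad."""
    key = pad[offset:offset + len(plaintext)]
    return xor_bytes(plaintext, key)

def accept_command(ciphertext: bytes, offset: int) -> bytes:
    """The ship decrypts with the same slice; a sender without the pad produces garbage."""
    key = pad[offset:offset + len(ciphertext)]
    return xor_bytes(ciphertext, key)

# Both sides track how much of the pad has been consumed (offset 0 for the first command).
msg = issue_command(b"ADJUST ORBIT +0.3 M/S", offset=0)
print(accept_command(msg, offset=0))   # b'ADJUST ORBIT +0.3 M/S'
```

The relevant property is that a true one-time pad cannot be broken by computation, so an attacker would have to steal the pad or coerce the people holding it rather than crack the cipher.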

Comment author: Troshen 25 February 2013 11:43:36PM 0 points [-]

I think we're heading off-topic with this one, and I'd like to continue the discussion and focus it on space, not just whether to reveal or keep secrets.

So I started this thread: http://lesswrong.com/r/discussion/lw/gsv/asteroids_and_spaceships_are_kinetic_bombs_and/

Asteroids and spaceships are kinetic bombs and how to prevent catastrophe

6 Troshen 25 February 2013 11:33PM

A reality of physics, and one that doesn't get much play in science fiction, is that as soon as humanity gains space travel, anyone in the asteroid mining or space travel business will have city-busting capabilities at their fingertips.
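
To put a rough number on "city-busting," here is a back-of-the-envelope sketch. The rock size (20 m across), density, and impact speed are assumptions chosen for illustration, and the TNT comparison uses the standard 4.184e12 J per kiloton:

```python
import math

# Illustrative assumptions: a 20 m diameter rocky asteroid arriving at 20 km/s.
radius_m = 10.0
density_kg_m3 = 3000.0
speed_m_s = 20_000.0

mass_kg = density_kg_m3 * (4 / 3) * math.pi * radius_m ** 3
kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2

KILOTON_TNT_J = 4.184e12
print(f"Mass: {mass_kg:.2e} kg")
print(f"Energy: {kinetic_energy_j:.2e} J ~= {kinetic_energy_j / KILOTON_TNT_J:.0f} kt of TNT")
```

Under those assumptions, a single 20-meter rock arrives with on the order of 600 kilotons of TNT equivalent - tens of times the Hiroshima bomb - which is why routine mass-moving capability in orbit doubles as a weapon.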

It's there in classic sci-fi, but not so much recently.

This discussion was started in the comments to:

http://lesswrong.com/lw/gln/a_brief_history_of_ethically_concerned_scientists/

 

In the "Ethically Concerned Scientists" post, Izeinwinter commented:

 

I have given some thought to this specific problem - not just asteroids, but the fact that any spaceship is potentially a weapon, and, as working conditions go, extended isolation does not have the best of records on the mental stability front.

Likely solutions: Full automation and one-time-pad locked command and control - this renders it a weapon as well controlled as nuclear arsenals, except with longer lead times on any strike, so even safer from a MAD perspective (... and no fully private actor ever gets to run them). Or, if full automation is not workable, a good deal of effort expended on maintaining crew sanity - psych/political officers, called something nice, fluffy, and utterly anodyne to make people forget just how much authority they have, backed up with a remote-controlled self-destruct. Again, a one-time-pad com lock. It's not going to be a libertarian free-for-all as industries go, more a case of "Extremely well paid, to make up for the conditions and the sword that will take your head if you crack under the pressure." Good story potential in that, though.

 

A great start to a discussion here. 

You've considered people going loony and some general security, but it would then become a hacker war over who could break the security and gain control of the spaceships.

It doesn't address the problem of the leaders using the ships as threat weapons, since they have legitimate control, but can still make terrorist decisions.

And I'm terrified of your idea of turning spaceflight, which I see as the ultimate freedom, along the lines of Niven's Belters, into a state-controlled affair like the Soviet navy with political officers.

Now, one thing I think is a useful safety control that doesn't lead to worse problems is the destruct option.  All major rockets have them right now, since a rocket that goes out of control is a huge hazard for a great distance.  And although I don't like the idea of all personal spaceships being under a safety officer's thumb, it might be better than the alternative of terrorist groups gaining control of asteroid mines and holding the world hostage.

 

You're right about great story potential though, in any of these scenarios.

 

Comment author: asparisi 10 February 2013 09:48:35PM 2 points [-]

I find it unlikely that scientific secrecy is never the right answer, just as I find it unlikely that scientific secrecy is always the right answer.

Qualitatively, I'd say it has something to do with the ratio of expected harm of immediate discovery vs. the current investment and research in the field. If the expected risks are low, by all means publish so that any risks that are there will be found. If the risks are high, consider the amount of investment/research in the field. If the investment is high, it is probably better to reveal your research (or parts of it) in the hope of creating a substantive dialogue about risks. If the investment is low, it is less likely that anyone will come up with the same discovery and so you may want to keep it a secret. This probably also varies by field with respect to how many competing paradigms are available and how incremental the research is: psychologists work with a lot of different theories of the mind, many of which do not explicitly endorse incremental theorizing, so it is less likely that a particular piece of research will be duplicated while biologists tend to have larger agreement and their work tends to be more incremental, making it more likely that a particular piece of research will be duplicated.

Honestly, I find cases of alternative pleading such as V_V's post here suspect. It is a great rhetorical tool, but reality isn't such that alternative pleading actually can map onto the state of the world. "X won't work, you shouldn't do X in cases where it does work, and even if you think you should do X, it won't turn out as well" is a good way to persuade a lot of different people, but it can't actually map onto anything.

Comment author: Troshen 25 February 2013 10:49:57PM 0 points [-]

This is a good discussion of the trade-offs that should be considered when deciding to reveal or keep secret new, dangerous technologies.

Comment author: ewbrownv 12 February 2013 10:59:33PM *  7 points [-]

Good insight.

No, even a brief examination of history makes it clear that the lethality of warfare is almost completely determined by the culture and ideology of the people involved. In some wars the victors try to avoid civilian casualties, while in others they kill all the adult males or even wipe out entire populations. Those fatalities dwarf anything produced in the actual fighting, and they can be and have been inflicted with bronze age technology. So anyone interested in making war less lethal would be well advised to focus on spreading tolerant ideologies rather than worrying about weapon technology.

As for the casualty rate of soldiers, that tends to jump up whenever a new type of weapon is introduced and then fall again as tactics change to deal with it. In the long run the dominant factor is again a matter of ideology - an army that tries to minimize casualties can generally do so, while one that sees soldiers as expendable will get them killed in huge numbers regardless of technology.

(BTW, WWI gases are nothing unusual in the crippling injury department - cannons, guns, explosives and edged weapons all have a tendency to litter the battlefield with crippled victims as well. What changed in the 20th century was that better medical care meant a larger fraction of crippled soldiers survived their injuries to return to civilian life.)

Comment author: Troshen 25 February 2013 10:39:27PM *  0 points [-]

"So anyone interested making war less lethal would be well advised to focus on spreading tolerant ideologies rather than worrying about weapon technology."

This is actually one of the major purposes that Christians have had in doing missionary work - to spread tolerance and reduce violence. I assume it's happened in other religions too. For example, the rules of chivalry in the middle ages were an attempt to moderate the violence and abuses of the warriors.

Comment author: Izeinwinter 12 February 2013 02:08:26PM 6 points [-]

You are missing a major problem. Not "secrecy will kill progress"; that is, in this context, a lesser problem. The major problem is that scientific secrecy would eventually kill the planet.

In a context of ongoing research and use of any discipline, dangerous techniques must be published, or they will be duplicated over and over again until they cause major damage. If the toxicity of dimethylmercury were a secret, chemical laboratories and entire college campuses dying slowly, horrifically and painfully would be regular occurrences. No scientific work is done without a context, and so all discoveries will happen again. If you do not flag any landmines you spot, someone not-quite-as-sharp will eventually reach the same territory and step on them. If you find a technique you consider a threat to the world, it is now your problem to deal with, and secrecy is never going to be a sufficient response; it is instead merely an abdication of moral responsibility onto the next person to get there.

Comment author: Troshen 25 February 2013 10:16:12PM *  0 points [-]

This is an extremely important point. Historically it might take a long time, if ever, for someone else to come to a similar discovery that you just made. For example, Leonardo's submarines. But that was when only a tiny fraction of humanity devoted time to experiments. His decision to hide his invention kicked the can of secret attacks by submarines many years down the road and may have saved many lives. (I'm not so sure - leaders who wanted wars surely found other secret plots and stratagems, but at least he exercised his agency to not be the father of them.)

But things are different now. You can be practically guaranteed that if you are working on something, someone else in the world is working on it too, or will be soon. Being at a certain place and time in your industry puts you in a position to see the possible next steps, and you aren't alone.

If you see something dangerous that others don't, the best bet is to talk about it. More minds thinking and talking about it from multiple different perspectives have the best chance to solve it.

Communication is a great, helpful key to survival. I think we had it when the U.S. and the Soviets didn't annihilate the world under the policy of Mutual Assured Destruction. And I think we didn't have it in the U.S. Civil War and in WWI, when combat technology had raced ahead of the knowledge and training of the generals of those wars, and that led to shocking massacres unintended by either side.

An example other than unfriendly AI is asteroid mining and serious space travel in general. Right now we have the dangers from asteroids. But the ability to controllably move mass in orbit would inevitably become one of the most powerful weapons ever seen. Unless people make a conscious choice not to use it for that. Although I've wanted to write fiction stories about it and work on it, I've actually hesitated for the simple fact that I think it's inevitable that it will become a weapon.

This post makes me confident that the action most likely to lead to humanity's growth and survival is to talk about it openly. First, because we're already vulnerable to asteroids and can't do anything about it. And second, because talking about it raises awareness of the problem so that more people can focus on solving it.

I really think that avoiding nuclear war is an example. When I was a teenager everyone just assumed we'd all die in a nuclear war someday - that eventually, through a deliberate war or an accident or a Skynet-style Terminator incident, civilization as a whole would be gone. And eventually that fear just evaporated. I think it's because we as a culture kept talking about it so much and not leaving it up to only a few monarchic leaders.

So I'm changing my outlook and plans based on this post and this comment. I plan to talk about and promote asteroid mining, and to write short stories about terrorists dropping asteroids on cities. Talking about it is better in the long run.
