Comment author: elharo 15 May 2013 10:19:20PM *  28 points [-]

Boring munchkin technique #2: invest in tax advantaged index funds with low fees. Specifically, in the following order:

  1. Max out your employer's matching contribution, if available. It is near impossible to beat an immediate 50% or 100% return, even if you have to borrow money in order to take advantage of this.

  2. Pay off credit card debt. Do not keep any high interest loans. Do not keep a revolving balance on credit cards.

  3. Depending on circumstances (e.g. if you lose your job, is moving back in with your parents an option?) have a few months of living expenses available in ready cash.

  4. Put as much money as you can afford into tax advantaged retirement accounts. In the U.S. that means 401K, 403b, IRA, SEP, etc.

  5. Allocate all your investments except possibly your emergency fund into low cost index funds. 1% fees are way too high. Vanguard has some good funds with fees as low as 0.1%.
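The fee claim in step 5 is easy to check with arithmetic. A minimal sketch, using hypothetical numbers (a $10,000 lump sum, a 7% nominal annual return, a 30-year horizon — illustrative assumptions, not a forecast), of how much a 1% fee costs relative to a 0.1% index-fund fee:

```python
def final_balance(principal, annual_return, annual_fee, years):
    """Compound `principal` for `years`, subtracting the fee from each year's return."""
    balance = principal
    for _ in range(years):
        balance *= 1 + annual_return - annual_fee
    return balance

# Hypothetical inputs: $10,000, 7% return, 30 years.
high_fee = final_balance(10_000, 0.07, 0.01, 30)    # fund charging 1%
low_fee = final_balance(10_000, 0.07, 0.001, 30)    # index fund charging 0.1%

print(f"1% fee:   ${high_fee:,.0f}")
print(f"0.1% fee: ${low_fee:,.0f}")
print(f"Lost to fees: ${low_fee - high_fee:,.0f}")
```

Under these assumptions the 1% fund ends tens of thousands of dollars behind on a $10,000 investment — the fee compounds against you just as the returns compound for you.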

I could say more, but that's the basics. Do that and you'll probably be in the 90th percentile or higher of successful investors. If folks are interested in hearing more, let me know; and I'll whip up a post on rational financial planning. If there's a lot of interest, it might even be worth a sequence.

Comment author: Troshen 03 July 2013 09:54:30PM 1 point [-]

I would also be interested in hearing more about your take on financial planning.

Comment author: Troshen 13 April 2013 06:57:37PM 1 point [-]

I'm not sure if it's better, but here's one that works well. Similar to the phrase, "Physician, heal thyself!" another way to say rationalists should win is to say, "Rationalist, improve thyself!"

If you aren't actually improving yourself and the world around you, then you aren't using the tools of rationality correctly. And it follows that to improve the world around you, you first have to be in a position to do so by doing the same to yourself.

Comment author: Kenoubi 28 February 2013 01:56:34PM 2 points [-]

Yeah, I was gonna say pretty much exactly this. It may be the advice the most likely to lead to general dating success over the long term, but it really doesn't help me deal with my situation right now. (Though I certainly didn't downvote it.)

Comment author: Troshen 06 March 2013 03:26:16PM 0 points [-]

If it seemed like I meant he should ditch her and move on, I apologize.

My main point was basically "what would future me say to past me when I felt that way?" And that most definitely is "Don't worry, it'll work out." Because that's the best advice of all.

It'll either work out positively in which case you'll have a good relationship, or it'll work out negatively in which case you'll have some good memories and some hard lessons learned for next time. And if you can think of those let downs as one more layer of thicker skin to help you not worry about them, you'll be better off.

I know it doesn't seem that way now, but there really is nothing to worry about.

Comment author: Troshen 28 February 2013 12:12:57AM 2 points [-]

From personal experience, the best advice is to date a lot, get hurt a lot, and build up a thick enough skin that you no longer care about the rejections.

Worrying about the rejection will only make rejection more likely.

Act as if you are a confident person; then other people will think you are confident, and you'll become more confident. Of course, also actually try to do things to become more capable, since that improves your confidence as well.

The other ideas here are good techniques too, but what I found is that when I had been burned enough to stop caring about rejection, I suddenly became successful at dating. The main thing that had changed was not worrying about it.

Comment author: Izeinwinter 25 February 2013 10:55:33PM *  0 points [-]

I have given some thought to this specific problem - not just asteroids, but the fact that any spaceship is potentially a weapon, and, as working conditions go, extended isolation does not have the best of records on the mental stability front.

Likely solutions: full automation and one-time-pad-locked command and control. This renders it a weapon as well controlled as nuclear arsenals, except with longer lead times on any strike, so even safer from a MAD perspective (... and no fully private actor ever gets to run them). Or, if full automation is not workable, a good deal of effort expended on maintaining crew sanity: psych/political officers - called something nice, fluffy, and utterly anodyne to make people forget just how much authority they have - backed up with a remote-controlled self-destruct. Again, a one-time-pad com lock. It's not going to be a libertarian free-for-all as industries go; more a case of "Extremely well paid, to make up for the conditions and the sword that will take your head if you crack under the pressure." Good story potential in that, though.

Comment author: Troshen 25 February 2013 11:43:36PM 0 points [-]

I think we're heading off-topic with this one, and I'd like to continue the discussion and focus it on space, not just whether to reveal or keep secrets.

So I started this thread: http://lesswrong.com/r/discussion/lw/gsv/asteroids_and_spaceships_are_kinetic_bombs_and/

Comment author: asparisi 10 February 2013 09:48:35PM 2 points [-]

I find it unlikely that scientific secrecy is never the right answer, just as I find it unlikely that scientific secrecy is always the right answer.

Qualitatively, I'd say it has something to do with the ratio of the expected harm of immediate discovery to the current investment and research in the field. If the expected risks are low, by all means publish, so that any risks that are there will be found. If the risks are high, consider the amount of investment and research in the field. If the investment is high, it is probably better to reveal your research (or parts of it) in the hope of creating a substantive dialogue about risks. If the investment is low, it is less likely that anyone will come up with the same discovery, and so you may want to keep it a secret. This probably also varies by field, with respect to how many competing paradigms are available and how incremental the research is: psychologists work with a lot of different theories of the mind, many of which do not explicitly endorse incremental theorizing, so it is less likely that a particular piece of research will be duplicated. Biologists, by contrast, tend to have larger agreement, and their work tends to be more incremental, making duplication of a particular piece of research more likely.

Honestly, I find cases of alternative pleading such as V_V's post here suspect. It is a great rhetorical tool, but reality isn't such that alternative pleading can actually map onto the state of the world. "X won't work, you shouldn't do X in cases where it does work, and even if you think you should do X, it won't turn out as well" is a good way to persuade a lot of different people, but it can't actually map onto anything.

Comment author: Troshen 25 February 2013 10:49:57PM 0 points [-]

This is a good discussion of the trade-offs that should be considered when deciding to reveal or keep secret new, dangerous technologies.

Comment author: ewbrownv 12 February 2013 10:59:33PM *  7 points [-]

Good insight.

No, even a brief examination of history makes it clear that the lethality of warfare is almost completely determined by the culture and ideology of the people involved. In some wars the victors try to avoid civilian casualties, while in others they kill all the adult males or even wipe out entire populations. Those fatalities dwarf anything produced in the actual fighting, and they can be, and have been, inflicted with bronze-age technology. So anyone interested in making war less lethal would be well advised to focus on spreading tolerant ideologies rather than worrying about weapon technology.

As for the casualty rate of soldiers, that tends to jump up whenever a new type of weapon is introduced and then fall again as tactics change to deal with it. In the long run the dominant factor is again a matter of ideology - an army that tries to minimize casualties can generally do so, while one that sees soldiers as expendable will get them killed in huge numbers regardless of technology.

(BTW, WWI gases are nothing unusual in the crippling-injury department - cannons, guns, explosives and edged weapons all have a tendency to litter the battlefield with crippled victims as well. What changed in the 20th century was that better medical care meant a larger fraction of crippled soldiers survived their injuries to return to civilian life.)

Comment author: Troshen 25 February 2013 10:39:27PM *  0 points [-]

"So anyone interested making war less lethal would be well advised to focus on spreading tolerant ideologies rather than worrying about weapon technology."

This is actually one of the major purposes that Christians have had in doing missionary work - to spread tolerance and reduce violence. I assume it's happened in other religions too. For example, the rules of chivalry in the middle ages were an attempt to moderate the violence and abuses of the warriors.

Comment author: Izeinwinter 12 February 2013 02:08:26PM 6 points [-]

You are missing a major problem. Not "secrecy will kill progress" - that is, in this context, a lesser problem. The major problem is that scientific secrecy would eventually kill the planet.

In a context of ongoing research and use of any discipline, dangerous techniques must be published, or they will be duplicated over and over again, until they cause major damage. If the toxicity of dimethylmercury were a secret, chemical laboratories and entire college campuses dying slowly, horrifically and painfully would be regular occurrences. No scientific work is done without a context, and so all discoveries will happen again. If you do not flag any landmines you spot, someone not-quite-as-sharp will eventually reach the same territory and step on them. If you find a technique you consider a threat to the world, it is now your problem to deal with, and secrecy is never going to be a sufficient response, but is instead merely an abdication of moral responsibility onto the next person to get there.

Comment author: Troshen 25 February 2013 10:16:12PM *  0 points [-]

This is an extremely important point. Historically it might take a long time, if ever, for someone else to arrive at a discovery similar to one you just made. For example, Leonardo's submarines. But that was when only a tiny fraction of humanity devoted time to experiments. His decision to hide his invention kicked the can of secret submarine attacks many years down the road and may have saved many lives. (I'm not so sure - leaders who wanted wars surely found other secret plots and stratagems, but at least he exercised his agency to not be the father of them.)

But things are different now. You can be practically guaranteed that if you are working on something, someone else in the world is working on it too, or will be soon. Being at a certain place and time in your industry puts you in a position to see the possible next steps, and you aren't alone.

If you see something dangerous that others don't, the best bet is to talk about it. More minds thinking and talking about it from multiple different perspectives have the best chance to solve it.

Communication is a great, helpful key to survival. I think we had it when the U.S. and the Soviets didn't annihilate the world, even when U.S. policy was Mutual Assured Destruction. And I think we lacked it in the U.S. Civil War and in WWI, when combat technology had raced ahead of the knowledge and training of the generals of those wars, leading to shocking massacres unintended by either side.

An example other than unfriendly AI is asteroid mining and serious space travel in general. Right now we have the dangers from asteroids. But the ability to controllably move mass in orbit would inevitably become one of the most powerful weapons ever seen. Unless people make a conscious choice not to use it for that. Although I've wanted to write fiction stories about it and work on it, I've actually hesitated for the simple fact that I think it's inevitable that it will become a weapon.

This post makes me confident that the action most likely to lead to humanity's growth and survival is to talk about it openly. First, because we're already vulnerable to asteroids and can't do anything about it. And second, because talking about it raises awareness of the problem so that more people can focus on solving it.

I really think that avoiding nuclear war is an example. When I was a teenager everyone just assumed we'd all die in a nuclear war someday. Eventually, through a deliberate war or an accident or a Skynet-style Terminator incident, civilization as a whole would be gone. And eventually that fear just evaporated. I think it's because we as a culture kept talking about it so much and didn't leave it up to only a few monarchic leaders.

So I'm changing my outlook and plans based on this post and this comment. I plan to talk about and promote asteroid mining, and to write short stories about terrorists dropping asteroids on cities. Talking about it is better in the long run.

Comment author: V_V 08 October 2012 02:59:18PM 1 point [-]

Sure, but the way to practice these skills is to apply them to actual problems, not to mindlessly recite their principles.

Recitation and worship can turn even good rational principles into articles of faith, disconnected from anything else, which you just "believe to believe" rather than actually understand and apply.

Comment author: Troshen 12 October 2012 12:04:40AM *  1 point [-]

V_V and Vaniver both make really good points, but the fact is that the U.S. was not built to be completely rationalist, and people in general are not rationalists.

It's a communal set of rules for a people and a place that's designed to give the members the most freedom while still ensuring stability and order. And it has a really good track record of success in doing that.

I agree that it's not an optimal solution in a future, ideally rationalist world. But it's not a tool for teaching children to think for themselves. It's a tool to get them to follow the social rules. And I'll tell you, children want their own way and DO NOT want to follow rules. And if you let them have their way all the time you WILL spoil them. There's a time to teach rules-following (especially rules that protect liberties and freedoms) and a time to teach mistrust of authority and rules-breaking.

What other device would you propose for a future, ideally rationalist world? I'm not being facetious here. I'm curious. Spawned by the Weirdtopia idea, can you think of a better solution?

I personally think of it as like teaching an apprentice. Apprentices weren't taught the whys. They were taught the hows. As a journeyman and a master you discovered the whys. Kids are apprentice citizens.

Comment author: Epiphany 09 October 2012 03:38:41AM 1 point [-]

One interesting thing to note is that if you're accustomed to pledging your allegiance to something every day as a child, while you're still unable to enter into legal agreements and aren't thinking about them, it may not occur to you that when you go to school on your 18th birthday, you've just pledged your allegiance in a way that... might be legally binding?

Regardless of what sort of government expects its children to pledge allegiance every day, do you agree with the practice of making people pledge allegiance?

Allegiance is kind of vague. It could be interpreted to mean doing normal responsibilities (not being a criminal, paying your taxes) or it might be interpreted to mean total obedience. I'm not sure whether to agree or disagree with the pledge. Maybe I should disagree with it on the grounds that it is too vague and therefore doesn't protect reciters from feeling obligated to obey a tyrant, were one to end up in power.

Comment author: Troshen 11 October 2012 11:46:55PM 3 points [-]

This actually has been a problem with real-life examples. I've read that the oaths in Nazi Germany were sworn to Hitler personally, and that many members of the military felt bound by their oaths to obey orders even when it was clear the orders shouldn't be obeyed. I think the critical danger is in giving oaths to an individual, who has a very real chance of being corrupted by power unless they take action to prevent it.

I see the difference in that the U.S. pledge of allegiance is to the republic and its symbol, the flag. The saving factors that prevent abuses of power are:

The focus on allegiance to the nation as a whole, including all its members, its leaders, and its ideals.

The "with liberty and justice for all" line, which is the guarantee of what the State offers in return. The U.S. has to be worthy of the alliegence.

The extreme other war example is the U.S. Civil War, where many military officers left the army to join the Confederacy. They formed ranks and marched right out of West Point because they opposed the U.S. leadership. And the soldiers who stayed let them go, knowing they were going to help the seceding states fight. Even if they disagreed, they felt the honorable thing to do was to let them go.

This idea shows up specifically in our military training and culture in the definition of lawful orders. Military culture and legal rules define your duty to obey all lawful orders from your chain of command, up to the President - so that if you feel an order is unlawful, it is actually your duty to disobey. Of course, that carries all the weight of being the first to stand in opposition, so it's no guarantee against abuses of power, but it does exist.

I guess my point is that the danger is in making oaths to a person.

I agree that it's a form of indoctrination for children. But as long as the trade of allegiance for freedom it describes is a true and real one, I think it's a good thing to keep those principles in their minds.
