Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: ChristianKl 14 February 2014 11:59:03AM 2 points [-]

I guess this is because the ethics seem obvious to me: of course we should prevent people from developing a "supervirus" or whatever, just as we try to prevent people from developing nuclear arms or chemical weapons.

Of course the ethics are obvious. The road to hell is paved with good intentions. 200 years ago burning all those fossil fuels to power steam engines sounded like a really great idea.

If you simply try to solve problems created by people adopting technology by throwing more technology at them, that's dangerous.

The wise way is to understand the problem you are facing and make specific interventions that you believe will help. CFAR-style rationality training might sound less impressive than changing people's neurology, but it might be an approach with far fewer ugly side effects.

CFAR-style rationality training might seem less technological to you. That's actually a good thing, because it makes the effects easier to understand.

The fact that solar itself is getting less expensive is great, but unfortunately the changeover from fossil fuels to solar (e.g. phasing out old power plants and building brand new ones) is still incredibly expensive.

It depends on what issue you want to address. Given how things are going, technology evolves in a way where I don't think we have to fear that we will have no energy when coal runs out. There's plenty of coal around, and green energy evolves fast enough for that task.

On the other hand, we don't want to burn that coal. I want to eat tuna that's not full of mercury, and there's already a recommendation from the European Food Safety Authority against eating tuna every day because there's so much mercury in it. I want fewer people getting killed by fossil fuel emissions. I also want less greenhouse gas in the atmosphere.

is still incredibly expensive.

If you want to do policy that pays off in 50 years looking at how things are at the moment narrows your field of vision too much.

If solar continues its price development and costs 1/8 as much in 21 years, you won't need government subsidies to get people to prefer solar over coal. With another 30 years of deployment we might not burn any coal in 50 years.
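The 1/8 figure is just repeated halving: 21 years at one halving every 7 years is three halvings. A quick sketch of that arithmetic (the halving period comes from the thread; the function name and the starting price of 1.0 are placeholders of mine, since only the relative factor matters):

```python
def price_after(years, halving_period=7.0, initial_price=1.0):
    """Price after `years`, assuming cost halves every `halving_period` years."""
    return initial_price * 0.5 ** (years / halving_period)

print(price_after(21))  # 3 halvings -> 0.125, i.e. 1/8 of today's price
print(price_after(50))  # roughly 1/141 of today's price over the 50-year horizon
```

The same exponential assumption over the full 50-year horizon discussed here implies a factor of about 140, which is why small changes in the assumed halving period matter a lot for long-range policy claims.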

disheartening roadblocks in the way (utility companies, lack of government will, etc.).

If you think lack of government will or utility companies are the core problem, why focus on changing human neurology? Addressing politics directly is more straightforward.

When it comes to solar power, it might also be that nobody will use any solar panels in 50 years because Craig Venter's algae are just a better energy source. Betting too much on a single card is never good.

Comment author: ricketybridge 14 February 2014 06:51:37PM 0 points [-]

CFAR-style rationality training might sound less impressive than changing people's neurology, but it might be an approach with far fewer ugly side effects.

It's a start, and potentially fewer side effects is always good, but think of it this way: who's going to gravitate towards rationality training? I would bet people who are already more rational than not (because it's irrational not to want to be more rational). Since participants are self-selected, a massive part of the population isn't going to bother with that stuff. There are similar issues with genetic and neurological modifications (e.g. they'll be expensive, at least initially, and therefore restricted to a small pool of wealthy people), but given the advantages over things like CFAR I've already mentioned, it seems like it'd be worth it...

I have another issue with CFAR in particular that I'm reluctant to mention here for fear of causing a shit-storm, but since it's buried in this thread, hopefully it'll be okay. Admittedly, I only looked at their website rather than actually attending a workshop, but it seems kind of creepy and culty--rather reminiscent of Landmark, not least because it's ludicrously, prohibitively expensive (yes, I know they have "fellowships," but surely not that many; and you have to use and pay for their lodgings? wtf?). It's suggestive of mind control in the brainwashing sense rather than rationality. (Frankly, I find that this forum can get that way too, complete with shaming thought-stopping techniques, e.g. "That's irrational!") Do you (or anyone else) have any evidence to the contrary? (I know this is a little off-topic from my question -- I could potentially create a workshop that I don't find culty -- but since CFAR is currently what's out there, I figure it's relevant enough.)

Given how things are going, technology evolves in a way where I don't think we have to fear that we will have no energy when coal runs out. There's plenty of coal around, and green energy evolves fast enough for that task.

You could be right, but I think that's rather optimistic. This blog post speaks to the problems behind this argument pretty well, I think. Its basic gist is that the amount of energy it will take to build sufficient renewable energy systems demands sacrificing a portion of the economy as is, to a point that no politician (let alone the free market) is going to support.

This brings me to your next point about addressing politics instead of neurology. Have you ever tried to get anything changed politically...? I've been involved in a couple of movements, and my god is it discouraging. You may as well try to knock a brick wall down with a feather. It basically seems that humanity is just going to be the way it is until it is changed on a fundamental level. Yes, I know society has changed in many ways already, but there are many undesirable traits that seem pretty constant, particularly war and inequality.

As for solar as opposed to other technologies, I am a bit torn as to whether it might be better to work on developing technologies rather than whatever seems most practical now. Fusion, for instance, if it's actually possible, would be incredible. I guess I feel that working on whatever's practical now is better for me, personally, to expend energy on since everything else is so speculative. Sort of like triage.

Comment author: Snorri 14 February 2014 03:58:35AM 5 points [-]

This along with the semi-regular accounts of downvote abuse makes me question what advantages a +/- system has over a strictly + system. The ego threat of being downvoted seems more like a contribution deterrent than a learning signal. Is there anyone who could explain to me why the current system is better?

Comment author: ricketybridge 14 February 2014 05:30:15AM -1 points [-]

Completely agreed. That's why some subs only do +, no -. I cannot defend the current system. ;-)

Comment author: ChristianKl 14 February 2014 12:42:27AM 0 points [-]

For instance, humans are so facepalmingly bad at making decisions for the long term (viz. climate change, running out of fossil fuels) that it seems clear that genetic or neurological enhancements would be highly beneficial in changing this

I think you underrate the existential risks that come along with substantial genetic or neurological enhancements. I'm not saying we shouldn't go there but it's no easy subject matter. It requires a lot of thought to address it in a way that doesn't produce more problems than it solves.

For example the toolkit that you need for genetic engineering can also be used to create artificial pandemics which happen to be the existential risk most feared by people in the last LW surveys.

When it comes to running out of fossil fuels we seem to do quite well. Solar energy halves costs every 7 years. The sun doesn't shine the whole day so there's still further work to be done, but it doesn't seem like an insurmountable challenge.

Comment author: ricketybridge 14 February 2014 01:39:01AM 0 points [-]

I think you underrate the existential risks that come along with substantial genetic or neurological enhancements.

It's true, I absolutely do. It irritates me. I guess this is because the ethics seem obvious to me: of course we should prevent people from developing a "supervirus" or whatever, just as we try to prevent people from developing nuclear arms or chemical weapons. But steering towards a possibly better humanity (or other sentient species) just seems worth the risk to me when the alternative is remaining the violent apes we are. (I know we're hominids, not apes; it's just a figure of speech.)

When it comes to running out of fossil fuels we seem to do quite well. Solar energy halves costs every 7 years.

That's certainly a reassuring statistic, but a less reassuring one is that solar power currently supplies less than one percent of global energy usage!! Changing that (and especially changing it quickly) will be an ENORMOUS undertaking, and there are many disheartening roadblocks in the way (utility companies, lack of government will, etc.). The fact that solar itself is getting less expensive is great, but unfortunately the changeover from fossil fuels to solar (e.g. phasing out old power plants and building brand new ones) is still incredibly expensive.

Comment author: ChristianKl 14 February 2014 12:01:53AM 2 points [-]

Of course, society as a whole should (and does) work on both of these things. But one individual can really only pick one to make a sizable impact -- or at the very least, one at a time. Which do you guys think may be more effective to work on?

The core question is: "What kind of impact do you expect to make if you work on either issue?"

Do you think there's work to be done in the space of solar power development that people other than yourself aren't already doing effectively? Do you think there's work to be done in terms of better judgment and decision-making that other people aren't already doing?

we need to develop solar power or whatever else before all the oil and coal run out,

The problem with coal isn't that it's going to run out but that it kills hundreds of thousands of people via pollution and that it creates climate change.

I know CFAR is working on that sort of thing, but I'm talking about more genetic or neurological changes)

Why? To me it seems much more effective to focus on more cognitive issues when you want to improve human judgment. Developing training to help people calibrate themselves against uncertainty seems to have a much higher return than trying to do fMRI studies or brain implants.

Comment author: ricketybridge 14 February 2014 01:26:51AM 0 points [-]

The core question is: "What kind of impact do you expect to make if you work on either issue?"

Do you think there's work to be done in the space of solar power development that people other than yourself aren't already doing effectively? Do you think there's work to be done in terms of better judgment and decision-making that other people aren't already doing?

I'm familiar with questions like these (specifically, from 80000 hours), and I think it's fair to say that I probably wouldn't make a substantive contribution to any field, those included. Given that likelihood, I'm really just trying to determine what I feel is most important so I can feel like I'm working on something important, even if I only end up taking a job over someone else who could have done it equally well.

That said, I would hope to locate a "gap" where something was not being done that should be, and then try to fill that gap, such as volunteering my time for something. But there's no basis for me to surmise at this point which issue I would be able to contribute more to (for instance, I'm not a solar engineer).

To me it seems much more effective to focus on more cognitive issues when you want to improve human judgment. Developing training to help people calibrate themselves against uncertainty seems to have a much higher return than trying to do fMRI studies or brain implants.

At the moment, yes, but it seems like it has limited potential. I think of it a bit like bootstrapping: a judgment-impaired person (or an entire society) will likely make errors in determining how to improve their judgment, and the improvement seems slight and temporary compared to more fundamental, permanent changes in neurochemistry. I also think of it a bit like people's attempts to lose weight and stay fit. Yes, there are a lot of cognitive and behavioral changes people can make to facilitate that, but for many (most?) people, it remains a constant struggle -- one that many people are losing. But if we could hack things like that, "temptation" or "slipping" wouldn't be an issue.

The problem with coal isn't that it's going to run out but that it kills hundreds of thousands of people via pollution and that it creates climate change.

From what I've gathered from my reading, the jury is kind of out on how disastrous climate change is going to be. Estimates seem to range from catastrophic to even slightly beneficial. You seem to think it will definitely be catastrophic. What have you come across that makes you so certain?

Comment author: ricketybridge 13 February 2014 06:53:03PM 7 points [-]

I agree. Getting downvoted feels bad man, no matter the reason.

Comment author: ricketybridge 13 February 2014 03:03:45AM 1 point [-]

Since people were pretty encouraging about the quest to do one's part to help humanity, I have a follow-up question. (Hope it's okay to post twice on the same open thread...)

Perhaps this is a false dichotomy. If so, just let me know. I'm basically wondering if it's more worthwhile to work on transitioning to alternative/renewable energy sources (i.e. we need to develop solar power or whatever else before all the oil and coal run out, and to avoid any potential disastrous climate change effects) or to work on changing human nature itself to better address the aforementioned energy problem in terms of better judgment and decision-making. Basically, it seems like humanity may destroy itself (if not via climate change, then something else) if it doesn't first address its deficiencies.

However, since energy/climate issues seem pretty pressing and changing human judgment is almost purely speculative (I know CFAR is working on that sort of thing, but I'm talking about more genetic or neurological changes), civilization may become too unstable before it can take advantage of any gains from cognitive enhancement and such. On the other hand, climate change/energy issues may not end up being that big of a deal, so it's better to just focus on improving humanity to address other horrible issues as well, like inequality, psychopathic behavior, etc.

Of course, society as a whole should (and does) work on both of these things. But one individual can really only pick one to make a sizable impact -- or at the very least, one at a time. Which do you guys think may be more effective to work on?

[NOTE: I'm perfectly willing to admit that I may be completely wrong about climate change and energy issues, and that collective human judgment is in fact as good as it needs to be, and so I'm worrying about nothing and can rest easy donating to malaria charities or whatever.]

Comment author: Creutzer 13 February 2014 12:04:32AM 0 points [-]

SSRIs, for example, aren't supposed to do anything more than make you feel not completely miserable and/or freaked-out all the time. They are generally known to not actually make you happy and to not increase one's capability for enjoyment. If you are on one, and if that's a problem, you might actually want to look at something more stimulant-like, i.e. Bupropion. (There isn't really another antidepressant that does this, and it seems unlikely you'll manage to convince your psychiatrist to prescribe e.g. amphetamines for depression, even though they can work.)

And then there is, of course, all sorts of older and "dirtier" stuff, with MAOIs probably being something of a last resort.

Comment author: ricketybridge 13 February 2014 12:32:15AM 0 points [-]

Yeah, that accurately describes their effect on me.

I used to be on Bupropion, but it had unpleasant physical effects on me (i.e. heart racing/pounding, which makes sense, given that it's stimulant-like) without any noticeable mood effects. I was quite disappointed, since a friend of mine said he practically had a manic episode on it. However, I took it in conjunction with an SNRI, so maybe that wouldn't have happened if I'd just taken it on its own.... Idk.

I'm actually surprised my psychiatrist hasn't recommended an MAOI to me in that case, since she freaks the hell out when I say I'm suicidal, and I've done so twice. I'll put MAOIs at the bottom of my aforementioned new to-do list. :)

Comment author: btrettel 12 February 2014 11:03:45PM 0 points [-]

If there are tons more different medications to try as you assert, none of my psychiatrists seem to know about it.

Wikipedia lists many. I count 21 categories alone. I would suggest reading at least a bit about how these drugs work to get some indication of what could work better. Then, you can go to your psychiatrist and discuss what you've learned. Something outside of their standard line of treatment may be unfamiliar to them, but it may suit you better.

For my last antifungal treatment, I specifically asked for something different from what I had used before and I provided a list of antifungal meds I tried, all of which were fairly standard. My doctor spent a few minutes doing some searches on their computer and came back with what ultimately worked.

Comment author: ricketybridge 12 February 2014 11:31:26PM 0 points [-]

Huh, interesting. Up-managing one's doctor seems frowned upon in our society -- since it usually comes in the form of asking one's doctor for medications mentioned in commercials -- but obviously your approach seems much more valid. Kind of irritating, though, that doctors don't appear to really be doing their job. :P

The exchange here has made me realize that I've actually been skipping my meds too often. Heh.... :\ So if I simply tighten that up, I will effectively increase my dosage. But if that doesn't prove to be enough, I'll go the route you've suggested. Thanks! :)

Comment author: Emile 12 February 2014 10:15:42PM 4 points [-]

EA is Effective Altruism.

Comment author: ricketybridge 12 February 2014 10:31:10PM 0 points [-]

Ah, thanks. :)

Comment author: Slackson 12 February 2014 08:49:54AM 4 points [-]

If I live forever, through cryonics or a positive intelligence explosion before my death, I'd like to have a lot of people to hang around with. Additionally, the people you'd be helping through EA aren't the people who are fucking up the world at the moment. Plus there isn't really anything directly important to me outside of humanity.

Parasite removal refers to removing literal parasites from people in the third world, as an example of one of the effective charitable causes you could donate to.

Comment author: ricketybridge 12 February 2014 09:23:06PM 0 points [-]

EA? (Sorry to ask, but it's not in the Less Wrong jargon glossary and I haven't been here in a while.)

Parasite removal refers to removing literal parasites from people in the third world

Oh. Yes. I think that's important too, and it actually pulls on my heart strings much more than existential risks that are potentially far in the future, but I would like to try to avoid hyperbolic discounting and try to focus on the most important issue facing humanity sans cognitive bias. But since human motivation isn't flawless, I may end up focusing on something more immediate. Not sure yet.
