Comment author: Larks 01 July 2015 01:00:36AM 8 points [-]

saving lives and saving souls are nearly equally important.

If souls actually exist (and could go to heaven and hell) then saving souls is far more important than saving lives! Your disagreement with them is surely not about relative importance, it is about ontology.

Comment author: A11AF82 01 July 2015 01:30:29PM -6 points [-]

If.

Comment author: A11AF82 29 June 2015 04:03:14PM 1 point [-]
Comment author: Eugine_Nier 25 March 2014 08:11:00AM 2 points [-]

Huh? Jack said that there are two sides to the ledger with respect to tobacco. He didn't say which side would necessarily prevail in this case. Furthermore, there is no reason why the side that's stronger for one drug is necessarily stronger for another.

Comment author: A11AF82 25 March 2014 09:50:23AM -2 points [-]

Jack said that there are two sides to the ledger with respect to tobacco

And I replied there were similarly two sides to the ledger with respect to many other drugs.

He didn't say which side would necessarily prevail in this case.

Neither did I.

Furthermore, there is no reason why the side that's stronger for one drug is necessarily stronger for another.

Are you saying that out of all currently illegal drugs, not even one would have a profile similar to tobacco's?

Comment author: Jack 25 March 2014 06:46:28AM 5 points [-]

A lot of industries are going to look really bad if you only score one side of the ledger. Given that a huge number of people continue to smoke and enjoy it, despite knowing the negative implications for their health, it seems reasonable to assume that tobacco companies supply the world with a great deal of utility, in addition to the lung cancer.

Comment author: A11AF82 25 March 2014 08:01:57AM *  5 points [-]

This would likely be true of many other (hard) drugs if there had been a history of legally selling them instead of nipping their markets in the bud. In fact, this would probably be true of wireheading too if it were practical, and ultimately of orgasmium. Willing to bite that bullet?

Comment author: Will_Newsome 16 June 2013 07:59:24PM *  3 points [-]

So basically, my hypothesis is that the reason intelligent people are so often socially clumsy is that it's a facade, a self-imposed handicap they keep up because evolution has programmed us to feel repulsion toward unfairly manipulating others. Because they can make others do anything, they choose to do nothing. This manifests as being easily led, a kind of "doormat", even lacking their own will or ego.

This is more or less true of me, though I don't know why evolution is being singled out etiologically; it seems like even a blank-slate mind could learn to be inhibited this way. Anyhow, I avoid meeting people's eyes or generally looking at people's faces because my automatic inclination is to nod along, smile, make them feel like I'm their ally &c., even when I don't actually agree with them or think what they're doing or saying is right. Like when someone tells a self-deprecating joke and expects you to smile or laugh, or when they fish for a compliment, or when they tell you about something they think is important that you don't think is important. Those are obvious examples that everyone notices, but human conversation is chock-full of subtler games that are harder to be reflective about and to have bright-line rules for. You either implicitly lie to them or you constantly disappoint them. This is extremely salient to me because I'm abnormally good at reading people's facial expressions. Not meeting their eyes and being generally evasive is a way to keep myself honest. I still stand by this decision, even if it means constantly handicapping my status, my attractiveness, and generally my life. Integrity is important.

Comment author: A11AF82 19 June 2013 03:36:10AM 2 points [-]

Go one step further. Does that compulsion to agree and be mellow with people you don't agree with mean you're a dishonest jerk who's trying to manipulate them, or does it mean you're not comfortable disagreeing with them (for instance because you feel they'd reject you if you did, which might be painful, or because you don't want to hurt their feelings, or because you feel you don't have the social status to do that)? Don't necessarily assume you're evil.

For instance, I know I hate lying mostly because I feel insecure enough to think I wouldn't get away with it. This stems from a difficulty putting myself in others' shoes: if I know how and what I lied about, then surely I can imagine many ways in which they'll eventually uncover my lie.

Another reason I may come to dislike social relationships is that I harbor few illusions about human motivation and drives. Most interactions with people can be interpreted as manipulation to get your way, if you try hard enough to see them that way. My issue, however, is that I don't want to have such a relationship with others. I naively crave a natural, hassle-free relationship where I like others and am liked unconditionally. So whenever I think in terms of what strings I need to pull to move others, I feel bad about it, because I don't want to have a relationship with puppets; I want a relationship with real people. Yet I can't exactly believe relationships are magical in that way: nothing is free or unconditional, and there are definite winning and losing moves in social relationships. So I'm torn between what I want (not overanalyzing stuff and just getting along with people) and what I believe (that if I don't do that analysis, I may fail at being adequately social).

Comment author: CarlShulman 05 September 2012 03:22:58AM 2 points [-]

Transhumanism is a reason some folk visit here despite being less interested in epistemic rationality as such, so I would expect an enrichment of self-identified transhumanists among the authors of badly received posts. It's also a political tribe for a number of people, so there is room for mind-killing.

However, I'd be curious to see your links to examples.

Comment author: A11AF82 05 September 2012 04:26:50AM *  3 points [-]

Quite a few organizations rooted in transhumanism have attempted (whether successfully is questionable) to distance themselves from the crazy-sounding (to a mainstream audience) plain description of their original goals and beliefs, in an effort to attract more and better-quality funding and following (such as from academia).

Compare:

Longecity, formerly The Immortality Institute

Humanity+, formerly The World Transhumanist Association

The Singularity Institute remains so named, but seems willing to follow suit.

I think I'm observing an emerging pattern where several loaded topics, such as transhumanism and cryonics, have become much more controversial and unfashionable in places which previously championed them. LessWrong is no exception: there's been concern that such topics may not have their place on a forum devoted to rationality.

You appear to express this connection (transhumanism being unfashionable) yourself in this sentence: "Denotationally crazy political (namely, transhumanist) rhetoric."

Comment author: Slider 04 September 2012 08:03:23PM 3 points [-]

Downvoted for roundaboutness

* Pros for Up: spreading rationality, positive news on progress
* Pros for Down: conflicting use of strength buildup and bypass, low focus on the title issue, post category confusion (is this an advert, news report, job offering, or methodology query?), low topic cohesion: transhumanism is a tack-on

Comment author: A11AF82 05 September 2012 02:25:51AM 1 point [-]

Is it me, or has transhumanism become a taboo word associated with low-status crackpots around here?

Comment author: Eliezer_Yudkowsky 17 August 2012 12:23:56AM 3 points [-]

I strongly suspect that the primary result of such an algorithm would be very wide error bars on the timeline, and that it would indeed outperform most experts for this reason. You can't get water from a stone, nor narrow estimates out of ignorance and difficult problems, no matter what simple algorithm you use. Though I would be quite intrigued to be proven wrong about this; I have seen Fermi estimates for quantities like e.g. the mass of the Earth apparently extract narrow and correct estimates out of the sums of multiple widely erroneous steps.
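That last claim can be illustrated with a quick Monte Carlo. This is a minimal sketch, not anything from the original discussion: it assumes each step of a Fermi estimate carries an independent log-normal multiplicative error, so the errors add in log-space and the combined log-error grows only as sqrt(n) rather than n, which is why the final product can land near the truth even when every step is widely off.

```python
import math
import random

random.seed(0)

def fermi_estimate(true_factors, log_sd=1.0):
    """Multiply a noisy estimate of each factor; each individual
    estimate carries a log-normal multiplicative error, i.e. it is
    routinely off by a factor of e (~2.7) or more."""
    estimate = 1.0
    for f in true_factors:
        estimate *= f * math.exp(random.gauss(0.0, log_sd))
    return estimate

# Five hypothetical factors whose true product is 1e10.
factors = [10.0, 100.0, 1000.0, 10.0, 1000.0]
true_product = math.prod(factors)

# Ratio of each trial's estimate to the truth, over many trials.
ratios = sorted(fermi_estimate(factors) / true_product
                for _ in range(10_000))
median_ratio = ratios[len(ratios) // 2]
# The median combined estimate lands close to the truth even though
# every individual step was widely erroneous.
```

With five steps the combined log-standard-deviation is sqrt(5), about 2.24, not 5, and the median of the trial ratios comes out near 1; errors in opposite directions partially cancel.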

Comment author: A11AF82 17 August 2012 02:18:16AM *  6 points [-]

I have seen Fermi estimates for quantities like e.g. the mass of the Earth apparently extract narrow and correct estimates out of the sums of multiple widely erroneous steps.

Out of how many wrong/wide estimates using the same method?