Procedural knowledge gap: public key encryption

3 Solvent 12 January 2012 07:35AM

In the spirit of uncovering procedural knowledge gaps, I'd like to know how to use public key encryption.

Is there some website which generates public and private keys, and lets you encode and decode according to those keys?

I'd love it if there were some way I could send my encoded text via IM or email, and have it decoded as easily as we do with rot13. Is there some way of doing this?

Currently, I encrypt things using TrueCrypt, but I can't use that to communicate with people without securely establishing a shared key beforehand.

Does anyone know how to do this?
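
(For anyone with the same question: the standard tool for this is OpenPGP, e.g. GnuPG, which generates keypairs and encrypts/decrypts email. The underlying idea can be sketched with textbook RSA in a few lines of Python; the numbers below are toy parameters for illustration, completely insecure, just to show the mechanics.)

```python
# Textbook RSA with tiny toy primes -- for illustration only, NOT secure.
# Real usage needs large random primes and padding; use a tool like GnuPG.

p, q = 61, 53            # two (tiny) primes; kept private
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # 3120, used to derive the private exponent
e = 17                   # public exponent: (e, n) is the public key
d = pow(e, -1, phi)      # private exponent via modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    """Anyone can encrypt using only the public key (e, n)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
```

The point is the asymmetry: you publish (e, n), keep d secret, and anyone can send you ciphertext that only you can read, with no shared secret established beforehand.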

A variant on the trolley problem and babies as unit of currency

9 Solvent 08 January 2012 08:13AM

I was discussing utilitarianism and charitable giving and similar ideas with someone today, and I came up with a hybrid of the trolley problem (particularly the fat man variation) and the article by Scott Siskind/Yvain about using dead children as a unit of currency. It's not extremely original, and I'd be surprised if no one on LW had thought of it before.

You are offered a magical box. If you press the button on the box, one person somewhere in the world dies, you get $6,000, and $4,000 is donated to one of the top-rated charities on GiveWell.org. At the $800-per-life-saved figure, this gift would save five lives ($4,000 / $800), which is a net gain of four lives plus $6,000 to you. Is it moral to press the button?

All of the usual responses to the trolley problem apply. To wit: it's good to have heuristics like "don't kill." There are arguments about establishing Schelling points with regard to not killing people. (The Schelling point argument doesn't work as well in a case like this, with anonymity, privacy, and randomization of who gets killed.) Eliezer argued that a human actually being in the trolley problem is extraordinarily unlikely, and that while killing the fat man would be appropriate for an AI in that situation, it wouldn't be for a human.

There are also lots of arguments against giving to charity. See here for some discussion of this on LessWrong.

I feel that the advantage of my dilemma is that the original case for extreme altruism faces a whole lot of motivated cognition against it, because it implies that you should be giving much of your income to charity. In this dilemma, you want the $6,000, and so are inclined to be less skeptical of the charity's effectiveness.

Possible use: Present this first, then argue for extreme altruism. This would annoy people, but as far as I can tell, pretty much everyone gets defensive and comes up with a rationalization for their selfishness when you bring up altruism anyway.

What would you people do?

EDIT: This $800 figure is probably out of date. $2000 is probably more accurate. However, it's easy to simply increase the amount of money at stake in the thought experiment.

Edit 2: I fixed some swapped-around values, as kindly pointed out by Vaniver.

An argument that animals don't really suffer

6 Solvent 07 January 2012 09:07AM

I ended up reading this article about animal suffering by the Christian apologist William Craig. Forgive the source, please.

In his book Nature Red in Tooth and Claw, Michael Murray explains on the basis of neurological studies that there is an ascending three-fold hierarchy of pain awareness in nature:

Level 3: Awareness that one is oneself in pain
Level 2: Mental states of pain
Level 1: Aversive reaction to noxious stimuli

Organisms which are not sentient, that is, have no mental life, display at most Level 1 reactions. Insects, worms, and other invertebrates react to noxious stimuli but lack the neurological capacity to feel pain. Their avoidance behavior obviously has a selective advantage in the struggle for survival and so is built into them by natural selection. The experience of pain is thus not necessary for an organism to exhibit aversive behavior to contact that may be injurious. Thus when your friend asks, “If you beat an animal, wouldn't it try to avoid the source of pain so that way 'it' wouldn't suffer? Isn't that a form of 'self-awareness?'," you can see that such aversive behavior doesn’t even imply second order pain awareness, much less third order awareness. Avoidance behavior doesn’t require pain awareness, and the neurological capacities of primitive organisms aren’t sufficient to support Level 2 mental states.

Level 2 awareness arrives on the scene with the vertebrates. Their nervous systems are sufficiently developed to have associated with certain brain states mental states of pain. So when we see an animal like a dog, cat, or horse thrashing about or screaming when injured, it is irresistible to ascribe to them second order mental states of pain. It is this experience of animal pain that forms the basis of the objection to God’s goodness from animal suffering. But notice that an experience of Level 2 pain awareness does not imply a Level 3 awareness. Indeed, the biological evidence indicates that very few animals have an awareness that they are themselves in pain.

Level 3 is a higher-order awareness that one is oneself experiencing a Level 2 state. Your friend asks, “How could an animal not be aware of their suffering if they're yelping/screaming out of pain?" Brain studies supply the remarkable answer. Neurological research indicates that there are two independent neural pathways associated with the experience of pain. The one pathway is involved in producing Level 2 mental states of being in pain. But there is an independent neural pathway that is associated with being aware that one is oneself in a Level 2 state. And this second neural pathway is apparently a very late evolutionary development which only emerges in the higher primates, including man. Other animals lack the neural pathways for having the experience of Level 3 pain awareness. So even though animals like zebras and giraffes, for example, experience pain when attacked by a lion, they really aren’t aware of it.

To help understand this, consider an astonishing analogous phenomenon in human experience known as blind sight. The experience of sight is also associated biologically with two independent neural pathways in the brain. The one pathway conveys visual stimuli about what external objects are presented to the viewer. The other pathway is associated with an awareness of the visual states. Incredibly, certain persons, who have experienced impairment to the second neural pathway but whose first neural pathway is functioning normally, exhibit what is called blind sight. That is to say, these people are effectively blind because they are not aware that they can see anything. But in fact, they do “see” in the sense that they correctly register visual stimuli conveyed by the first neural pathway. If you toss a ball to such a person he will catch it because he does see it. But he isn’t aware that he sees it! Phenomenologically, he is like a person who is utterly blind, who doesn’t receive any visual stimuli. Obviously, as Michael Murray says, it would be a pointless undertaking to invite a blind sighted person to spend an afternoon at the art gallery. For even though he, in a sense, sees the paintings on the walls, he isn’t aware that he sees them and so has no experience of the paintings.

Now neurobiology indicates a similar situation with respect to animal pain awareness. All animals but the great apes and man lack the neural pathways associated with Level 3 pain awareness. Being a very late evolutionary development, this pathway is not present throughout the animal world. What that implies is that throughout almost the entirety of the long history of evolutionary development, no creature was ever aware of being in pain.

He continues the argument here.

How decent do you think this argument is? I don't know where to look to evaluate the core claim, as I know very little neuroscience myself. I'm quite concerned about animal suffering, and choose to be vegetarian largely on the basis of that concern. How much should my decision on that be affected by this argument?

EDIT: David_Gerard wins by doing the basic Google search that I neglected. It seems that the argument is flawed. In particular, animals apart from primates do have prefrontal cortices.

I'd like to talk to some LGBT LWers.

4 Solvent 30 December 2011 10:39AM

When _ozymandias posted zir introduction post a few days ago, I went off and binged on blogs from the trans/men's rights/feminist spectrum. I found them absolutely fascinating. I've always had lots of sympathy for transgendered people in particular, and care a lot about all those issues. I don't know what I think of making up new pronouns, and I get a bit put off by trying to remember the non-offensive terms for everything. For example, I'm sure that LGBT as a term offends some people, and I agree that lumping the T in with the LGB is a bit dubious, but I don't know any other equivalent term that everyone will understand. I'm going to keep using it.

However, I don't currently know any LGBT people who I can talk to about these things. In particular, the whole LGBT and feminist and so on community seems to be prone to taking unnecessary offense, and believing in subjectivism and silly things like that.

So I'd really like to talk with some LWers who have experience with these things. I've got questions that I think would be better answered by an IM conversation than by just reading blogs.

If anyone wants to have an IM conversation about this, please message me. I'd be very grateful.

EDIT: Wow, that's an amazing response. Thank you all for your kind offers. I'll talk to as many of you as I can get around to.

Is quantum physics (easily?) computable?

4 Solvent 18 October 2011 08:44AM

So I've been trying to read the Quantum Physics sequence. I think I've understood about half of it; I've been rushed, and haven't really sat down and worked through the math. So I apologize in advance for any mistakes I make here.

It seems like classical mechanics with quantized time is really easy to simulate with a computer: at every step, you calculate the force, update the velocity accordingly, and then add the resulting change in position to the old position.
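
That step-by-step scheme fits in a few lines. A minimal sketch (the parameters here are made up for illustration: a unit mass on a unit spring):

```python
# Stepwise simulation of classical mechanics: a mass on a spring (F = -k*x).
# Parameters are invented for illustration.

def simulate(x, v, k=1.0, m=1.0, dt=0.01, steps=1000):
    """Advance position x and velocity v through `steps` time steps of size dt."""
    for _ in range(steps):
        f = -k * x         # 1. calculate the force
        v += (f / m) * dt  # 2. update the velocity
        x += v * dt        # 3. add the change in position
    return x, v

x, v = simulate(1.0, 0.0)  # released from rest at x = 1
```

(Updating the position with the already-updated velocity, as here, is the "semi-implicit Euler" variant; it stays stable for oscillating systems, whereas naive Euler slowly gains energy.)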

Then when you move to relativity, it seems like it's suddenly a lot harder to implement. Whereas classical mechanics is easy on a computer, it seems to me that you would have to set up a system where the outcomes of relativity are stated explicitly, while the classical outcomes fall out implicitly.

The same thing seems to occur, only more so, with quantum physics. Continuous wave functions seem far harder than discrete particles. Similarly, the whole idea of "no individual particle identity" seems more difficult, although as I think about it now, I suppose this could be because the whole concept of particles is naive.

It doesn't seem like the computation rules simply get harder as we learn more physics. After all, trying to do thermal physics got a lot easier when we started using the ideal gas model.

Also, it's not just that ever-improving theories must be ever more difficult to implement on a computer. Imagine that we lived inside Conway's Game of Life. We would figure out all kinds of high-level physics, which would probably be far more complex than the underlying B3/S23 rule they would eventually discover.
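
The asymmetry is easy to see concretely: the B3/S23 rule itself fits in a few lines, while the "high-level physics" of gliders and oscillators it produces takes books to describe. A minimal sketch of one update step, with live cells stored as a set of coordinates:

```python
from itertools import product

def step(alive):
    """One generation of Conway's Life under B3/S23.

    B3: a dead cell with exactly 3 live neighbours is born.
    S23: a live cell with 2 or 3 live neighbours survives.
    """
    counts = {}
    for x, y in alive:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A "blinker" oscillates between a horizontal and a vertical bar:
blinker = {(0, 0), (1, 0), (2, 0)}
assert step(step(blinker)) == blinker
```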

It feels like the actual implemented physics shouldn't much affect how computation works. After all, we live in a quantum universe and classical physics is still simpler to compute.

Is there any value to this speculation?

The effects of religion (draft)

-5 Solvent 29 September 2011 01:11AM

I've written this essay about the effects of religion, and I plan to post it in the main section. However, in my year or so of lurking here I've noticed that most people's first Main posts are either irrelevant or bad, and end up getting the poster downvoted to oblivion. So I'm posting it here first, for your critique, and to ask whether you think it's appropriate for the main section of LessWrong.

The effects of religion

In the atheist community, it's held as pretty much a self-evident truth that religion is a bad thing. I have attempted to produce a taxonomy of the effects of religion, both positive and negative. This is written based on my personal experience of the Christian church, and on whatever actual facts I could find.

So: my list of the external effects of religion. These are given as comparisons both to normal people, who don't think much about religion or effective charity, and to LessWrongians.

It's worth pointing out that western society seems to have a lot of cached thoughts from Christianity. Normal people are often not Christian, but casually believe a lot of its teachings. As a result, many of the negative effects of Christianity also affect non-Christians who don't pay particular attention to their beliefs.


The purpose of this essay is to determine if LessWrong should actively evangelize against religion. If we really wanted to, we could probably do so fairly easily. I conclude that it's probably not worthwhile doing so.


Charitable giving

The Christians I know all seem to give far more to charity, both in terms of money and time, than average people. Eliezer pointed this out somewhere, but I can't find a reference. The giving probably isn't quite optimized, but it's far better than nothing. A large proportion of the charity which the Christians I know support seems highly effective, and very little of it is optimized for evangelism alone.

It could be that by coincidence I just happen to know particularly effective Christians.

It's worth considering the degree to which Christians and atheists disagree on what charities are worth supporting. The only things Christians support which atheists wouldn't are things like school chaplains, Bible distribution, and lobbying on Christian issues like opposing gay marriage and abortion. Ridiculous amounts are certainly spent on pointless lobbying, but as a proportion of total Christian charitable giving, it can't be that large.

I don't think there are any atheist organizations which provide enough peer pressure to make members give as much, and I don't think many people would be as generous over the long term alone as they are in church groups. So it's an open question to me whether your average LessWrongian would do more or less good via charity than your average Christian.

Aside from this, I think many Christians are fairly good at making an effort to be casually kind to those around them: at the very least they aren't as casually cruel as normal people can be. I expect that LessWrongians would be about as good as Christians at this.


Time and money spent on religion

Religious people spend time and money on religious materials, prayer, churches, and so on. The effect of this is probably neutral compared to what a normal person would otherwise be doing, as prayer, theology books, and such seem to be fairly ineffective but not downright negative. Again, I don't really know what normal people do with their time, but I doubt it would be any worse than anything religious. However, this is something LessWrongians would surely do better at, as they could hopefully spend their time learning useful things or entertaining themselves in more meaningful ways.

No cryonics, attitude to death

One of the most important messages I've gotten out of Less Wrong and similar sources is that death is bad. However, religious people disagree, as a result of their belief in an afterlife. I don't know how much this actually matters. Religious people are highly unlikely to sign up for cryonics. However, according to the survey, more LessWrongians are theist than have signed up for cryonics, so I don't think this effect matters much.

For some bizarre reason, all the issues like abortion, the death penalty, bombing civilians in Arab countries, and euthanasia, where belief in a Christian afterlife would seem to me to encourage a left-leaning viewpoint, are also issues where the western church leans to the right. (For example, I'm not quite sure why anyone who believed in hell for nonbelievers would support war against Muslims.) So all these issues where you'd think their religious beliefs would throw them off, it seems more like their conservatism screws up their reasoning. Correlation not causation.

I would also expect most theists to value their own lives, and those of people in their religion, far less highly than those of people with different or no religions. This would be a minor problem; however, it doesn't seem to come up at all in the real world.

Practice and or condoning of irrationality

Practicing things like faith (believing things you have little evidence for), and treating this as a sign of virtue, is bad for rationality in other areas of your life. In this section I'm not talking about actual incorrect statements made by the religion. There's nothing in Christianity that explicitly says, for example, that wishing for things and believing you'll get them means that you will. However, Christian thinking implicitly gets your mind used to a world with meaning, sense, and your belief as a determining factor.

In particular, the mind projection fallacy is encouraged by religion. When you believe that there is an omniscient being who controls life, you're encouraged to see patterns where there are none, and to see God's character in random events. This is bad.

Also, the central thesis of reductionism, that everything is composed of ontologically simple elements, is contradicted by religion.

In Christianity at least, you're told that you need sufficient faith in order to pray successfully. This causes lots of rationality problems: it's one of the axioms of LessWrong-style rationality that what you think does not directly affect the world, and if you're religious, you don't believe that. It leads to things like "believing as hard as you can" and so on.

Finally, religion encourages the just world hypothesis, as a result of belief in a benevolent creator. In Christianity, you can always say "But God made it that way" if you support something. This isn't encouraged by the Bible at all, but people still do it.

Actual factual errors in the religion

Obviously, people who believe in a revealed religion are going to walk around wrong about a lot of factual matters. So how many of these actually matter? Things like the power of prayer probably don't matter, as all the Christians I've ever met seem to take medicine and make health decisions just like the next person.

Believing in creation has a few effects. Firstly, it encourages people to believe that we are well designed. This makes them less likely to accept the idea of cognitive biases. It makes them skeptical of evolutionary psychology, which is bad.

People may get some silly moral ideas, like opposition to homosexuality. But this is decreasing in prevalence, as shown for example by the existence of Christian support for gay marriage and abortion rights.

Deontology

Consequentialism is pretty much common sense. However, most religions are phrased in terms of deontology. (This is actually a problem that Christianity doesn't need to have: Paul's comment "Everything is permissible, but not everything is beneficial" seems to be as clearly in favor of consequentialism as you're likely to get. Nevertheless, very few Christians seem to get this.) This frequently results in stupid beliefs, like support of the death penalty, and things such as drug use being "just wrong." However, most normal people seem to default to deontology anyway, so it's hard to say that religion directly causes this problem.

Social pressures

Most Christians would be upset by how frankly LessWrong calls them idiots. As a result, they don't get many of the benefits of reading LW-type materials. More generally, religious people are going to dislike and mistrust science to a greater extent. There's a lot of benefit to be gained from understanding and trusting science, for example on issues such as climate change.

They're also going to be discouraged from hanging around the intellectual types of people who are otherwise good for you. If you only read Christian media, you're exposed to a far narrower range of media, and you're more susceptible to the general conservative bias which pervades Christianity.

The effect of religious community

Many studies have shown that religiosity correlates with happiness and health. LessWrong seems to have a general consensus that this is a result of the community created by religion. Compared to the default position of a normal person, it's way better to be a churchgoer. It remains to be seen whether LessWrong groups can be as effective, though cases such as the New York Less Wrong group seem to be working fairly well from what I've heard.

Established religions have an extreme advantage over new organizations such as LessWrong chapters. To start with, they are already large and powerful. There aren't many places where there are enough rationalists to start something like the New York Less Wrong group (notice it's in New York). The people who are drawn to LessWrong are possibly in the wrong demographic proportions to create lasting communities, particularly with an excess of young males. It's been previously pointed out that getting girls to show up is essential for LessWrong meetups and communities. So it's hard to get rationalist communities going which can rival religious communities' consistency.

There aren't many organizations like the general Christian church, which provide such a wide ranging base of peer support.


Conclusion

Religion seems to have a variety of positive and negative effects. Its most positive effects are encouraging charity and providing a stable community. The most negative effects are a general mistrust of science, and the various irrationalities which are applauded by religion.

And so, what should a LessWrongian do with respect to religious people? I think we should be polite to them. Religion doesn't have a bad enough effect to justify arguing against it. Even if by some chance you do argue someone out of their faith, they'll most likely just default to normal-person mode and keep the cached thoughts of their religion.

Additionally, by arguing with religious people, you make them distrust science, intellectuals, and rationality. This is significantly more of a problem than the religion itself. Because of "arguments as soldiers," a religious person might start always looking for cases of science being wrong, and never listening to it, because listening to science would mean betraying their faith. This is very, very bad, far worse than merely compartmentalizing one's beliefs.

I recommend a policy of "raising the sanity waterline": just casually improving everyone's rationality would be a far more effective goal. It doesn't look like being religious significantly affects people's mental abilities in other fields: look at the proportion of Nobel prize winners who are religious.

There's little upside in specifically attempting to evangelize theists, so I suggest we shouldn't.

Your favorite pdfs?

4 Solvent 18 September 2011 11:18AM

I just got an iPad, which means that for the first time in my life it's convenient for me to read long documents from the internet. I've often looked at things and thought that if I ever had a good way to read them, I should. However, I haven't kept a list.

So. What documents in pdf form, or long websites, should I read now that I can? Any recommendations?

EDIT: Thanks for the advice. I'm particularly excited to read Gödel, Escher, Bach. I've also downloaded the ePub of the Sequences, and I'm certainly going to look at Strategy of Conflict. Thank you very much for the links.

For fiction: How could alien minds differ from human minds?

9 Solvent 21 August 2011 10:37AM

One of the most important points raised by the Sequences is that not all minds are like human minds. In quite a few places, people have discussed minds with slight changes from human minds which seem altogether different. However, a lot of this discussion has been related to AI, as opposed to minds created by evolution. I'm trying to think of ways that minds which evolved, and are effective enough to start a civilization, could differ from humans'.

Three Worlds Collide would seem like an excellent starting point, but isn't actually very useful. As far as I recall, the Babyeaters might have learned their baby eating habits as a result of societal pressure. The main difference in their society seemed to be the assumption that people who disagreed with you were simply mistaken: this contrasts to humans' tendency to form rival groups, and assume everyone in the rival groups is evil. The Super-Happies had self modified, and so don't provide an example of an evolved mind.

So here are my ideas so far.

  • A species could have the same neural pathways for wanting and liking. This would lead to far less akrasia.
  • A species could have a different set of standards for boredom. This seems to be one of the most precarious values in the human mind.

What other ways can you think of?
