Now you're getting it completely wrong. You can't add up the harm of dust specks when it happens to different people. Every individual has the capacity to recover from it. Think about it: with that logic, it would be worse to rip a hair from every living being in the universe than to nuke New York. If the people in charge reasoned that way, we might have Armageddon in no time.
That's ridiculous. So mild pains don't count if they're done to many different people?
Let's give a more obvious example. It's better to kill one person than to amputate the right hands of 5000 people, because the total pain will be less.
Scaling down, we can say that it's better to amputate the right hands of 50,000 people than to torture one person to death, because the total pain will be less.
Keep repeating this in your head (see how consistent it feels, how it makes sense).
Now just extrapolate to the case where it's better to have 3^^^3 people get dust specks in their eyes than to torture one person to death, because the total pain will be less. The hair-ripping argument isn't good enough, because (people on Earth) × (pain from a hair rip) < (people in New York) × (pain of being nuked). The math doesn't add up in your straw-man example, unlike in the actual example given.
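Under the additive view being argued for here, both comparisons are simple multiplication. A minimal sketch of that arithmetic, where every pain magnitude and population figure is a pure assumption chosen only to illustrate the shape of the argument:

```python
# Illustrative sketch of additive aggregation of harm.
# All "pain unit" magnitudes below are invented assumptions, not measurements.

def total_pain(people, pain_per_person):
    """Total pain under simple additive aggregation across people."""
    return people * pain_per_person

# Hypothetical magnitudes (assumptions for illustration only):
DUST_SPECK = 1e-9   # barely noticeable
HAIR_RIP   = 1e-6   # trivially small
NUKE_DEATH = 1e6    # per person in the blast
TORTURE    = 1e9    # torture to death, one person

# The straw-man comparison: hair rips for everyone on Earth vs. nuking NYC.
# Even additively, the hair rips total less, so the analogy fails.
assert total_pain(8e9, HAIR_RIP) < total_pain(8e6, NUKE_DEATH)

# The actual comparison: 3^^^3 is far too large to compute, so 10**100
# stands in as a (vastly smaller) lower bound for the illustration.
assert total_pain(10**100, DUST_SPECK) > total_pain(1, TORTURE)
```

The point is only structural: if harms add across people, a sufficiently large multiplier dominates any fixed single-person harm, which is exactly why the hair-rip example, with its much smaller multiplier, does not carry over.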
As a side note, you are also appealing to consequences.
It would be really helpful if the author explained what his point with this story was, because there are several possible interpretations. My guess is that the moral is that Eliezer Yudkowsky is afraid of AI.
The point is that to an AI, we are but massive, stupid beings attempting to teach it minor symbols with a massive overuse of resources (the few lines of code used to define "rock" could be used by a sufficiently powerful UFAI to, say, manufacture nukes).
I'm there with one other person. Check the LessWrong Singapore Google group for any future updates. https://groups.google.com/forum/m/#!topic/lesswrong-singapore/cXtHTMQO4xw
Finally, a Singapore meetup! Will definitely be there.
For me, the problem with this is that if I'm speaking to an autistic person (and a very large number of LWers identify as being on the autistic spectrum), they tend to use literal meanings very often. In fact, some of them (including me) get offended or confused when they say something literal and it is interpreted as sarcasm or subtext.
Suppose I am speaking to an autistic person, and he says, "I am 87% confident that X is true." The issue with this statement is that a lot of people use this sort of statement metaphorically (i.e., they pull the number out of thin air to make it oddly specific and get a cheap laugh), but an autistic or rationality-trained person may literally mean that they are 87% sure it is true, especially if they are good at gauging their own confidence levels. In this case, the usual assumption, that the number was picked randomly, is false.
There are also, however, a large number of statements that are almost always meant sarcastically or non-literally. The statement "I, for one, welcome our new alien overlords" is almost always sarcastic, because 1) it invokes a well-known meme which is intended to be used in this manner, and 2) it is extremely unlikely that the person I am speaking to actually wants aliens to take over the world. These statements are, for want of a better word, "cached non-literal statements" (as in, it is an automatic thought that these statements are not literal), or CNLSes for short.
It might be useful to append the guideline "All statements have a probability of being literal that is worth considering, except in the case of CNLSes. This probability is adjusted up if the person you are speaking to is known for being extremely literal, and adjusted down if they are known for using figurative speech (though that last sentence should be fairly obvious, I throw it in for the sake of completeness)" to your thesis.
This actually got me thinking about whether there is a methodical, objective and accurate way to find out whether someone's statement is literal, perhaps by measuring their posture or tone of voice. The only difficulty is weaselling some quantifiable data out of context. If it can be done, it would be a great resource for people everywhere who have trouble understanding the non-literal meanings of statements.
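As a rough illustration of how such "adjust up, adjust down" reasoning could be made quantitative, here is a sketch using Bayesian odds updating. The prior, the cues, and every likelihood ratio below are invented for illustration; real values would need empirical calibration:

```python
# A hedged sketch: updating P(statement is literal) from observable cues
# via odds-form Bayes. All numbers are assumptions, not measurements.

def update_odds(prior_prob, likelihood_ratios):
    """Multiply prior odds by one likelihood ratio per observed cue,
    then convert back to a probability."""
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Hypothetical cues (pure assumptions): flat tone of voice, a speaker
# known to be extremely literal, and a statement that is a CNLS.
p = update_odds(
    prior_prob=0.5,
    likelihood_ratios=[
        2.0,   # flat tone: weak evidence for literalness
        5.0,   # speaker known to be very literal
        0.05,  # statement is a cached non-literal statement (CNLS)
    ],
)
# p ≈ 0.33: the CNLS cue outweighs the literal-speaker cues here.
```

The structure is the point, not the numbers: each cue multiplies the odds independently, so a strong CNLS cue can override even a speaker who is known to be very literal.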
Thank you for the reminder; I'm done with procrastinating for today.
No problem, and I hope this post taught you how to work and learn better. If you have problems with procrastination, you can try programs like Beeminder, or simply have a friend act as a watcher to ensure you get your work (or your three new things) done for the day, week or month.
Learn Three Things Every Day
In the Game of Thrones series, there is an ongoing side plot in which a character is trained by a secretive organization to become an assassin. As part of her training, one of the senior assassins demands that she report three new things she has learnt every day. Making a natural inference from the title of this article, you might assume that I am going to suggest you do the same. I am, but with a crucial difference.
You see, my standards are higher than the Faceless Men's. Instead of filling up your list of learnt things with marginally useful items like gossip and other trivia, I am going to take it up a notch and demand that you learn three USEFUL things a day. This is, of course, an entirely self-enforced challenge, and I'll let you decide on the definition of useful. Personally, I use the condition [>50% probability that X will enrich my life in a significant way], but if you want, you can make up your own criteria for "useful".
This may seem trite or useless, or even obvious (if you're an eager and fast learner, like most LWers). Now stop and think hard. In the past 30 days, have you had a day or two where you just slacked off and didn't learn much? Maybe it was New Year's Day, or your birthday, and instead of learning you spent the whole day partying. Perhaps it was just a lazy Sunday and you couldn't be bothered to learn anything, and instead spent the day playing video games or mountain skiing (although there are useful things to be learnt from those, too) or whatever you like to do in your spare time.
I haven't taken an official survey, but my belief (and do correct me if I am very wrong about this) is that on average there is at least one day in thirty on which you did not learn three new, useful things. I would consider that day pretty much wasted from a truth-seeker's point of view. You did not move forward in your quest for knowledge, you did not sharpen your rationality skills (and they always need sharpening, no matter how good you are), and you did not become mentally stronger. That's 12 days in a year, which is more than enough for the average LWer to pick up at least one new skill: say, learning about game theory, to pick a random example. In that year, you had a chance to gain a knowledge of game theory, and you threw it away.
The point of this exercise is not to make you sweat and do a "mental workout" every day. The point is to prevent days that are wasted. There is a nearly infinite amount of knowledge to collect, and we do not have nearly infinite time. Maybe it's just my Asian mentality speaking here, but every second counts and you are in effect racing against time to gain as much knowledge as possible and put it to good use before you die.
When doing this, you are not allowed to merely work on your projects, unless they also teach you something. If you are a non-programmer, and you begin learning Python, that's a new thing. If you're already fluent in Python, and you program in Python, that's not counted. With one exception: if you learn something through programming (maybe you thought up a nifty new way to sanitize user inputs while working on a database), then that counts. If you're a writer, and you write, that doesn't count. Unless, of course, by writing you learn things about worldbuilding, or plot development, or character development, that you didn't know before. Yes, this counts, even though it's not directly rationality-related, because it enriches your life: it helps you achieve your writing goals (that's also a good condition for usefulness, and a good example of instrumental rationality).
Today, I've learnt about the concept of centered worlds, about the policy of indifference between similar worlds, and about the technique of "super-rationality" as a means to predict the behavior of other agents in acausal trade. What have you learnt today?
Do it now. Don't wait, or you will waste this day, which is 86400 countable seconds in which to learn things. In fact, I've given you a head start today, because you can count this article in your list of learnt things.
Good luck to you. Let's learn together.
[This is my first post on LW and I hope that I taught you something interesting and useful. Again, I'm new to posting, so if I violated some unspoken rule of etiquette, or if you think this post is obvious and shitty, feel free to vote me down. But do leave a comment explaining why you did, so I can add it to my list of learnt things.]
A cult usually instills certain values in its members. When members hold those values, they come into conflict with people in their lives who don't share them. The cult member is then encouraged to cut off those relationships, because they hold the cult member back.
The social default is that you don't cut off relations with members of your family, even if you don't draw value from those relationships. Groups that encourage members to cut off their family connections then get seen by other family members as cults.
Members of a cult see other members of the cult as high-value relationships and relationships with outsiders as low-value. That leads to groupthink and to not being grounded in society as a whole.
A rationalist who cuts off relationships with everyone he doesn't consider a rationalist falls under this case, if you take the outside view.
with the implication that this is a generally sensible thing to do.
It seems to me that it is.
That ignores the point I made. From the outside view, it's cultish behavior to cut off certain relationships just because they provide you no value. From the inside view, you can always find reasons.
I don't want to say that you should never cut off relationships, but being reluctant to do so, or to recommend that others do so, is good.
I'll just point out that I actively cut off relationships with people of no value before I read this. Therefore, your claim that non-cultists don't cut off relations with zero-value people is incorrect in at least one case, and possibly more; and since that claim is the core of your argument, the argument fails with it.
I don't think people have a right to lie to other people. I also can't understand why you would regret breaking up with someone so truth-averse and horrible.