Comments

Sean_C.15y00

Eliezer wasn't the first to think of this sort of thing:
http://en.wikipedia.org/wiki/A_Modest_Proposal

Sean_C.15y40

Economic Weirdtopia: There is no economy. Everyone lives a self-sufficient existence on isolated farms. Think Solaria.

Sean_C.15y10

And the mapping of attraction to those noumena is entirely subjective! Like in another Eliezer Yudkowsky essay: http://lesswrong.com/lw/tn/the_true_prisoners_dilemma/

Sean_C.15y30

One common answer to the question "What will we do in the future, when we've fixed all that is wrong with today?" is "How the hell should I know?"

For example, imagine our neolithic ancestors asking each other the same question: "What will they do in the future, when they don't have to worry about food, shelter, or even disease?" I think they could have imagined some things: "They'll make more complicated art." "They'll have more complicated sports."

But I don't think they would have imagined full-time mathematicians, or video games, or TV, or even books. "Many, many people will sit around and talk to each other while someone else gives them food and shelter in return." Unthinkable! Absurd!

So I would say that the technology of the future, insofar as it's impossible for us to imagine in detail, also makes the lifestyle of the future impossible to imagine in detail. Hence, the Singularity. Oh, and if you listen to those folks, they'll say that we'll merge with the AIs, so all of that 'automatic' engineering and whatnot will actually be performed by some sort of version of 'us'.

Sean_C.16y00

Warren Buffett uses the 'birth points' idea ("Ovarian Lottery" in his terminology) in a great thought experiment for developing his morality and ethics.

http://rationalangle.blogspot.com/2007/12/warren-buffett-and-hillary-clinton-at.html

At the end he puts a political slant on it, but I've read other instances where he puts it into larger terms.

Sean_C.16y110

I heard a funny story once (online somewhere, but this was years ago and I can't find it now). Anyway, I think it was the psychology department at Stanford. They were having an open house, and they had set up a Prisoner's Dilemma game with M&M's as the reward. People could sit at either end of a table with a cardboard screen in front of them, choose 'D' or 'C', and then have the outcome revealed and get their candy.

So this mother and daughter show up, and the grad student explains the game. Mom says to the daughter, "Okay, just push 'C', and I'll do the same, and we'll get the most M&M's. You can have some of mine after."

So the daughter pushes 'C', Mom pushes 'D', swallows all 5 M&M's, and with a full mouth says "Let that be a lesson! You can't trust anybody!"

Sean_C.16y10

The classic example is picking people for a sports team. The difference in performance between the very top superstars and everyone else is much greater than the difference among the guys who just make the team.

So the superstars are obvious choices, but there isn't a whole lot of difference between the guys who just make the team and the guys who just don't.

Sean_C.16y90

Animal trainers have this problem all the time. The animal performs behavior 'x' and gets a reward. But the animal might have been doing other subtle behaviors at the same time, and it maps the reward to 'y'. So instead of reinforcing 'x', you might be reinforcing 'y'. And if 'x' and 'y' are too close for you to tell apart, then you'll be in for a surprise when your perspective and context change and the difference becomes apparent to you. And you find out that the bird was trained to peck anything that moves, instead of just the bouncy red ball or something.

Psychologists have a formal term for this, but I can't remember it and can't find it on the internet, I'm sorry to say.

Come to think of it, industry time-and-motion people suffer from the same problem.

Sean_C.16y30

I think the word you're looking for is 'truthiness': http://en.wikipedia.org/wiki/Truthiness

Sean_C.16y60

Isaac Asimov said it well: "Never let your morals get in the way of doing the right thing."
