Comments

samath · 1y · 63

Is this an accurate and helpful summary in layman's terms?

  • Training against an undesired behavior such as deception with a straightforward penalization approach is like giving the AI an instinctive aversion to it.
  • Such undesired behaviors would be useful for solving some of the problems AIs will be asked to solve.
  • If an AI is smart enough, it will be able to translate some such problem to another domain where it lacks the instinct against deception, solve the problem there, and translate it back.
  • Once the AI notices this trick, it can overcome these aversions any time it wants.
samath · 8y · 50

Thanks for all the work! Can you turn those text lists into tables? It's hard to compare at a glance when the numbers aren't lined up.

samath · 8y · 10

Here's the relevant (if not directly analogous) Calvin and Hobbes story.

(The arc continues through the non-Sunday comics until February 1st, 1990.)

samath · 8y · 50

Thanks for reaching out! As a LW lurker, I felt a bit of unease when I first heard about Intentional Insights and read the one HuffPo article, and it's taken me a while to discern where that unease came from.

One natural interpretation is that I'm just not as comfortable with emotional appeals, and since those are expressly what you aim to provide, your content is going to rub someone like me the wrong way. If that's the case, InIn's community will just be a different subset of the population, probably bigger, as you hope, and we should accept that.

A more concerning interpretation would assert that the style of LW content is also a big part of its identity, encouraging for instance long and deep reflection and high-quality discussion about difficult topics rather than instant-gratification social media responses. My own mindset while reading LW or SSC (or NYT, for that matter) is often very different from when I'm browsing Facebook or YouTube or ClickHole, and in the former, I feel that I'm more likely to take action based on what I've read.

Still, it's an interesting experiment to see whether the content of LW can be ported over to the BuzzFeed model without too much loss, and I'm glad someone's trying it. BTW, I don't have any contacts there or anything, but in the online Christian world, I would recommend trying RELEVANT Magazine as another place to publish.

samath · 9y · 40

Hmmm, I think a better word than "fantasy" here is "dystopia." Robertson is painting a bleak picture of a world without moral authority, much like the (much longer) bleak depiction in Fahrenheit 451 of a world without intellectual freedom. Again, the natural reaction to reading Fahrenheit 451 or hearing Robertson isn't gleeful cackling, but shocked horror: "Something ain't right."

samath · 9y · 130

Sorry, but I'm guessing you don't spend much time around religious conservatives like Robertson. It's actually quite common for them to reason philosophically like this, mainly due to the emphasis on Christian apologetics. I'm sure Robertson has come across an argument of this form before and simply reworked it here.

Let me offer some more evidence. Listening to a recording of it, you can hear some chuckles from the audience at the beginning, but the room grows silent by the end as people grow more disgusted. The natural reaction, right in his last line, is, "Yes, something isn't right about this. Atheists do not deserve to be raped, murdered and castrated. The world would be quite chilling if we didn't have the moral authority to declare that some things are right and some things are wrong."

That's the complete opposite of the conclusion "Yes, atheists deserve to be tortured for believing there's no right and wrong." I honestly don't see how you think that could be the conclusion he wants you to reach. You don't promote the Holocaust by talking about how much pain the Jews would suffer in concentration camps. You use weasel words like "the final solution to the Jewish problem." Robertson is doing the exact opposite.

samath · 9y · 00

What I meant is that you could easily define your ethics to include "murder is bad" by definition, and it would satisfy all of the other criteria (assuming you could coherently define murder). But if I imagine telling Robertson (or someone similar) that, they'd ask how I came up with that rule and why someone else couldn't just come up with the opposite rule, "murder is good," so that it was just an arbitrary choice on my part.

samath · 9y · 70

As someone who has spent a lot of time with religious conservatives, I've heard the sort of argument given by Robertson many times before. And they use it as an actual argument against nihilism, which they tend to think follows directly from atheism. So Scott is completely right to address it as such.

I think Robertson conflates the two because he (and others like him) can't really imagine a coherent non-arbitrary atheist moral realist theory. Can anyone here give a good example of one that couldn't include what the murderer he depicts seems to believe?

samath · 11y · 30

In an article extolling the use of complicated, modern statistics in baseball, and in particular one called "WAR" (wins above replacement):

I'm not a mathematician and I'm not a scientist. I'm a guy who tries to understand baseball with common sense. In this era, that means embracing advanced metrics that I don't really understand. That should make me a little uncomfortable, and it does. WAR is a crisscrossed mess of routes leading toward something that, basically, I have to take on faith.

And faith is irrational and anti-intellectual, right? Faith is for rain dances and sun gods, for spirituality but not science. Actually, no. Faith is how we organize a complicated modern world. Faith is what you have when your doctor walks in with a syringe filled with something that could be anything and tells you that it'll keep you from getting the measles. Unless you're a doctor or a medical scientist, you don't really understand vaccines, and you certainly can't brew one up at home. You have outsourced the intellectual side of your health to people who, your faith reassures you, are smarter than you. Maybe in one way of looking at it you're not as smart as your great-great-great-grandparents were, because they had to take responsibility for cooking their own medicine. But you'll live longer. The complicated nature of WAR, your inability to touch the guts of it, isn't an argument against it. That's just what human advancement looks like in the 21st century. And if you can accept that you can walk into a tube built out of 100 tons of aluminum, fly seven miles off the ground and land safely thousands of miles away, you can accept WAR.