Less Wrong is a community blog devoted to refining the art of human rationality.

In response to Nonperson Predicates
Comment author: John_Mlynarski 27 April 2017 01:36:50AM *  1 point [-]

"Is a human mind the simplest possible mind that can be sentient?" Of course not. Plenty of creatures with simpler minds are plainly sentient. If a tiger suddenly leaps out at you, you don't operate on the assumption that the tiger lacks awareness; you assume that the tiger is aware of you. Nor do you think "This tiger may behave as if it has subjective experiences, but that doesn't mean that it actually possesses internal mental states meaningfully analogous to wwhhaaaa CRUNCH CRUNCH GULP." To borrow from one of your own earlier arguments.

If you are instead sitting comfortably in front of a keyboard and monitor with no tiger in front of you, it's easy to come up with lots of specious arguments that tigers aren't really conscious, but so what? It's also easy to come up with lots of specious arguments that other humans aren't really conscious. Using such arguments as a basis for actual ethical decision-making strikes me as a bad idea, to put it mildly. What you've written here seems disturbingly similar to a solipsist considering the possibility that he could, conceivably, produce an imaginary entity sophisticated enough to qualify as having a mind of its own. Technically, it's sort of making progress, but....

When I first read your early writing, the one thing that threw me was an assertion that "Animals are the moral equivalent of rocks." At least, I hope that I'm not falsely attributing that to you; I can't track down the source, so I apologize if I'm making a mistake. But my recollection is of its standing out from your otherwise highly persuasive arguments as such blatant unsupported personal prejudice. No evidence was given in favor of this idea, and it was followed by a parenthetical that clearly indicated that it was just wishful thinking; it really only made any sense in light of a different assertion that spotting glaring holes in other people's arguments isn't really indicative of any sort of exceptional competence except when dealing with politically and morally neutral subject matter.

Your post and comments here seem to conflate, under the label of "personhood," having moral worth and having a mind somehow closely approximating that of an adult human being. Equating these seems phenomenally morally dubious for any number of reasons; it's hard to see how it doesn't go directly against bedrock fairness, for example.

Comment author: arundelo 27 April 2017 11:20:38PM 0 points [-]

Eliezer probably means "sapient":

"Sentience is commonly used in science fiction and fantasy as synonymous with sapience, although the words aren't synonyms."

(Or maybe by "is sentient", he means to say, "is a person in the moral sense".)

Comment author: Thomas 18 April 2017 06:31:23AM 1 point [-]
Comment author: arundelo 18 April 2017 02:57:55PM 0 points [-]

This statement has the letter “T” at the beginning; the next two letters are “h” and “i”; which are followed by “s s”; … ; the first letter is then repeated inside double quotes; …

What do the ellipses ("...") mean?

Comment author: bogus 08 April 2017 05:26:11PM *  0 points [-]

"I would actually put myself in the top 5 percentile of people at my school in terms of cognitive ability and general awareness."

That's beautiful. We need a "Dunning–Kruger quote of the month" thread for this sort of stuff!

Comment author: arundelo 08 April 2017 07:44:31PM *  4 points [-]

We need downvotes for this sort of stuff. ^

Edit: By which I mean bogus's comment, which does nothing beyond insulting lifelonglearner. Also, I'd guess quite a few commenters on this website are in the 95th percentile of (say) IQ at their school.

Comment author: Bound_up 27 February 2017 02:11:37PM 0 points [-]

I'm looking for a link I saw on SSC once, with some poetry written by a woman who took drugs every day for a year or so. Any ideas?

Comment author: arundelo 27 February 2017 03:37:13PM *  2 points [-]

This was probably Aella, who took LSD every week for ten months.

I'm not finding the poetry on a quick scan of aellagirl.com but it rings a bell with me too. It might also be on aellagirl.tumblr.com (which, be warned, has a fair number of NSFW images).

Comment author: Pimgd 15 February 2017 08:43:08PM 0 points [-]

Paper.... cups?

Comment author: arundelo 16 February 2017 01:18:40AM 2 points [-]
Comment author: bogus 14 February 2017 11:41:06PM *  0 points [-]

a participatory culture makes the notion of a skill-level hierarchy more apparent and well-defined

Not so. Fetishizing extreme 'skill', virtuosity, stardom etc. is a marker of a consumer culture, not a participatory one. One need only look at what may be the earliest comprehensively-documented instance of a mostly- or entirely-passive musical culture, namely the elite dramma per musica a.k.a. opera seria, to be sure that this is the case! (This is something that people were keenly aware of at the time - which is why opera was eventually reformed to make it more "natural".)

"composers writing for composers" is a way of describing a participatory culture (which suggests that its use as a derogatory description is pathological).

Well, to the extent that they are indeed engaging in such an activity for the sake of participation, this is certainly true. But it's quite doubtful to me that even the "anti-populist fortress" of academia (as you put it later in your comment) can escape its own sort of consumer culture-ish dynamics. In this case, of course, such dynamics involve, not so much 'compromising' for the sake of mainstream popularity, but precisely the striving for some peak of perceived 'skill' and stardom (as judged e.g. by influential critics, patrons/funders and other outsiders), even when this is to the detriment of the overall interestingness and breadth of the participatory process.[1]

If so, this may be a case where all we can do is pick our place on a deeply uncomfortable tradeoff - since neither of these processes seems to track the values we would prefer! But a combination of the refragmentation you point to later on[2] and an increase in influence from the more self-consciously "low skill" and "DIY" subcultures seems to offer the best hope for improvement. Which is paradoxical indeed if you simply assume that these "DIY" subcultures must be "mass" cultures and thus cannot possibly help further an "elite" tradition.

[1] A further issue compounding this 'Moloch' problem is that - much like J.S. Bach, who thought of the Lutheran God as being his audience - most "academic" composers nowadays seem to be making music to worship Ra. The Lutheran God very possibly does not exist, but everyone has experienced 'Ra' in some form, and I for one don't think of Ra-worship as especially worthwhile, either artistically or in a broader social sense.

[2] (I had actually been taking this process for granted; I agree that the increasingly-consolidated, capital-intensive, "mainstream" music industry is obviously not a force for increased participation!)

Comment author: arundelo 15 February 2017 03:03:48PM 0 points [-]

a participatory culture makes the notion of a skill-level hierarchy more apparent and well-defined

Not so. Fetishizing extreme 'skill', virtuosity, stardom etc. is a marker of a consumer culture, not a participatory one.

For one thing, fetishizing skill is a fairly small component of contemporary popular music culture. For another, that's different from the skill-level hierarchy komponisto is talking about. As a musician (disclosure!), I expect a musician's judgment of another musician's skill level to be more accurate and finer-grained than the judgment of a non-musician.

Comment author: Qiaochu_Yuan 22 January 2017 06:46:10PM 1 point [-]

Again, that's assuming the conclusion; what if 1 - 0.999... weren't zero, and you picked that as epsilon? You're skipping steps. It's worth writing down exactly what you think is happening more carefully.

(To be clear, I'm not claiming that you've asserted any false statements, but I think there's an important sense in which you aren't taking seriously the hypothetical world in which 1 - 0.999... isn't zero, and what that world might look like. There's something to learn from doing this, I think.)

Comment author: arundelo 22 January 2017 07:40:21PM 1 point [-]

If I may, let me agree with you in dialogue form:

Alice: 1 = 0.999...
Bob: No, they're different.
Alice: Okay, if they're different then why do you get zero if you subtract one from the other?
Bob: You don't, you get 0.000...0001.
Alice: How many zeros are there?
Bob: An infinite number of them. Then after the last zero, there's a one.

Alice is right (as far as real numbers go) but at this point in the discussion she has not yet proved her case; she needs to argue to Bob that he shouldn't use the concept "the last thing in an infinite sequence" (or that if he does use it he needs to define it more rigorously).
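
The rigorous argument Alice needs is the standard real-analysis one (this is my gloss, not something from the thread): "0.999..." names the limit of its finite truncations, and that limit is exactly 1:

```latex
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^n}
\;=\; \lim_{N\to\infty} \left( 1 - 10^{-N} \right)
\;=\; 1.
```

For any ε > 0 there is an N with 10^{-N} < ε, so the gap between 1 and the N-nines truncation falls below every positive threshold. Bob's "infinitely many zeros, then a one" never arises, because the definition only ever quantifies over finite truncations; there is no "last" position at which to put the one.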

In response to comment by Jade on Crisis of Faith
Comment author: Salemicus 07 January 2017 12:15:36PM 2 points [-]

Neither sufficient nor necessary:

  • The origins of Christianity become more mysterious, not less, if there never was a Jesus.
  • We don't need to tie ourselves to a fringe hypothesis to posit non-supernatural origins for the Gospels.
In response to comment by Salemicus on Crisis of Faith
Comment author: arundelo 07 January 2017 08:00:44PM 1 point [-]

Broadly speaking, I agree, and Jesus mythicist Richard Carrier would also agree:

[A]mateurs should not be voicing certitude in a matter still being debated by experts ([Jesus] historicity agnosticism is far more defensible and makes far more sense for amateurs on the sidelines) and [...] criticizing Christianity with a lead of "Jesus didn't even exist" is strategically ill conceived -- it's bad strategy on many levels, it only makes atheists look illogical, and (counter-intuitively) it can actually make Christians more certain of their faith.

But reading some of his stuff made me upgrade the idea that there was no historical Jesus from "almost certainly false" to "plausible". (Carrier has written a couple books on this -- Proving History: Bayes's Theorem and the Quest for the Historical Jesus and On the Historicity of Jesus: Why We Might Have Reason for Doubt -- but I haven't read those, only some stuff available on the web.)

  • Carrier:

    I think it is more likely that Jesus began in the Christian mind as a celestial being (like an archangel), believed or claimed to be revealing divine truths through revelations (and, by bending the ear of prophets in previous eras, through hidden messages planted in scripture). Christianity thus began the same way Islam and Mormonism did: by their principal apostles (Mohammed and Joseph Smith) claiming to have received visions from their religion's "actual" teacher and founder, in each case an angel (Gabriel dictated the Koran, Moroni provided the Book of Mormon).


    It would be several decades later when subsequent members of this cult, after the world had not yet ended as claimed, started allegorizing the gospel of this angelic being by placing him in earth history as a divine man, as a commentary on the gospel and its relation to society and the Christian mission. The same had already been done to other celestial gods and heroes, who were being transported into earth history all over the Greco-Roman world, a process now called Euhemerization, after the author Euhemerus, who began the trend in the 4th century B.C. by converting the celestial Zeus and Uranus into ordinary human kings and placing them in past earth history, claiming they were "later" deified (in a book ironically titled Sacred Scripture). Other gods then underwent the same transformation, from Romulus (originally the celestial deity Quirinus) to Osiris (originally the heavenly lord whom pharaohs claimed to resemble, he was eventually transformed into a historical pharaoh himself).

  • Carrier:

    [I]n Jewish cosmology, all sorts of things that exist or occur on earth also do so in heaven: fighting, writing, scrolls, temples, chairs, trees, gardens.

  • (To make the following paragraph more concise I'll omit hedge phrases like "according to Carrier". And even Carrier doesn't regard this as certain, only more likely than not.)

    The writings about Jesus that come the closest to being contemporary with his putative lifetime are Paul's seven or so authentic letters. Paul, who converted to Christianity after Jesus came to him in a vision sometime around 33 CE, never claims to have met the historical Jesus, and never unambiguously talks about Jesus as a human who lived on Earth. (E.g.: Paul talks about Jesus being crucified, but this crucifixion took place in some celestial realm, not on Earth. Paul mentions "James the Lord's brother", but this means not that James was a literal brother of Jesus of Nazareth but that James is a fellow Christian, the way a modern Christian might refer to their "brothers and sisters in Christ".)

Comment author: HungryHippo 21 December 2016 04:32:32PM 1 point [-]

Thank you for this reference!

Sharpening Your Forecasting Skills, Link

Are there any case histories of how superforecasters work, where they "show their work" as it were?

Comment author: arundelo 22 December 2016 07:25:12AM *  2 points [-]

I was a super-forecaster. I think my main advantages were 1) skill at Googling and 2) noticing that most people, when you ask them “Will [an interesting thing] happen?”, are irrationally biased toward saying yes. I also seem to be naturally foxy and well-calibrated, but not more so than lots of other people in the tournament. I did not obsess, but I tried fairly hard.


Edit: "Foxy" in this context means "knowing many small things instead of one big thing". See this pair (one, two) of Overcoming Bias posts by the late Hal Finney.

Comment author: NancyLebovitz 21 December 2016 12:28:31AM 3 points [-]

In the hopes of making things easier for me, I've been referring to centuries by their number range-- "the 1900's" rather than "the twentieth century". I've gotten one piece of feedback from someone who found it confusing, but how clear is it to most people who are reading this?

Comment author: arundelo 21 December 2016 03:11:29AM *  2 points [-]

Perfectly clear, and probably in most contexts less likely to elicit off-by-one errors. The only confusing things I can see are:

  • Someone might think you just meant the first decade of the 1900s.
  • Similarly, is "the 2000s" a century or a decade or a millennium? (This and the previous problem are solved by using e.g., "19xx", but that's probably only clear in written language.)
  • This style (it seems to me) is more common with older stuff (e.g., the 1800s and 1700s), so someone might do a double-take at "the 1900s", thinking it sounds longer ago than it is.
  • There's also the thing of how the twentieth century is, if we're being pedantic, not the years 1900 through 1999, but the years 1901 through 2000.
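
That last pedantic convention is a classic off-by-one trap. A minimal sketch of the arithmetic (the function names are my own, purely for illustration):

```python
def ordinal_century(year):
    """Ordinal century under the pedantic convention:
    the 20th century is 1901 through 2000."""
    return (year - 1) // 100 + 1

def hundreds_label(year):
    """Number-range style: 1999 -> 'the 1900s'."""
    return f"the {year // 100 * 100}s"

print(ordinal_century(1900))  # 19 -- the common off-by-one surprise
print(ordinal_century(2000))  # 20
print(hundreds_label(1999))   # the 1900s
```

Note that 1900 lands in the 19th century under the pedantic convention but in "the 1900s" under the number-range style, which is exactly the kind of mismatch the bullet points above are about.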
