All of grouchymusicologist's Comments + Replies

Well, exactly. That's what I meant when I said that it was very confusing to me, as a young grad student in an outside field, to have a course that assigned Peirce and Lacan side by side with a straight face, evidently taking them equally seriously.

There may or may not be some legitimate field of inquiry going under the name of semiotics. In grad school a number of years ago, however, I took a (graduate-level) Introduction to Semiotics that was a pretty remarkable hodgepodge of bullshit, along with just enough non-bullshit to make a complete outsider like myself (not at all fluent in the obscurantist discourse of "cultural studies," "critical theory," and the like) feel like maybe the problem was me and not the material. (Later reflection gave me a lot more confidence that the pro... (read more)

4SanguineEmpiricist
Just to be clear, when reading any of Charles Sanders Peirce I have never gotten a hint of "Charlatanism". Including Peirce among those names amounts to blasphemy.
0[anonymous]
Population size != sample size. If you took a course on the subject, then your n is substantially greater than 1. At a minimum you have the professor's opinion plus the opinion of each self-described semiotician you read from in the course to draw from in reaching your conclusion.

Phil, you've probably seen this already, but a bunch of proposals for alternative notation systems are collected here. Some of them are basically exactly what you would prefer to be reading. It would be really cool if someone wrote a Lilypond package that could output in some of these systems. (Maybe someone has, I don't know.)

Very true. Staff notation essentially says "Here are the pitches and rhythms, now it's your job to figure out how to make them happen on your instrument." As you point out, a very real alternative to staff notation exists in tablature, which (in general) is any notation system that instead says "Here's what you need to do physically on your instrument. Follow these instructions and the notes will automatically be the right ones—you don't need to worry about what they 'are'."

Tablatures are surprisingly old, apparently going back 700 year... (read more)

I didn't have anything really radical in mind. I think it's pretty clear that there's a long-term trend toward high-level music-making relying on notation to a decreasing extent. I have a number of friends who are professional composers, and some of them use notation to write for instruments, while others use electronics and largely don't use notation at all. (The latter group, who compose for video games, movies, etc., are the ones who actually make money at it, so I'm by no means just talking about avant-garde electronic music.) A lot of commercial compo... (read more)

2komponisto
That I find more believable; but specialization is probably the wave of the future in general. I'm much more bothered by the prospect of interesting things dying out completely than that of their being "restricted" to a (possibly vibrant and vigorous) subculture. (These days I tend to think that most of "real" life takes place in subcultures or smallish communities -- maybe even cults! -- anyway.) I don't myself have enough data to confirm or deny this (I'm not a specialist in such topics either), but one should make sure to take into account the rest of the world: I have the impression, for example, that the Western art music tradition is currently in ascendance in China. (I also suspect in general that people's impressions of what past populations were like are biased toward reflecting the elites of past populations, about which information tends to be more readily and reliably transmitted, which they then compare to a more general cross-section of the current population visible to them.)

Good post and I'll chime in if you don't mind. I teach this stuff for a living and even highly skilled musicians struggle with it in various ways (myself emphatically included).

The main thing I want to say is that there's a reason why essentially all music education consists of many years of rote learning. Obviously, that rote learning works better if it's guided in appropriate directions, but I really don't know of any alternative to what you describe when you say "an orders-of-magnitude-less-efficient mechanism for memorizing note-to-note mappings f... (read more)

4[anonymous]
I know very little about this, but I have noticed that guitar tabs, ocarina fingering guides, and mouth harp tabs with simple numbers exist.
4komponisto
This seems like a radical claim. Can you clarify or elaborate? I certainly don't plan on stopping using notation any time soon. Indeed, this sort of statement seems to imply that composition as we most typically understand it (where a "composer" creates a "work" which nonidentical performances may be understood to be realizations "of", to possibly varying degrees of "accuracy") will stop, which seems highly unlikely to me. (I realize you only stated it as a comparative -- that this is more likely than some other unlikely thing -- but the "underway for decades now" comment suggests you take this as a serious possibility.) I actually like musical notation, and wish that its expressive possibilities were exploited (even) more. (However, I'm with you on interval nomenclature.)

Yes, it should be clarified. The main ambiguity that I was reacting to is that "art" can mean specifically visual arts or it can mean "the arts," extending to performing and literary arts. As it is, I'm not sure if my profession (scholarship concerning music) is "art" or "other."

In fact (now addressing Yvain again), why is this category called "Profession" instead of "field"? It creates some odd overlap with the previous category of "Work status" which produces a little bit of confusion per my original suggestion and fubarobfusco's reply.

On "Profession," the field label "Art" is vague. Better would be "Arts and humanities."

5fubarobfusco
"Humanities" includes philosophy, language, and religion; and sometimes history and law, too! I think what is meant in the survey is specifically the creation of art, e.g. design, sculpture, music, theater, fiction, and so on. It should be clarified, though.

I used to hear something similar in debates over gay marriage:

Gay person: "I only want to have the same right as a straight person: the right to marry the person I love."

Gay marriage opponent: "No no, you already have the same right as a straight person: the right to marry a person of the opposite sex. If you also want the right to marry a person of the same sex, you're asking for extra rights, special privileges just because you're gay. And that simply wouldn't be fair."

Edit: bramflakes beat me to it.

4Viliam_Bur
Generally: "Other people are also allowed to live in a society optimized for my utility function, so where is the problem?"
7Said Achmiz
Gay person: "No no, I want everyone to have the right to marry a person of the same sex, even straight people! Equal rights for all — that's perfectly fair."
-4Gunnar_Zarncke
The lesson for you: Write shorter.

Right. But, when exposed to it, some are drawn in and some run as fast as possible in the opposite direction. The point of the example was that there's a surprisingly large amount of individual variation on what kinds of fundamental sounds and timbres people find most pleasing, and (I cautiously suggest) that appears to be the most innate and least malleable or learnable aspect of a person's response to various kinds of music.

Just a couple of thoughts about this. First, as far as anyone can tell music enjoyment is a remarkably multifaceted phenomenon (and "music" itself is a term that describes a pretty giant range of human behaviors). There's no single reason, or even manageably short list of reasons, why people like it. It seems to be wrapped up in many different physical, neurological, cognitive, emotional, social, and cultural systems, any of which (in any combinations) could be responsible for a certain person's reaction to a certain kind of music. Some of the as... (read more)

0ShardPhoenix
I don't think many people are born enjoying noise music - I imagine they mostly ease into it via other genres.
6MrMind
The same thing happens to me in reverse: I find industrial music (pop or metal) quite pleasing, but the whole point of industrial is to add factory noise (for example, the sounds typical of a sawmill) to otherwise plain music, so I can at least understand why, as a genre, it doesn't have a wide community of supporters.

This is strictly pop-science writing, but there was an interesting piece in the NYT Magazine a couple of years ago about ketosis as a treatment for pediatric epilepsy, where apparently it's extremely effective at controlling seizures in a significant fraction of patients.

2Lumifer
Ketosis is the traditional treatment for pediatric epilepsy, going back to the 1920s or so. The diet is based on eating, erm, drinking heavy cream -- lots and lots of it and little else. It helps with the epilepsy and doesn't seem to have major effects on health otherwise (which confuses/annoys a great many dieticians). The main problem with the diet is its monotony and getting the kids to stick with it.

I don't think I understand at all what these descriptions of confidence levels are supposed to mean. Do they refer to your confidence in specific pieces of information about the people in the descriptions? Information you heard from those people? What scenario does the business about email addresses envision?

EDIT: Apologies, I now see the parenthetical "(being applied to identity verification, where possible)," which I managed to completely overlook on a first reading. Please ignore the above criticism, but you still might want to make the deciban descriptions more explicit.

0DataPacRat
A good point, which I'll add to the to-do list.

My pleasure, glad it seems useful.

Sounds like you have some good, concrete ideas about how to proceed. Contacting professors whose work interests you, to ask about graduate study in their departments and/or labs, is certainly a necessary step.

Throughout academia, we have a rule of thumb: do not ever, ever, spend any of your own money or go into debt for a PhD. That means that any place at which you should give the slightest consideration to doing graduate work should offer you a full waiver of tuition, plus a modest income ("stipend") and health insurance, for the duration of a r... (read more)

0Michelle_Z
Thank you! This was well written and very helpful!

Thanks for this post. Whatever problems the JTB definition of knowledge may have—the most obvious one of those to LWers probably being the treatment of "knowledge" as a binary condition—the Gettier problem has always struck me as being a truly ridiculous critique, for just the reasons you put forward here.

Scott Lemieux once called this the "my-utopia-versus-your-grubby-reality asymmetry," a delightful turn of phrase which has stuck with me since I read it.

Although Lemieux was talking about something subtly different from, or possibly a subset of, what you're talking about: the practice of describing the benefits of your own preferences as if you could completely and down to the smallest detail redesign the relevant system from scratch, while insisting on subjecting your opponent's preferences to a rigorous "how do we get there from here"... (read more)

3HalMorris
I like that, but maybe it's just a bit too long to stick. It seems so automatic in so many people that I wouldn't be surprised if one day an associated neural mechanism was discovered.

"That" if you're a grammar Nazi; either one if you're a professional linguist or mere native speaker of English. :)

A big +1 to this and it echoes in many respects my advice here to a similar question. What you hit upon here that I did not do in that comment is the importance of understanding the etiology of one's new belief.

Yeah, but it might be useful to know what the person in question considers to have been the crucial aspects of their procedure, as opposed to merely ancillary aspects. This won't be foolproof but will at least have better-than-chance odds of contributing something useful to the advice.

6Vladimir_M
That depends on whether this person is motivated by a real desire to benefit you, or a desire to sound in a way that has maximum signaling value. (Note that the latter can be the case even when people honestly believe they're doing the former, unless they have a special and extraordinary degree of altruism towards you, which is typically the case only for close family and friends.)

For starters, I'd say it would be best to take advice from people whose careers and accomplishments are to some extent a matter of public record. Then you can evaluate (a) whether they seem to have actually accomplished the things they're trying to teach you to accomplish, and (b) whether they seem to have accomplished those things via the procedure they're encouraging you to follow. If yes to both, then you might proceed further.

In that case, the problem of making good advice seem too easy might come down to a couple of things. First, you want to see a go... (read more)

0DanArmak
Advice about what to do if you have a brilliant idea for a business is still valuable for some people.
7Grognor
The expert-at vs. expert-on distinction severely weakens this meta-advice. See also Unteachable Excellence.
5Vladimir_M
If you are privy to the information necessary to evaluate (b), you can just look at it directly and skip listening to the advice altogether.

Glad you're doing this and sorry that I do not currently have time to proofread a batch of them myself.

This might be the only time ever that I can mention this without automatically sounding like an asshole, so here goes: Eliezer, whose writing is generally amazingly consistent and well-proofread with respect to style and punctuation, has the habit of using a hyphen, surrounded by single spaces, in place of a dash. He's far from alone in doing this, and it's an entirely reasonable habit to have given that the hyphen is an ASCII character but dashes aren't.... (read more)

1shokwave
Hmm. I wonder if markdown should convert hyphens surrounded by spaces to dashes.
5wedrifid
Absolutely. The " - " issue occurs in all of Eliezer's work (and to be fair most work by folks working with an ASCII-like medium). We've recently prepared a manuscript for the first 17 chapters of Methods of Rationality to be sent off to be published as hard copy. In that source I performed the substitution :s/ - /---/g across the board. I'll have to be a little more careful with some of the sequence conversion---that particular character combination also occurs in equations that we'll be manually converting to the LaTeX equation environment.
5Viliam_Bur
Bugs like this could be found in LaTeX source using regular expressions. If a thing like this happens, it probably happens more than once in a long text. So when humans find a first example, a computer could detect all remaining examples of the same pattern. (I don't recommend automatic fixing of these bugs, only reporting them.)
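A minimal sketch of what such a report-only checker might look like, in Python purely for illustration (the script name, the environment list, and the choice to skip display-math environments are assumptions of mine, and inline $...$ math is not handled at all):

```python
import re
import sys

# Report (never fix) occurrences of a hyphen used as a dash in LaTeX source.
# Lines inside common display-math environments are skipped, since " - " there
# is usually a genuine minus sign. Inline $...$ math is not handled.
MATH_ENVS = ("equation", "align", "eqnarray", "displaymath")


def report_hyphen_dashes(path):
    in_math = False
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            if any(re.search(r"\\begin\{%s\*?\}" % env, line) for env in MATH_ENVS):
                in_math = True
            if not in_math and " - " in line:
                print(f"{path}:{lineno}: possible hyphen-as-dash: {line.rstrip()}")
            if any(re.search(r"\\end\{%s\*?\}" % env, line) for env in MATH_ENVS):
                in_math = False


if __name__ == "__main__":
    for p in sys.argv[1:]:
        report_hyphen_dashes(p)
```

Reporting file and line numbers rather than rewriting anything keeps a human in the loop, which is the point of the suggestion above.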

If you want to be a professional biologist (or any professional scientist) you will probably need to get one or more graduate degrees. (There are exceptions to this, but your career possibilities will be more limited.) This complicates matters in some respects and simplifies it in others. Let me mostly focus on ways it simplifies matters.

  • Going to grad school will give you the chance to move someplace for a few years before eventually moving on to yet someplace else. This is great for a few reasons. You get a chance to see what it's like to move -- maybe
... (read more)
0Michelle_Z
Oops! Somehow I managed to forget to respond to these a year ago! Thanks for the advice! I've taken steps -- like exploring my interests in the sciences in an attempt to figure out what specifically I want to research -- and plan to figure out which professors at which colleges are doing that kind of research. Do you know the general requirements to get that kind of funding? I'm certain I'll need it. I've researched it and have found varying and sometimes contradictory information.

Well, at the risk of explaining my joke, I only meant to suggest that the opening of the chapter makes it sound like Beck thinks Beethoven's Fifth would have been "famous" and instantly recognizable to Englishmen in 1678. Maybe I should charitably assume that Beck originally had it as "the latest church anthem by Purcell" but his editors made him change it.

8lukeprog
The wording seems to be ambiguous as to whether it's saying Beethoven's Fifth is "famous" to the readers or "famous" to people in 1678.

He should have brought Archimedes's Chronophone with him instead!

(ugh, I'm sorry for that.)

The chapter begins with a pretty delightful infelicity, since in 1678 Beethoven's Fifth Symphony was still 130 years away from its premiere. Granted, this is very specialized knowledge available only to professional musicologists like myself and I doubt Beck's publisher can afford my consulting fees.

(I can just imagine the English scientists standing around wondering why this lunatic is inflicting this cacophony on them and looking at them so expectantly.)

pretty sure beethoven's fifth would be impressive coming out of a phone whether or not you knew what it was.

I see it as being like the Chuck Berry scene in Back to the Future.

2knb
Umm, actually he doesn't say that they recognized it, just that they would be amazed that it makes music. That seems pretty plausible to me. Seems like you got mind-killed pretty hard here.

The chapter begins with a pretty delightful infelicity, since in 1678 Beethoven's Fifth Symphony was still 130 years away from its premiere.

If Siri made the journey back in time, why are you surprised that an mp3 of Beethoven's 5th made the journey? Siri was created slightly later than Beethoven's 5th.

Somewhat more amazing is that this iPhone has cellular service in the 17th century, and can make video calls to future people. It must be on Verizon.

3JoshuaZ
Or where the "scientists" would come from since that term wouldn't exist for another 150 years or so.

You know, this is one of those cases (coming out as GLBT would be another one) where we sometimes have to, in essence, parent our parents. Be the patient grownup while they have their temper tantrum, and after they calm down be willing to forgive the hurtful, ridiculous things they said. I think it's more than reasonable to say you'll only talk to her about this when she can be at least calm about it. Encourage her to ask you questions and answer them honestly. Reassure her that nothing about your relationship with her has changed -- she has no need to fee... (read more)

5Michelle_Z
Probably not the best time for factual correctness, true. I'll try what you suggested.

Basically, I disagree with this. A few thoughts:

(1) Even if we were all perfectly rational, it'd still take time to research the optimal answer to every question. Why shouldn't I outsource that research to people who are interested in doing it and whose basic viewpoints I trust? [Edit: RomeoStevens already made this point above.]

(2) What's the harm to you from posts on "applied rationality" topics being posted on LW? Don't read or comment on what you aren't interested in. If you prefer posts on the theory and practice of rationality itself, just ... (read more)

So, this is about developing a photographic memory for text, one paragraph at a time. Is that really something you want? Why not make an Anki flashcard out of the one thing (or more, if it's a really information-dense paragraph) you most want to remember from the text?

3falenas108
This isn't about memorizing that one paragraph. It's about training your brain to remember things after only seeing them for a split second. Of course, that's assuming it works.

Thanks for the pointer, I'll check it out.

Leo, in my line of work, is one of the most useful resources on the web. I don't think there are comparable resources for any other languages besides German, which is really too bad. (If anyone knows different, please link!)

2[anonymous]
There's Wordreference.com, which is very nice. Not quite as super special awesome as Leo, but nice.

It's a curiosity stopper in the sense that people don't worry any more about risks from AI when they assume that intelligence correlates with doing the right thing, and that superintelligence would do the right thing all the time.

Stuart is trying to answer a different question, which is "Given that we think that's probably false, what are some good examples that help people to see its falsity?"

I guess I think this is, at best, only part of your true rejection. If there were some visionary artist who wanted to create art that would get thousands of people interested in the SIAI cause, such that donations poured in and some bright mathy kids decided to help solve FAI problems, I have a feeling you'd tell that artist "Go for it, with our gratitude."

(Ahem.)

This would in no way entail converting that person into anything other than a "pure" artist. There would be no need for that person to become the kind of highly flexible SIAI r... (read more)

If both your work and your procrastination are computer-based (and isn't that a concise description of all my problems!), Beeminder plus TagTime looks like a pretty promising combination. Beeminder keeps track of personal goal-related data for you, and TagTime is a random sampling-based way of seeing how you spend your computer time. They're put out by (at least some of) the same people, and TagTime can automatically send your data to the relevant Beeminder graph.

NB: TagTime is only available in a developer version right now, which means that I haven't tri... (read more)
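To give a flavor of the random-sampling idea, here is a toy sketch, assuming Python and made-up parameter values (TagTime itself is a Perl tool and its internals differ): pings arrive with exponentially distributed gaps, so the schedule is memoryless and the fraction of pings carrying a given tag is an unbiased estimate of the fraction of time spent on that activity.

```python
import random
import time

# Toy illustration of random-interval time sampling: pings arrive as a Poisson
# process, so you can't "game" the sampler by timing your work around it, and
# tag frequencies estimate how your time is actually spent.
MEAN_GAP_MINUTES = 45  # average time between pings (assumed value)


def next_gap_seconds(mean_gap_minutes=MEAN_GAP_MINUTES):
    # Exponentially distributed gaps give a memoryless ping schedule.
    return random.expovariate(1.0 / (mean_gap_minutes * 60))


def run_sampler(log_path="pings_toy.log"):
    while True:
        time.sleep(next_gap_seconds())
        tag = input("Ping! What are you doing right now? ").strip() or "untagged"
        with open(log_path, "a", encoding="utf-8") as f:
            f.write(f"{int(time.time())}\t{tag}\n")


if __name__ == "__main__":
    run_sampler()
```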

2dreeves
Thanks for the plug for Beeminder and TagTime! They are indeed by exactly the same people, me and Bethany Soule. In case anyone missed our big pre-launch thing here on LessWrong: http://lesswrong.com/lw/7z1/antiakrasia_tool_like_stickkcom_for_data_nerds/ And, yes, TagTime+Beeminder is an amazing combination, IMHO. We'd love to get a friendlier version of TagTime out the door. There is an Android app that Bethany wrote that's friendlier than the desktop version, but I think there's a lot less value for it on a phone than on your main work computer.
2Manax
Interestingly, I wrote something very similar to TagTime a number of years ago, and am still using it. I don't do random sampling (didn't think of it at the time), but ping at 15-minute intervals. I've got shortcuts and defaults to remember the last thing I was working on, and automatic (and manual) time division when I've worked on multiple projects in the interval. Over the last year, I've gotten it to the point where it automatically fills in timesheets for me. Mine too is in Perl. Of course, this sort of thing only works as long as you're honest about what you're working on. Sometimes I'm very good about being honest when I've gone off-task, sometimes less so. But it's easy to go back through my logs and find out how much time I've wasted when I intended to be working.

That's a great song -- I hadn't heard it before and it's satire at its finest.

This is tangential, but I noticed that it's a very literal parody (especially at the beginning) of "I Have Confidence" from The Sound of Music, which (while not exactly a rationalist anthem or anything) is a song about the virtue of shutting up and doing the impossible, when you have to.

1Raemon
Huh. That makes the song even funnier.

Interesting ideas -- I can think of a few more. On the "smart = more moral" side:

  • Smart people are more likely to be able to call on Kahneman's System 2 when necessary, which correlates with utilitarian judgments (see this paper by Joshua Greene et al.). Similarly, they're more likely to have the mental resources to resist their worst impulses, if they want to resist them.
  • Note that some of your "smart = less moral" proposals concern a world in which some people are much smarter than others. If cognitive enhancement were widespread, we
... (read more)

This suggests to me that Task #1 is finding ways for people to engage with your ideas without involving a status competition between you and them.

I think this is exactly right. In other words, people who don't yet know how to leave themselves a line of retreat might, at the outset, need us to do it for them.

4fiddlemath
Having a "line of retreat" -- feeling like the world won't end if they change their mind -- is part of it. But the problem, here, is yet more general. A lot of people at my local meetup are people who I trust and like, and who accept the value of updating on evidence -- and even show respect for someone changing their minds! Even in discussions with these friends, I notice fear and loss when I lose a friendly argument. Admitting that you are wrong is a loss of status, even if only your interlocutor is watching. Every argument has aspects of status competition. I notice all this in myself, and I suspect that freely admitting when I'm wrong is one of my strongest rationalist abilities. In others, it's pretty obvious. I can watch all this happen in their faces, when admitting that they're wrong requires an active force of will. So, if we can find some setting for truth-seeking arguments without the status competition, we improve how quickly we learn from each other. If somehow we can remove the status competition from a discussion we're having with anybody, anywhere, then we remove significant barriers to communication.

Good for you for learning this material. Let me know if you want more suggestions for things to read concerning group theory and music theory.

Wow, that's very interesting. I haven't seen any use of Bayesian methods along similar lines in music theory -- that is, to try to account for otherwise opaque compositional motivations on the part of an individual composer. I look forward to reading the article more closely, thank you for passing it along.

Where Bayes is beginning to crop up more often is in explicitly computational music theory, such as corpus music research and music cognition. I have a colleague who (among other things) develops key-finding algorithms on a large corpus of tonal music, i... (read more)
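For a flavor of what a Bayesian key-finding approach can look like (a toy sketch of the general idea, not anyone's actual research system): treat the pitch classes of a passage as draws from a key-specific distribution and pick the key with the highest posterior under a uniform prior. The profile numbers below are approximately the Krumhansl-Kessler probe-tone ratings; a real system would estimate its profiles from a corpus.

```python
import math

# Toy Bayesian key-finder over the 24 major/minor keys.
# Profiles are approximately the Krumhansl-Kessler probe-tone ratings
# (an assumption; real systems fit profiles from corpus data).
MAJOR_PROFILE = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR_PROFILE = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
PC_NAMES = ["C", "C#", "D", "Eb", "E", "F", "F#", "G", "Ab", "A", "Bb", "B"]


def normalize(profile):
    total = sum(profile)
    return [x / total for x in profile]


def log_likelihood(pc_counts, profile):
    # Multinomial log-likelihood of the observed pitch-class counts.
    return sum(n * math.log(p) for n, p in zip(pc_counts, profile))


def find_key(pc_counts, prior=1.0 / 24):
    best = None
    for tonic in range(12):
        for mode, profile in (("major", MAJOR_PROFILE), ("minor", MINOR_PROFILE)):
            # Rotate the profile so that index 0 corresponds to pitch class C.
            rotated = profile[-tonic:] + profile[:-tonic]
            score = math.log(prior) + log_likelihood(pc_counts, normalize(rotated))
            if best is None or score > best[0]:
                best = (score, f"{PC_NAMES[tonic]} {mode}")
    return best[1]


# Example: pitch-class counts for a passage emphasizing C, E, G and the C-major scale.
counts = [10, 0, 4, 0, 8, 5, 0, 9, 0, 4, 0, 3]
print(find_key(counts))  # expected to print "C major"
```

With a uniform prior the posterior ranking is just the likelihood ranking; the Bayesian machinery starts to earn its keep once you put informative priors over keys or model modulation over the course of a piece.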

I think how important these criticisms are depends on who the intended audience of the essay is -- which Gwern doesn't really make clear. If it's basically for SIAI's internal research use (as you might think, since they paid for it), tone probably hardly matters at all. The same is largely the case if the intended audience is LW users -- our preference for accessibly, informally written scholarly essays is revealed by our being LW readers. If it's meant as a more outward-facing thing, and meant to impress academics who aren't familiar with SIAI or LW and ... (read more)

That's it, thanks. I should have known it was on Less Wrong!

I wish I could remember where I read this (or even in what academic field). But some academic once wrote that his most acclaimed, most cited papers were always the ones he thought of as mere summaries of existing knowledge. This made a strong impression on me. In most cases when dealing with high-level ideas, very good restatements of previous research are not only valuable, but likely to make those ideas click for some non-trivial number of readers. A few other thoughts:

... (read more)
5Ben
I can completely believe that these papers were successful (as measured by citations, for example), but that does not necessarily mean they were the most useful papers or that people got the most out of them. In a typical paper, somewhere in the introduction it will be necessary to make some basic "establishing the field" statements. Academics want to support these statements with references. A reference that says some basic thing, in plain words with very little technical hedging, is much easier to find and cite than a series of more targeted and precise points that add up to the same thing. At least in my field, the papers that get the most citations are exactly these introduction-citation ones. A good example of an arguably "obvious" result making big waves is DiVincenzo's criteria: https://en.wikipedia.org/wiki/DiVincenzo%27s_criteria. I don't think that many people really would have questioned that a "useful quantum computer" needed to be able to [I have changed the order of the five criteria] (2) write data, (4) do logic gates, and (5) read data, while also being (1) big enough to be useful and (3) quantum. It's not a million miles from a tautology, with (1) and (3) translating to "useful" and "quantum" and (2, 4, 5) being prerequisites for a thing to be called a computer. But if I am writing a paper introduction, saying "X satisfies the DiVincenzo criteria [1]" sounds so much cooler and more considered than "X is possibly a good platform for quantum computing." [I am sure that spelling out the criteria was useful, but I suspect that the level of attention, as measured by citations, is probably outsized relative to the usefulness.]
7[anonymous]
Mathematician Gian-Carlo Rota also made a similar comment in his 10 Lessons I Wish I Had Been Taught, giving some examples of mathematicians better known for their expository work.
4lukeprog
Darn, I wish I'd come up with that line myself! :)

Perhaps you read it here: Explainers Shoot High. Aim Low!:

A few years ago, an eminent scientist once told me how he'd written an explanation of his field aimed at a much lower technical level than usual. He had thought it would be useful to academics outside the field, or even reporters. This ended up being one of his most popular papers within his field, cited more often than anything else he'd written.

Addendum: With his gracious permission: The eminent scientist was Ralph Merkle.

Carrier's book may be seen as the first salvo in that attack, but this makes me wish his case had not been presented in the context of such a parochial and disreputable sub-field of history as Jesus Studies.

Boy, do I ever agree with this. I would love to be able to cite Carrier's work (edit: that is, his methodological program) without appearing to take on the baggage of interest in an area that is simultaneously irrelevant and mindkilling -- that is, in which having opinions might be taken as chiefly an indication of tribalism.

Certainly, to really get ... (read more)

6komponisto
By the way, I came across an example of Bayesian methods being employed in a music theory paper. Since you're probably more familiar with the recent theoretical literature than I am, do you perhaps know whether this is a trend? (I think Bayes does belong in music theory, but the way I have in mind is a bit different from what we see in that paper.)

I have a grudge against this topic for two reasons. One is that this is how I discovered that the tribal signaling aspect of belief doesn't follow the expected conjunction rules. Among secular Jews,

  • Professing that "Moses probably didn't exist" gets a "meh."
  • Professing that "Jesus probably existed" gets a "meh."
  • Professing that "Moses probably didn't exist and Jesus probably did" gets a "You goddamn self-hating Jew! You can't prove anything!"

The other is that people trying to write the phra... (read more)

That quote is kind of awesomely terrible. Sure, as everyone knows, all fields of human endeavor have exactly the same kind of purpose!

Ok, it's undoubtedly true that de Botton and I share a good many values. But I do insist that his current project strikes me as incredibly misguided if not outright stupid. I would expect him to be quite resistant to an SIAI-like program of answers to the kinds of "philosophical" questions he's asking. He seems to believe that religious leaders, despite basing their teachings on their totally groundless factual claims about reality, are important moral teachers who must be taken with utmost seriousness. And he believes that (for example) Richard ... (read more)

3[anonymous]
.

It's a nice quote, and correct as far as it goes. "We raise these questions not in order to provide definitive answers, but in order to stimulate questioning" is an annoying trope. However, a few thoughts:

  • There may be some value in finding definitive answers offputting. Namely, if one values definitive answers too highly, one may be excessively compelled to prematurely proclaim one's answers definitive! But this isn't to say that definitive answers would not be desirable when they can be achieved.
  • I doubt the attitude he describes is as prevale
... (read more)
3bogus
Alain de Botton is quite possibly correct that "religious institutions and rituals" supporting an ethical system could exist without involving any theistic cosmology or similar doctrines. Confucian 'religion' is a case in point. Yes, Confucianism evolved from Chinese ancient religion, but it developed independently over many centuries as a non-theistic system. The same process could occur with modern Western morality, which historically evolved from Protestant millennialist Christianity.
[anonymous]100

I am very, very wary of wading into anything approaching a debate with you, given my respect for you. But I feel that this comment assumes an unrealistic picture of how time/money tradeoffs work in most people's lives. Most of us do not have direct ways we can translate a couple of spare minutes into the corresponding amount of money, and even if we did, we aren't perfect utilitarians who always make as much money as we possibly can and then donate every remaining penny to the most efficient possible charity. If anyone is that kind of person, they should i... (read more)

-5A1987dM

Thus the disclaimers.

then you should probably consider that "free" time

However, do note that the generalization of that argument would allow a vast number of posts asking for the use of "free time" on various less effective causes. Other matching grants (this one is expiring now, but there will normally be something in the same ballpark), petitions, tasks on Mechanical Turk, and so forth could be summoned.

If there were 100 such posts each month responding to them would clearly on average be a drain on your other activities, requi... (read more)

I've done this, and it is as easy as atorm says it is. I've also been the beneficiary of stem cell donation: my mother is currently alive and has a normal life expectancy after receiving a transplant that cured her of leukemia. She would otherwise have died within months of her diagnosis. Some years after her transplant, she was able to correspond with her stem cell donor, who told her that the donation was as simple as going to his local hospital and having his blood drawn. (These days, an agonizing, old-fashioned bone marrow transplant is rarely if ever ... (read more)
