In response to Einstein's Arrogance
Comment author: Doug_S. 25 September 2007 09:13:39PM 1 point [-]

I thought that, when you try to apply general relativity to a world described by quantum mechanics, you end up trying to measure curvature of surfaces that do not have a well-defined curvature, much like how the curvature (derivative) of y = |x| is undefined at x=0?
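The |x| analogy can be made concrete numerically (this sketch says nothing about GR itself, only about the analogy): the one-sided difference quotients of f(x) = |x| at x = 0 converge to different limits, so no single derivative exists there.

```python
def one_sided_slopes(f, x0, h=1e-8):
    """Return the (left, right) one-sided difference quotients of f at x0."""
    left = (f(x0) - f(x0 - h)) / h
    right = (f(x0 + h) - f(x0)) / h
    return left, right

left, right = one_sided_slopes(abs, 0.0)
print(left, right)  # -1.0 1.0 -- the one-sided limits disagree, so f'(0) is undefined
```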

I've heard several different descriptions of the "contradictions" between quantum mechanics and general relativity. One is that the mathematical functions used to define general relativity are undefined on the type of spacetime described by quantum mechanics; naively trying to apply one to the other requires you to find limits that do not exist (or something like that). Another explanation said that yes, you can create a quantum theory of gravity using a "naive" approach, but such a theory requires an infinite number of arbitrary physical constants and is therefore completely useless because 1) you can't actually measure an infinite number of physical constants and 2) if you don't measure them, the proper "choice" of constants can give you any result whatsoever, so it can't make any predictions about the actual universe.

By the way, has anyone else here had the thought that the reason quantum mechanics and general relativity are contradictory yet seem to predict reality perfectly is that "there's a bug in the code"?

Comment author: ohwilleke 12 July 2011 10:07:42PM 1 point [-]

The mathematical inconsistency between quantum mechanics and general relativity illustrates a key point. Most of the time the hypothesis set for new solutions, rather than being infinite, is null. It is often quite easy to show that every available theory is wrong. Even when we know that our theory is clearly inconsistent with reality, we keep using it until we come up with something better. Even if General Relativity had been contradicted by some experimental discovery in 1963, Einstein would still have been lauded as a scientist for finding a theory that fit more data points than the previous one.

In science, and in a lot of other contexts, simply showing that a theory could be right is much more important than establishing to any degree of statistical significance that it is right.

In response to Einstein's Arrogance
Comment author: James_Bach 25 September 2007 02:01:13PM 5 points [-]

Um, guys, there are an infinite number of possible hypotheses. Any evidence that corroborates one theory also corroborates (or fails to refute) an infinite number of alternative specifiable accounts of the world.

What evidence does is allow us to say "Whatever the truth is, it must coexist in the same universe with the true nature of this evidence I have accepted. Theory X and its infinite number of variants seems to be ruled out by this evidence (although I may have misinterpreted the theory or the nature of the evidence), whereas Theory Y and its infinite number of variants seems not yet to be ruled out."

Yeah, I realize this is a complicated way to phrase it. The reason I like to phrase it this way is to point out that Einstein did not have merely 29 "bits" of evidence, he had VAST evidence, based on an entire lifetime of neuron-level programming, that automatically focused his mind on a productive way of thinking about the universe. He was imagining and eliminating vast swaths of potential theories of the universe, as are we all, from his earliest days in the womb. This is hardly surprising, considering that humans are the result of an evolutionary process that systematically killed the creatures who couldn't map the universe sufficiently well.

We can never know if we are getting to the right hypothesis. What we can say is that we have arrived at a hypothesis that is isomorphic with the truth, as we understand that hypothesis, over the span of evidence we think we have and think we understand. Always the next bit of evidence we discover may turn what we think we knew upside down. All knowledge is defeasible.

Comment author: ohwilleke 12 July 2011 09:52:20PM 2 points [-]

There are not an infinite number of possible hypotheses in a great many sensible situations. For example, suppose the question is "who murdered Fred?", because we have already learned that he was murdered. The already-known answer, "a human alive at the time he died," makes the set finite. If we can determine when and where he died, the number of suspects can typically be reduced to dozens or hundreds. Limiting the set to people capable of carrying out the means of death may cut 90% of them.
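The suspect-narrowing argument can be sketched as a sequence of filters on a finite pool. The pass-fractions below are purely hypothetical numbers chosen for illustration; the point is only that a few pieces of evidence shrink the pool very quickly:

```python
import math

# Each piece of evidence acts as a filter on a finite suspect pool.
suspects = 100_000_000  # everyone alive in the region, say

filters = {
    "was in the city that night": 0.001,  # fraction passing each filter (hypothetical)
    "had access to the scene": 0.01,
    "capable of the means of death": 0.1,
}

for description, keep_fraction in filters.items():
    suspects = int(suspects * keep_fraction)
    bits = math.log2(1 / keep_fraction)  # information carried by this filter
    print(f"{description}: {suspects} remain (~{bits:.1f} bits)")

# Three filters take 100,000,000 suspects down to 100.
```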

To the extent that "bits" of evidence means things that we don't know yet, the number of bits can be much smaller than suggested. To the extent that "bits" of evidence includes everything we know so far, we all have trillions of bits already in our brains and the minimal number is meaningless.

In response to Einstein's Arrogance
Comment author: ohwilleke 12 July 2011 09:44:31PM 0 points [-]

Einstein didn't come up with General Relativity that way. He didn't even do the hard math himself. He came up with some little truths (e.g. equivalence, speed of light is constant, covariance, must reduce to Newtonian gravity in unexceptional cases), from a handful of results that didn't seem to fit classical theory, and then he found a set of equations that fit.

Newtonian gravity provided heaps of data points and a handful of non-fits. Einstein bootstrapped on prior achievements like Newtonian gravity and special relativity and tweaked them to fit a handful of additional data points better. His confidence came from fitting 100% of the small available data set (something that wasn't clear in the case of the cosmological constant), however small it may have been. The minimum bit hypothesis assumes that all bits are created equal. But they aren't. Some bits advance the cause not at all, some bits advance it a great deal.

Similarly, the 27-bit rule for 100,000,000 people assumes that each bit splits the remaining pool evenly between yes and no on a question. In fact, some bits are more discriminating than others. "Have you ever been elected to an office that requires a statewide vote, or been a Vice President?" (perhaps two bits of information) is going to eliminate 99.9999%+ of potential candidates for President, working nearly perfectly to dramatically narrow the field from the 100,000,000 eligible candidates. "Do you want to run for President?" cuts another 90%+ of potential candidates.
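For reference, the arithmetic behind the 27-bit figure, plus a sketch of how a lopsided question carries a different amount of information than an even split. The 500-person figure below is a made-up illustration, not real data:

```python
import math

population = 100_000_000

# ~27 ideal yes/no answers single out one person in 100,000,000,
# but only if each answer halves the remaining pool:
print(math.ceil(math.log2(population)))  # 27

# Suppose (hypothetically) ~500 of the 100,000,000 have ever won a
# statewide election or been VP. A "yes" to that question alone then
# carries log2(100,000,000 / 500) bits:
print(round(math.log2(population / 500), 1))  # 17.6
```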

Einstein was confident because his bits had greater discriminatory power than other bits of information. There are only so many ways it is logically possible to fit the data he had.

Comment author: mstevens 31 March 2011 04:04:08PM 3 points [-]

On the "spock" front, I dislike the identification of "rational" with "Inhuman". These, too, are human qualities! However I certainly agree that many people do see this negatively.

There's an interesting tension in marketing plans - how far can we go in using marketing, which is normally about exploiting irrational responses, in pushing rationality?

If people see rationalists using irrational arguments to push rationality, does it blow our credibility?

Comment author: ohwilleke 31 March 2011 09:47:23PM 3 points [-]

One doesn't have to use irrational arguments to push rationality, but one of the lessons we draw from how people make decisions is that people simply do not make decisions about how to view and understand the world, even a decision to do so rationally, in an entirely rational way. The emotional connection matters as well.

Rational ideas proffered without an emotional counterpart wither. The political landscape is full of people who advanced good, rational programs, policy ideas, or views about science that crashed and burned for long periods because the audience didn't respond.

Look at the argument of SarahC's original post itself. It isn't a philosophical proof in Boolean logic; it is a testimonial about the emotional benefits of this kind of outlook. This is perfectly valid evidence, even if it is not obtained by a "reasoning process" of deduction. In the same way, I took particular pride when my non-superstitiously raised daughter won the highest good-character award in her elementary school, because it showed that rational thinking isn't inconsistent with good moral character.

While one doesn't want to undermine one's own credibility with the approach one uses to make an argument, it is also important to defuse the false inferences in arguments opposing rationality. One of those false inferences is that rational is synonymous with amoral. Another is that rational is synonymous with emotionally vacant and unfulfilling. A third is the sense that rationality implies using individual reason alone, without the benefit of a social network and context, because that is the character of a lot of activities (e.g. math homework, tax return preparation, or logic problems) that are commonly characterized as "rational." Simple anecdote can show that these stereotypes aren't always present. Evidence from a variety of sources can show that these stereotypes are usually inapt.

When one looks at the worldview one chooses for oneself, it isn't enough to argue that rationality gives correct answers; one must establish that it gives answers in a way that allows you to feel good about how you are living your life. Without testimonials and other emotional evidence, you don't establish that there are no hidden costs being withheld from the audience for your statement.

Moreover, marketing, in the sense I am using the word, is not about "exploiting irrational responses." It is about something much more basic: using words that will convey to the intended audience the message that you actually intend to convey. Care in one's use of words, so as to avoid confusing one's audience, is quintessentially consistent with the good practice of someone seeking to apply a rational method in philosophy.

Comment author: CharlesR 31 March 2011 03:43:52AM *  3 points [-]

I think Sam Harris gets it mostly right.

I think that “atheist” is a term that we do not need, in the same way that we don’t need a word for someone who rejects astrology. We simply do not call people “non-astrologers.” All we need are words like “reason” and “evidence” and “common sense” and “bullshit” to put astrologers in their place, and so it could be with religion.

Comment author: ohwilleke 31 March 2011 09:23:57PM 0 points [-]

"Reason" and "evidence based" are both quite nice words to convey the idea.

Comment author: taryneast 31 March 2011 02:56:01PM 0 points [-]

Have you heard of The Brights movement ?

It was kind of inspired by the gay rights movement: an attempt to find a word for atheism that was more socially acceptable, i.e. without all the negative baggage, and to embrace and popularise it.

Comment author: ohwilleke 31 March 2011 09:22:38PM 1 point [-]

I have, and even started to mention it, but figured that I was going too far afield. I think the problem there is that the established meaning of "Bright" as intelligent overshadows the secondary meaning that is sought. I think "light" as a metaphor is promising, but the word "Bright" in particular is inapt.

Comment author: ohwilleke 31 March 2011 02:28:27AM 0 points [-]

FWIW, I am inclined to think that "rationality" is a bad brand identification for a good thing. Rationality conjures up "Spock" (the Star Trek character), not "Spock" (the compassionate and wise child-rearing guru). It puts an emphasis on a very inhuman part of the kind of human being you feel you are becoming.

Whatever it means in your context, as a brand to evangelize to others about its benefits, it is lacking. Better, perhaps, in the sense of offering a positive vision, than "atheism" or "secularism," but still not grounded and humane enough. I like "naturalist" better, although it is loaded with the connotation of bird watching, and also "humanist," although that term, without the modifier "secular," can mean little more than someone who gives a damn. "Enlightened" (as in the Enlightenment era) might be a good term if it weren't so damned arrogant in the modern vernacular.

The sense I think you are trying to capture is something like the sense conveyed by the title of Carl Sagan's book "Demon-Haunted World." You want to convey the joys of having exorcised the demons and opened yourself to seeing the world more clearly. But, to sell it to others, I think it is necessary to find a better marketing plan.

Comment author: ohwilleke 31 March 2011 02:16:10AM 0 points [-]

In the mental health area, the polar extreme from the pathology model is the "neurodiversity" model. The point about allowing treatment when it is available and effective, whether the treatment is an "enhancement" or a "cure," is also worthwhile.

In the area of obesity, I think we are pretty open, as a society, to letting the evidence guide us. In the area of mental health, we are probably less so, although I do think that empirical evidence about the nature of homosexuality has been decisive in driving a dramatic change in public opinion about it.

A key concept that sums up your analysis, which you call "determinist consequentialist," is the realization that you should know why you want to use a word before you define it, and that a word may have different definitions appropriate for different contexts. A definition of disease designed to draw a line limiting covered medical expenditures may find an enhancement/pathology distinction useful, while a definition of disease designed to determine whether treatment should be available to those for whom ability to pay is not the issue might not.

Comment author: mtraven 29 March 2011 04:57:18PM *  4 points [-]

A few points:

  • Philosophy is (by definition, more or less) meta to everything else. By its nature, it has to question everything, including things that here seem to be unquestionable, such as rationality and reductionism. The elevation of these into unquestionable dogma creates a somewhat cult-like environment.

  • Often people who dismiss philosophy end up going over the same ground philosophers trod hundreds or thousands of years ago. That's one reason philosophers emphasize the history of ideas so much. It's probably a mistake to think you are so smart you will avoid all the pitfalls they've already fallen into.

  • I agree with the linked post of Eliezer's that much of analytic philosophy (and AI) is mostly just slapping formal terms over unexamined everyday ideas, which is why most of it bores me to tears.

  • Continental philosophy, on the other hand, if you can manage to make sense of it, actually can provide new perspectives on the world, and in that sense is worthwhile. Don't assume that just because you can't understand it, it doesn't have anything to say. Complaining because they use what seems like an impenetrable language is about on the level of an American traveling to Europe and complaining that the people there don't speak English. That said, Sturgeon's law definitely applies, perhaps at the 99% level.

  • I'm recommending Bruno Latour to everyone these days. He's a French sociologist of science and philosopher, and if you can get past the very French style of abstraction he uses, he can be mind-blowing in the manner described above.

Comment author: ohwilleke 31 March 2011 01:56:49AM 2 points [-]

"Often people who dismiss philosophy end up going over the same ground philosophers trode hundreds or thousands of years ago."

Really? When I look at Aquinas or Plato or Aristotle, I see people mostly asking questions that we no longer care about because we have found better ways of dealing with the issues that made those questions worth thinking about.

Scholastic discourse about the Bible or angels makes much less sense when you have a historical-critical context to explain how it emerged in the way that it did, and a canon of contemporaneous secular works to make sense of what was going on in their world at the time.

Philosophical atomism is irrelevant once you've studied modern physics and chemistry.

The notion that we have Platonic a priori knowledge looks pretty silly without a great deal of massaging as we learn more about the mechanism of brain development.

Also, not all new perspectives on the world have value. Continental philosophy and post-modernism are to philosophy what mid-20th-century art music is to music composition. It is a rabbit hole that a whole generation of academics got sucked into and wasted their time on. It turned out that the future of worthwhile music was elsewhere, in people like Elvis and the Beatles and rappers and Nashville studios and Motown artists, and in resurrections of the greats of the classical and romantic periods in new contexts; the tone poems and dissonant music and other academic experiments of that era were just garbage. They lost sight of what music was for, just as the continental philosophers and post-modernist philosophers lost sight of what philosophy was for.

The language is impenetrable because they have nothing to say. I know what it is like to read academic literature, for example in the sciences or economics, that is impenetrable because it is necessarily so, but that isn't it. People who use sophisticated jargon when it is really necessary are also capable of speaking much more clearly about the essence of what is going on, people like Richard Feynman. But our modern-day philosophical sophisticates are known to no one but each other and are not adding to the larger understanding. Instead, all of the other disciplines are busy purging themselves of all that dreck so that they can get back on solid ground.

Comment author: lukeprog 30 March 2011 01:44:30AM *  7 points [-]

Richard Chappell,

Of course, you know how intuitions are generally used in mainstream philosophy, and why I think most such arguments are undermined by facts about where our intuitions come from, which undermine the epistemic usefulness of those intuitions. (So does the cross-checking problem.)

I'll break the last part into two bits:

What I'm saying with the 'people are made of atoms' bit is that it looks like a slight majority of philosophers may now think that there is at least a component of a person that is not made of atoms - usually consciousness.

As for intuitions trumping science, that was unclear. What I mean is that, in my view, philosophers still often take their intuitions to be more powerful evidence than the trends of science (e.g. reductionism) - and again I can point to this example.

I'm sure this post must have been highly annoying to a pro such as yourself, and I appreciate the cordial tone of your reply.

Comment author: ohwilleke 31 March 2011 01:40:36AM 1 point [-]

It seems to me that philosophy is most important for refining mere intuitions and bumbling around until we find a rigorous way of posing the questions associated with those intuitions. Once you have a well-posed question, any old scientist can answer it.

But philosophy is necessary to turn the undifferentiated mass of unprocessed data and potential ideas into something that is susceptible to being examined.

Rationality is all fine and good, but reason applies known facts and axioms with accepted logical relationships to reach conclusions.

The importance of hypothesis generation is much underappreciated by scientists, but critical to the enterprise, and to generate a hypothesis, one needs intuition as much as reason.

Genius, meanwhile, comes from being able to intuitively generate a hypothesis that nobody else would, breaking the mold of others' intuitions, and building new conceptual structures from which to generate novel intuitive hypotheses, eventually formulating the conceptual structure well enough that it can be turned over to the rationalists.
