
Truth is holistic

9 MrMind 23 April 2015 07:26AM

You already know by now that truth is undefinable: by a famous result of Tarski, no sufficiently powerful formal system (from now on, just "system") can consistently define the truth of its own sentences.

You may however not know that Hamkins proved that truth is holistic.
Let me explain: while no system can talk about its own truth, it can nevertheless talk about the truth of its own substructures. For example, in every model of ZFC (the standard axioms of set theory) you can consistently define a model of standard arithmetic and a predicate that works as arithmetic's truth predicate. This is possible because ZFC is strictly more powerful than PA (the axioms of standard arithmetic).
Intuitively, one might think that if two different models share a substructure, they believe the same truths about it. Along this line of thought, two models of ZFC ought to believe the same things about standard arithmetic.
However, it turns out this is not the case. Two different models extending ZFC may very well agree on which entities are the standard natural numbers, and yet still disagree about which arithmetic sentences are true or false. For example, they could agree about the standard numbers and about how the successor and addition operators work, and yet disagree about multiplication (corollary 7.1 in Hamkins' paper).
This means that when you can talk consistently about the truth of a model (that is, when you are in a more powerful formal system), that truth depends not only on the substructure, but on the entire structure it's immersed in. Figuratively speaking, local truth depends on global truth. Truth is holistic.
There's more: suppose that two models agree on the ontology of some common substructure. Suppose also that they agree about the truth predicate on that structure: they could still disagree about the meta-truths. Or the meta-meta-truths, and so on, for all the ordinal levels of the definable truth predicates.
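In symbols (a standard textbook formulation, not taken from Hamkins' paper): ZFC proves that the set of codes of true arithmetic sentences

\[ \mathrm{Tr}_{\mathbb{N}} \;=\; \{\, \ulcorner \varphi \urcorner \mid (\mathbb{N}, 0, 1, +, \times) \models \varphi \,\} \]

exists, because satisfaction over a set-sized structure like \(\mathbb{N}\) is definable inside ZFC; by Tarski's theorem, no analogous definable set can collect the codes of ZFC's own true sentences.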

Here is another striking example from the same paper. There are two different extensions of set theory which agree on the structure of standard arithmetic and on the members of a subset A of the natural numbers, and yet one thinks that A is first-order definable while the other thinks it's not (theorem 10).

Not even "being a model of ZFC" is an absolute property: there are two models which agree on an initial segment of the set hierarchy, and yet one thinks that the segment is a model of ZFC while the other proves that it's not (theorem 12).

Two concluding remarks. First, what I wrote is that there are different models which disagree about the truth of standard arithmetic, not that every pair of different models has different arithmetic truths. Indeed, if two models each have access to the other's truth relation, then they are bound to have the same truths. This is what happens, for example, when you prove absoluteness results in forcing.
I'm also reminded of de Blanc's ontological crises: changing ontology can screw with your utility function. It's interesting to note that updating (that is, changing your model of reality) can change what you believe even if you don't change ontology.

Signalling with T-Shirt slogans

7 Gunnar_Zarncke 21 December 2014 11:37AM

It kind of started when I got this T-shirt as a present two years ago:

Don't Drink and Derive T-Shirt

It is not just a slogan that is quickly filtered out under the heading 'generic ad-like content'. It invites checking where the error is. It is a kind of challenge - at least for suitably minded persons, and exactly that kind of person is who I'd like to get in touch with more. This T-shirt signals: "I'm a nerd and proud of it." The positive feedback I got from it was part of the reason I chose to signal this more. Maybe you'd like to signal this too. Please remember that the T-shirt alone will not do it. You still have to talk to people. For the introverted among us (me included) I recommend active listening.

The remainder of this post lists some slogans I have tried, some I will likely try shortly and other related resources.


Truth vs Utility

1 Qwake 13 August 2014 05:45AM

According to Eliezer, there are two types of rationality. There is epistemic rationality, the process of updating your beliefs based on evidence so that they correspond to the truth (or reality) as closely as possible. And there is instrumental rationality, the process of making choices in order to maximize your future utility. These two slightly conflicting definitions work together most of the time, as obtaining the truth is the rationalist's ultimate goal and thus yields the maximum utility. But are there ever times when the truth is not in a rationalist's best interest? Are there scenarios in which a rationalist should actively try to avoid the truth to maximize their possible utility? I have been mentally struggling with these questions for a while. Let me propose a scenario to illustrate the conundrum.

 

Suppose Omega, a supercomputer, comes down to Earth to offer you a choice. Option 1 is to live in a simulated world where you have infinite utility (in this world there is no pain, suffering, or death; it's basically a perfect world) and you are unaware you are living in a simulation. Option 2 is that Omega will truthfully answer one question on absolutely any subject pertaining to our universe, with no strings attached. You can ask about the laws governing the universe, the meaning of life, the origin of time and space, whatever, and Omega will give you an absolutely truthful, knowledgeable answer.

Now, assuming all of these hypotheticals are true, which option would you pick? Which option should a perfect rationalist pick? Does the potential of asking a question whose answer could greatly improve humanity's knowledge of our universe outweigh the benefits of living in a perfect simulated world with unlimited utility?

There are probably a lot of people who would object outright to living in a simulation because it's not reality or the truth. Well, let's consider the simulation in my hypothetical conundrum for a second. It's a perfect reality with unlimited utility potential, and you are completely unaware you are in a simulation in this world. Aside from the unlimited utility part, that sounds a lot like our reality. There are no signs of our reality being a simulation, and all (most) of humanity is convinced that our reality is not a simulation. Therefore, the only difference that really matters between the simulation in Option 1 and our reality is the unlimited utility potential that Option 1 offers. If there is no evidence that a simulation is not reality, then the simulation is reality for the people inside it. That is what I believe, and that is why I would choose Option 1. The infinite utility of living in a perfect reality outweighs almost any increase in utility I could contribute to humanity.

I am very interested in which option the Less Wrong community would choose (I know Option 2 is kind of arbitrary; I just needed an option for people who wouldn't want to live in a simulation). As this is my first post, any feedback or criticism is appreciated. Also, any further information on the topic of truth vs utility would be very helpful. Feel free to downvote me to oblivion if this post was stupid, didn't make sense, etc. It was simply an idea that I found interesting that I wanted to put into writing. Thank you for reading.

Attempting to rescue logical positivism

3 RolfAndreassen 25 April 2013 06:20PM

Very brief recap: The logical positivists said "All truths are experimentally testable". Their critics responded: "If that's true, how did you experimentally test it? And if it's not true, who cares?" Which is a fair criticism. Logical positivism pretty much collapsed as a philosophical position. But it seems to me that a very slight rephrasing might have saved it: "All _beliefs_ are experimentally testable". For if the critic makes the same adjustment, asking "Is that a belief, and if so -" you can interrupt him and say, "No, that's not a belief, that's a definition of what it means to say 'I believe X'."

A definition is not true or false, it is useful or not useful. Why is this definition useful? Because it allows us to distinguish between two classes of declarative statements; the ones that are actual beliefs, and the ones that have the grammatical form of beliefs but are empty of meaningful belief-content.

It seems to me, then, that both the positivists and their critics fell into the trap of confusing 'belief' and 'truth', and that carefully making this distinction might have saved positivism from considerable undeserved mockery.

Evolutionary psychology as "the truth-killer"

10 Benedict 23 July 2012 08:44PM

So, a little background- I've just come out as an atheist to my dad, a Christian pastor, who's convinced he can "fix" my thinking and is bombarding me with a number of flimsy arguments that I'm having trouble articulating a response to, and need help shutting down. The particular issue at the moment deals with non-theistic explanations for human psychology and things like love, morality, and beauty. After attempting to communicate explanations from evolutionary psychology, I was met with amused dismissal of the subject as "speculation". 

There's one book in particular he's having me read- The Reason for God by Timothy Keller. In the book, he brings up evolutionary psychology as an alternative to theistic explanations, and immediately dismisses it as apparently self-defeating.

"Evolutionists say that if God makes sense to us, it is not because he is really there, it's only because that belief helped us survive and so we are hardwired for it. However, if we can't trust our belief-forming faculties to tell us the truth about God, why should we trust them to tell us the truth about anything, including evolutionary science? If our cognitive faculties only tell us what we need to survive, not what is true, why trust them about anything at all?" -Timothy Keller

The obvious answer is that knowing the truth about things is generally advantageous to survival- but it hardly addresses the underlying assertion- that without [incredibly specific collection of god-beliefs and assorted dogmas], human brains can't arrive at truth because they weren't designed for it. And of course, I'm talking to a guy with an especially exacting definition of "truth" (100% certainty about the territory)- I could use an LW post that succinctly discusses the role and definition of truth, there. 

Another thing Dad likes to do is back me into a corner WRT morality and moral relativism- "Oh, but can you really believe that the act of rape doesn't have an inherent [wrongness]? Are you saying it was justified for [insert historical monster] to do [atrocity] because it would make him reproductively successful?" Armed only with evolutionary explanations for their behavior, I couldn't really respond- possibly my fault, since I haven't read the Morality sequence on account of I got stuck in the Quantum Physics ultrasequence, and knowing that reality is composed of complex amplitudes flowing between explicit configurations or aaasasdjgasjdga whatever the frig even (I CAN'T) has proven to be staggeringly unhelpful in this situation.

In addition to particular arguments WRT the question posed, I could also use recommendations for good, well-argued and accessible books on the subject of evolutionary psychology, with a focus on practical experimental results and application - the guy can't be given a book and not read it, so I'm hoping to at least get him to not dismiss the science as "speculation" or a joke. It's likely he's aware that the field of evolutionary psychology is really prone to hindsight bias and thus ignores it completely, so along with the book, a good article or study demonstrating the accuracy and predictive power of the evolutionary psychological model would be appreciated.

Thanks!

Mental Clarity; or How to Read Reality Accurately

-10 Hicquodiam 12 April 2012 06:18AM

 

Hey all - I typed this out to help me understand, well... how to understand things:

 

Mental clarity is the ability to read reality accurately. 

 

I don't mean being able to look at the complete objective picture of an event, as you don't have any direct access to that. I'm talking about the ability to read the data presented by your subjective experience: thoughts, sights, sounds, etc. Once you get a clear picture of what that data is, you can then go on and use it to build or falsify your ideas about the world.


This post will focus on the "getting a clear picture" part.


I use the word "read" because it's no different from reading a book, or from reading these words. When you read a book, you are genuinely curious as to what the words are saying. You wouldn't read anything into it that's not there, which would be counterproductive to your understanding.

 

You just look at the words plainly, and through this your mind automatically recognizes and presents the patterns: the meaning of the sentences, their relation to the topic, the visual imagery associated with them, all of that. If you want to know a truth about reality, just look at it and read what's there.


Want to know what the weather's like? Look outside - read what's going on.


Want to know if the Earth revolves around the Sun, or vice versa? Look at the movement of the planets, read what they're doing, see which theory fits better.


Want to check if your beliefs about the world are correct? Take one, read the reality that the belief tries to correspond to, and see how well they compare.


This is the root of all science and all epiphanies.


But if it's so simple and obvious, why am I talking about it?


It's not something that we as a species often do. For trivial matters, sure, for science too, but not for our strongly-held opinions. Not for the beliefs and positions that shape our self-image, make us feel good/comfortable, or get us approval. Not for our political opinions, religious ideas, moral judgements, and little white lies.


If you were utterly convinced that your wife was faithful - more so, if you liked to think of her in that way - and your friend came along and said she was cheating on you, you'd be reluctant to read reality and check if that's true. Doing so would challenge your comfort and throw you into an unknown world with some potentially massive changes. It would be much more comforting to rationalize why she might still be faithful than to take one easy look at the true information. It would also be more damaging.


Delusion is reading into reality things which aren't there. Telling yourself that everything's fine when it obviously isn't, for example. It's the equivalent of looking at a book about vampires and jumping to the conclusion that it's about wizards.


Sounds insane? You do it all the time. You'll catch yourself if you're willing to read the book of your own thoughts: flowing through your head, in plain view, is a whole mess of opinions and ideas about people, places, and positions you've never even encountered. Crikey!


That mess is incredibly dangerous to have. Being a host to unchecked or false beliefs about the world is like having a faulty map of a terrain: you're bound to get lost or fall off a cliff. Reading the terrain and re-drawing the map accordingly is the only way to accurately know where you're going. Having an accurate map is the only way to achieve your goals.



So you want to develop mental clarity? Be less confused, or more successful? Have a better understanding of the world, the structure of reality, or the accuracy of your ideas? 


Just practice the accurate reading of what's going on. Surrender the content of your beliefs to the data gathered by your reading of reality. It's that simple.

 

It can also be scary, especially when it comes to challenging your "personal" beliefs. It's well worth the fear, however, as a life built on truth won't crumble like one built on fiction.

 

Truth doesn't crumble.

 

Stay true.



Further reading:


Stepvhen from Burning true on truth vs. fantasy.


Kevin from Truth Strike on why this skill is important to develop.

 

Pooling resources for valuable actuarial calculations

12 michaelcurzi 15 February 2012 05:01PM

It occurred to me this morning that, if it's actually valuable, generating true beliefs about the world must be someone's comparative advantage. If truth is instrumentally important, important people must be finding ways to pay to access it. I can think of several examples of this, but the one that caught my attention was actuarial science.

I know next to nothing about what actuaries actually do, but Wikipedia says:

"Actuaries mathematically evaluate the likelihood of events and quantify the contingent outcomes in order to minimize losses, both emotional and financial, associated with uncertain undesirable events."

Why, that sounds right up our alley. 

So what I'm wondering is: for those who can afford it, wouldn't it be worth contracting with actuaries to make important personal decisions? Not merely with regards to business, but everything else as well? My preliminary ideas include:

  • Lifestyle choices to reduce personal risk of death
  • Health and wellness decisions
  • Vehicle choice for economic and safety considerations
  • Where to send your kid to college and otherwise improve life success
Lastly, if consulting actuaries is worth doing as a wealthy individual, shouldn't it also be worth doing as a group? Couldn't we pool money to get excellent information about questions that haven't yielded answers to our research attempts?
If I am not misunderstanding the work that actuaries do, there may indeed be low-hanging fruit here. 
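As a toy illustration of the kind of calculation this would buy (a minimal sketch; all probabilities and dollar amounts below are hypothetical placeholders, and real actuarial models are far more sophisticated than weighting each outcome's loss by its probability):

```python
# Toy expected-loss comparison between two options.
# Every number here is a made-up placeholder, not real actuarial data.

def expected_loss(outcomes):
    """outcomes: iterable of (probability, loss) pairs; returns the expected loss."""
    return sum(p * loss for p, loss in outcomes)

# Hypothetical yearly risks for two vehicle choices:
# (probability of event, associated loss in dollars).
vehicle_a = [(0.010, 20_000), (0.001, 500_000)]   # minor crash, severe crash
vehicle_b = [(0.015, 20_000), (0.0002, 500_000)]

print(expected_loss(vehicle_a))  # higher expected loss under these numbers
print(expected_loss(vehicle_b))
```

Under these made-up numbers, vehicle B's higher chance of a minor crash is more than offset by its lower chance of a severe one - exactly the kind of trade-off an actuary would quantify with real data.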

Truth & social graces

6 irrational 22 October 2011 04:28AM

I've seen an article on LW about Santa Claus and most people were very keen on not lying to their kids (and I agree). I have a little kid who is generally quite truthful, innocent enough not to lie in most cases. I noticed recently that when someone asks him, "How are you", he usually answers in detail because, well, you asked, didn't you? When I was a teenager I hated people who lied and I tended to ignore these unwritten social rules to the extent I could. I.e. I didn't ask if I didn't want to know and people thought I was rude. So, my question is, should I teach him to lie upon these occasions?

More broadly, I was thinking, why am I committed to being truthful, in general? I guess because I would hate to be lied to myself. This is a kind of magical thinking maybe, or maybe it's a part of the social contract. This sort of lying in fact promotes the social well-being because to answer truthfully creates an unwelcome burden on my interlocutor who asked out of politeness and is not in truth interested. But it still feels wrong to lie. Even more wrong to teach your kid to do so.

About addition and truth

-1 RolfAndreassen 02 June 2011 08:34PM

This is intended to explore a thought I had, rather than to make any particular argument about truth.

The canonical example of a thing which is true without any obvious physical referent is the statement 2+2=4. It is true about fingers, sheep, particles, and galaxies; but intuitively it does not seem that any of those truths encapsulates the full meaning of the statement. Moreover, it certainly seems that there is nothing anyone could do to make the statement untrue; it seems that it would have to hold in any universe whatsoever.

Now my thought: How do we know that the physical universe operates on this sort of arithmetic, and not arithmetic modulo some obscenely large number? Suppose we repeat the experiment that convinces us that 2+2=4 (and let's note that babies are presumably not born knowing this; they learn it by counting on their fingers, even if they do so at too young an age to express it in words), but with much larger integers. Perhaps we might find that, when we take 3^^^^3 particles and add 1, we are left with 3^^^^3 particles without any awareness that any particles have disappeared. And what is more, if we take three sets of 3^^^^3 particles and measure their mass separately and then together, we find that we get the same mass. After some long sequence of such experiments, perhaps we might convince ourselves that physics actually operates on integer arithmetic modulo 3^^^^3. (Which would be unexpected, in that the physics we know operates on complex numbers, not integers, but perhaps that's an approximation to some fantastically fine-grained two-dimensional integer grid.)

What would this mean, if anything, for the truth of such statements as 2+2=4? It seems that it would then be a contingent truth, not a universal one; that there could in principle exist a universe whose physics operated on arithmetic modulo 3, so that 2+2=1. (Presumably such a universe would not have any sentient beings in it.) What if 2+2=4 is an observed fact about our universe on the same order as the electromagnetic constant or the speed of light?
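The modular alternative the post imagines is easy to play with directly. A minimal sketch (the helper function is mine, purely illustrative):

```python
# Ordinary integer addition versus addition modulo a fixed modulus.
# In arithmetic modulo 3, the post's example holds: 2 + 2 = 1.

def add_mod(a, b, modulus):
    """Add a and b in integer arithmetic modulo `modulus`."""
    return (a + b) % modulus

print(2 + 2)             # ordinary arithmetic: 4
print(add_mod(2, 2, 3))  # modulo 3: 4 mod 3 = 1
```

For any modulus larger than 4 the two notions of addition agree on this example, which is why experiments with small numbers could never distinguish ordinary arithmetic from arithmetic modulo some obscenely large number.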

Rational = true?

4 Student_UK 09 February 2011 09:59AM

For example, if you say, "The rational belief is X, but the true belief is Y" then you are probably using the word "rational" in a way that means something other than what most of us have in mind

This was copied from here.

Surely it is obvious that there are lots of examples when one might say this. Consider this:

Rob looks in the newspaper to check the football scores. The newspaper says that United won 3-2, but it is a misprint because City actually won 3-2. In this case, the rational belief is that United won, but the true belief is that City won.

Am I missing something?