I've found it best to avoid the word "truth" whenever possible. The concept of "truth" implies an objective reality exists and that you know about it. Since we may be in a simulation, in the imagination of a god, or just hallucinating, we can never really be sure about "truth" and I find it boring to play semantic games in order to better hedge the word.
I find it much better to just focus on predictions and beliefs with explicit levels of confidence.
If you're talking about whether the sun rises tomorrow, and you say you predict that it will rise with high confidence, and your interlocutor responds, "That's not my truth," then you can just ask them to break that down into a prediction. Are they saying the sun won't rise? If so, okay, you can test that.
If the disagreement is over something that can't practically be tested, you can still interrogate their concrete predictions and see where they disagree with yours.
Religious people love talking about Truth because it is so confusing. I can't nail you down and show where you're wrong if you refuse to be concrete, so if you don't want to be shown to be wrong, just talk about abstract Truth.
Depends on the context, obviously, but my first interpretation would be "My values are not your values". In popular usage, "truth" means more than empirically established facts about objective reality -- e.g. people routinely call "truth" things they believe not only in the descriptive but also in the normative sense.
I would recommend making two separations clear: between descriptive ("US economic growth has been slow recently") and normative ("We need to accelerate US economic growth"); and between facts ("US GDP grew by 2.4% in 2015"), preferences ("Fighting inequality is more important than gross economic growth"), and forecasts, often conditional ones ("We can accelerate economic growth by cutting taxes").
Well, if your justifications are truly marvelous but the margin of this post is too narrow to contain them, you are basically asking everyone to trust you that you know what you're talking about. This makes it an argument by reputation (or, in a slightly more pronounced form, an argument by authority).
I am fairly confident that you have justifications you haven't bothered stating. But that's not the question; the question is whether they are good justifications, and that is a much more complicated matter.
You have failed to answer my question. Why does anything at all matter? Why does anything care about anything at all? Why don't I want my dog to die? Obviously, when I'm actually dead, I won't want anything at all. But there is no reason I cannot have preferences now regarding events that will occur after I am dead. And I do.
Where I live, people sometimes organize "markets" where they bring stuff that is potentially useful but that they have no use for. Everyone brings whatever they want, and everyone takes whatever they want (first come, first served). Sometimes there is a specific topic, e.g. "clothes" or "stuff for kids"; sometimes there is no topic.
In theory, I would expect such a place to attract e.g. all the homeless people around, which could make it quite unpleasant for other participants. But in practice this doesn't happen, probably because these events are usually organized online or through personal networks, so it's mostly middle-class people who come, and many of them bring more than they take. Usually people take home the stuff they brought that nobody else wanted; but sometimes there is an explicit rule (e.g. with clothes) that at the end, all the untaken stuff will be collected by the organizers and donated to some charity (so it will "trickle down" towards poorer people until someone takes it).
So, if this is important to you, I recommend first doing some research (online, asking your neighbors), and if you can't find one, maybe you can organize it. Find a few people to help you, rent a room with some tables (in the best case, some organization sympathetic to your goals will lend you the room for free), and send invitations on Facebook. Call it a "no-money market" or "neighbors' exchange" or whatever. The first time you organize it, make sure you have at least five people who don't know each other and want to get rid of some potentially useful stuff.
Native Americans were "neutralized" mostly as a side effect of the diseases brought by colonists, and then outcompeted by economically more successful cultures. Instead of making a strategic effort to prevent WW1 and WW2 from happening on another continent, settlers from different European nations actually had a "violent clash over resources" with each other.
The reasoning may seem sound, but it doesn't correspond to historical facts.
I thought about this more after posting and concluded that:
Most likely the energy will be released below the Sun's photosphere, since the photosphere's density is very low (roughly 1/6000 that of air). This would prevent an immediately visible flash.
The resulting hot gas will eventually flow upward, but by then it will be cooler and its energy less concentrated. Even if this takes several minutes, it could still produce burns on Earth.
Something like a large solar flash could also occur through the interaction of the hot gas from the comet with the Sun's magnetic field; hypothetically, this could produce a superflare with a strong solar wind and magnetic effects on Earth.
By my calculation, the temperature during impact will be around 5 million K at the edge of the comet, which is not enough for any meaningful nuclear reactions. But this does not include additional heating from rising pressure, and pressure would rise as the comet compresses while decelerating in the solar medium.
If such reactions did occur, they could add more energy to the explosion and also produce some radioactive isotopes, which could later become part of the solar wind and fall out on Earth. I saw an article a long time ago about the possibility of nuclear reactions during impacts; I will try to find it.
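As an order-of-magnitude sanity check on a temperature like this, one can convert the comet's specific kinetic energy into a thermal temperature. The sketch below is not the original calculation: the impact speed (~617 km/s, solar escape velocity near the photosphere), a pure water-ice comet, and the two limiting particle counts are all my assumptions, and it ignores energy lost to radiation, dissociation, and ionization, so both numbers are upper bounds.

```python
# Rough upper-bound estimate of the temperature reached if a comet's
# kinetic energy is fully thermalized on impact with the Sun.
# Assumed inputs (not from the comment above): impact at solar escape
# velocity, comet made of water ice.

K_B = 1.380649e-23      # Boltzmann constant, J/K
AMU = 1.66053907e-27    # atomic mass unit, kg

v_impact = 6.17e5                     # m/s, solar escape velocity
e_per_kg = 0.5 * v_impact**2          # specific kinetic energy, J/kg

m_h2o = 18 * AMU                      # mass of one water molecule, kg
e_per_molecule = e_per_kg * m_h2o     # kinetic energy per molecule, J

def temperature(energy_per_particle):
    """Temperature if the energy becomes (3/2) k_B T per particle."""
    return 2 * energy_per_particle / (3 * K_B)

# Limit 1: intact molecules (1 particle per H2O) -- hottest case.
t_molecular = temperature(e_per_molecule)

# Limit 2: fully ionized plasma (3 nuclei + 10 electrons = 13 particles
# per H2O) -- the same energy is shared among many more particles.
t_ionized = temperature(e_per_molecule / 13)

print(f"upper bound, molecular gas:  {t_molecular:.1e} K")
print(f"upper bound, ionized plasma: {t_ionized:.1e} K")
```

Both limits come out well above 5 million K, which is consistent with the point that real energy sinks (dissociation, ionization, radiation) must push the actual figure far below these naive ceilings.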
Insert peg A into slot B. Pleasure should ensue for both parties. Follow emergent heuristics.
If pleasure is not evoked or in case of mismatching heuristics, try to vary peg and/or slot and/or frequency/speed/depth of insertion.
In case of further problems please call your local support.
Naturally if I were mistaken it would be appropriate to concede that I was mistaken. However, it was not about being mistaken. The point is that in arguments the truth is rarely all on one side. There is usually some truth in both. And in this case, in the way that matters, namely which I was calling important, it is not possible to accidentally wipe out alien civilizations. But in another way, the unimportant way, it would be possible in the scenario under consideration (which scenario is also very unlikely in the first place.)
In particular, when someone fears something happening "accidentally", they mean to imply that it would be bad if that happened. But if you accidentally fulfill your true values, there is nothing bad about that, nor is it something to be feared, just as you do not fear accidentally winning the lottery. Especially since you would have done it anyway, if you had known it was contained in your true values.
In any case I do not concede that it is contained in people's true values, nor that there will be such an AI. But even apart from that, the important point is that it is not possible to accidentally wipe out alien civilizations, if that would be a bad thing.
Because you wrote one sentence without actually giving the argument. So I went with my prior on your argument.
That's what I'm suggesting you not do.
Writing out arguments, and in general, making one's thought processes transparent, is a lot of work. We benefit greatly by not having a norm of only stating conclusions that are a small inferential distance away from public knowledge.
I'm not saying you should (necessarily) believe what I say, just because I say it. You just shouldn't jump to the conclusion that I don't have justifications beyond what I have stated or am willing to bother stating.
Cf. Jonah's remark:
If I were to restrict myself to making claims that I could substantiate in a mere ~2 hours, that would preclude the possibility of me sharing the vast majority of what I know.
You make the decision to send the resources necessary to transform a galaxy without knowing much about the galaxy. The only things you know are based on the radiation that you can pick up many light years away.
Once you have sent your vehicle to the galaxy it could of course decide to do nothing or fly into the sun but that would be a waste of resources.
I think we can all agree that an entity's anticipated future experiences matter to that entity. I hope (but would be interested to learn otherwise) that imaginary events such as fiction don't matter. In between, there is a hugely wide range of how much it's worth caring about distant events.
I'd argue that outside your light-cone is pretty close to imaginary in terms of care level. I'd also argue that events after your death are pretty unlikely to affect you (modulo basilisk-like punishment or reward).
I actually buy the idea that you care about (and are willing to expend resources on) subjunctive realities on behalf of not-quite-real other people. You get present value from imagining good outcomes for imagined-possible people even if they're not you. This has to get weaker as it gets more distant in time and more tenuous in connection to reality, though.
But that's not even the point I meant to make. Even if you care deeply about the far future for some reason, why is it reasonable to prefer weak, backward, stupid entities over more intelligent and advanced ones? Just because they're made of similar meat-substance as you seems a bit parochial, and hypocritical given the way you treat slightly less-capable organic beings like lettuce.
Woodchopper's post indicated that he'd violently interfere with (indirectly via criminalization) activities that make it infinitesimally more likely to be identified and located by ETs. This is well beyond reason, even if I overstated my long-term lack of care.