All of PeterL's Comments + Replies

PeterL10

I would like to ask whether there is a good reason for step 1 (outline your plan). I think it would be much easier to use if we started with just a simple plan, or even with no plan, and kept improving/extending it through those iterations.

PeterL10

Am I understanding correctly that this idea of wholesomeness is purely definitional/axiomatic (like mathematics), containing no (extraordinary) claims at all, so it doesn't make sense to ask "Is this true?" but rather "Is it useful?", and that "to act wholesomely is good" is just a hypothesis you are actually testing?

Because then I see its great advantage over religious moral systems, which do contain claims that might actually be false, yet which people are required to believe.

2owencb
I think this is essentially correct. The essays (especially the later ones) do contain some claims about ways in which it might or might not be useful; of course I'm very interested to hear counter-arguments or further considerations.
PeterL10

Wow, thanks for your willingness to test/falsify your statements, and I apologize for my rash judgment. Your idea just sounded too good to be true, so I wanted to be cautious.

And I would be glad to say I am completely satisfied with your answer. However, that is not the case yet, perhaps just because the "mistakes" of the people trying to apply wholesomeness still need a definition: a criterion according to which something is or is not a mistake.

However, if you provided such a definition, I might be another tester of this style of thinking.

3owencb
The most straightforward criterion would probably be "things they themselves feel to be mistakes a year or two later". That risks people just failing to own their mistakes so would only work with people I felt enough trust in to be honest with themselves. Alternatively you could have an impartial judge. (I'd rather defer to "someone reasonable making judgements" than try to define exactly what a mistake is, because the latter would cover a lot of ground and I don't think I'd do a good job of it; also my claims don't feel super sensitive to how mistakes are defined.)
PeterL10

Could someone, please, confirm or refute my impression that this idea might not be falsifiable at all? And if it is not, could someone, please, explain to me what reasons to apply this idea remain (I am skeptical and curious, not dismissing it completely)?

In any case, I appreciate this attempt to offer such an interesting moral idea/hypothesis/theory.

5owencb
I would certainly update in the direction of "this is wrong" if I heard a bunch of people had tried to apply this style of thinking over an extended period, I got to audit it a bit by chatting to them and it seemed like they were doing a fair job, and the outcome was they made just as many/serious mistakes as before (or worse!). (That's not super practically testable, but it's something. In fact I'll probably end up updating some from smaller anecdata than that.)
PeterL10

Hello, my name is Peter. Recently I read Basics of Rationalist Discourse and iteratively checked/updated the current post based on the points stated there:

I (possibly falsely) feel that moral (i.e. "what should be") theories should be reducible, because I see an analogy with the demand that "what is" theories be reducible due to Occam's razor. I admit that my feeling might be false (and I know an analogy might not be a sufficient reason), and I am ready to admit that it is. However, despite reading the whole of Mere Goodness from RAZ I cannot remem... (read more)

PeterL20

I find splitting the Great Idea into parts a very useful tool for quantifying its relevance ("How many of its parts feel true?") and, in this way, for applying falsification. (The most logical and intuitive description of falsification I know is: "If your idea weren't true, how would you find out? Because if you don't ask, it might not be true and you would never find out." With splitting, you can say: "I would admit my idea was incorrect if not all of its 12 parts felt correct separately. Easy.")

PeterL10

I agree with both the "emotion" and "pretend" hypotheses. It is (according to my worldview) extremely difficult to fake emotions you do not possess. Thus, the easiest way to fake your beliefs might be to manipulate your own emotions.

PeterL30

Wow, thanks very much. Your post rapidly boosted my use of this handbook from that day on; I intensely enjoy doing these exercises, and I find them extremely helpful and effective. Thanks once again.

PeterL10

"His own stupid" - the idea that if someone is stupid, he deserves all the bad consequences of being stupid.

Disproof:

Let's assume this is true. Then there must have been at least one voluntary action that turned him from wise to stupid. But why would someone voluntarily choose to be stupid? Only because he would not have known what being stupid means, in which case he would already be stupid. Thus there can be no such first action. (Assumption rejected.)

PeterL20

Very nicely written. A good example of this might be the invention of genetic flaw correction, thanks to which morally controversial abortion could become a less desired option.

PeterL*10

I am stuck at prompt no. 1, because I am wondering whether it is possible to name all the wants once and for all despite the complexity of human morality.

Thanks in advance for explanation.

4Productimothy
Think pragmatically here. How do you anticipate this list is going to change you? While much of LessWrong is dedicated to hypothetical truths, CFAR is more about iterative/adaptive truth/improvement. Don't consider anything and everything. A threshold of hypotheticals prevents you from acting and making progress (I wish they had expounded upon this in the prior post). Just consider the limitations you anticipate that you'll actually be able to, or actually want to, resolve at some point. Hopefully this gives you some direction.
PeterL10

What about moral duty to be curious?

PeterL10

Every time I am curious about "how things are", I would then also like to be curious about "what to do". (Curious pragmatism.)

PeterL60

My suggestion for an alternative explanation is that people somehow assume that to save more birds, more people will be asked to donate, so after dividing, the amount per person will be very similar.

PeterL1
2Truth

I agree that voting might be a little bit annoying.

On the other hand, it could potentially make searching for specific qualities of comments much easier if automated (by sorting). (E.g. "Right now I am not in the mood for tackling difficult concepts, so I want something with a high clarity rating." or "Right now I am too tired to argue/fight, so I want something empathetic.")

PeterL10

1. an entity that regularly performs acts of changing the ownership of an object of value from other entities to itself, without providing any signal from which the given other entity could have any reason to hypothesize such a change within the short-term time horizon of its perceptual and cognitive activity.


2. a relatively common state of a natural system: currently detecting an internal insufficiency of specific resources, interpreting it as a threat to its existence or proper functioning, and being caused to attempt to compensate for it and deflect such th... (read more)

PeterL10

Excuse me. What should be easy to remember? Concept names or whole frameworks?

3Pattern
Concept names.
PeterL10

Ok, thanks. This is very interesting, and correct in theory (I guess). And I would be very glad to apply it. But before taking my first steps in it on my own by trial and error, I would like to know some best practices for doing so, if they are available at all. I strongly doubt this is a common practice in the general population, and I slightly doubt that it is common practice even for a "common" attendee of this forum, but I would still like to make it my (usual) habit.

And the greatest issue I see in this is how to talk to common people arou... (read more)

2gilch
I'm not sure what to say besides "Bayesian thinking" here. This doesn't necessarily mean plugging in numbers (although that can help), but develop habits like not neglecting priors or base rates, considering how consistent the supposed evidence is with the converse of the hypotheses and so forth. I think normal, non-rationalist people reason in a Bayesian way at least some of the time. People mostly don't object to good epistemology, they just use a lot of bad epistemology too. Normal people understand words like "likely" or "uncertain". These are not alien concepts, just underutilized.
PeterL10

OK, thanks, but then one of my additional questions is: what is a reasonable threshold for the probability of my belief A given all available evidence B1, B2, …, Bn? And why?

3gilch
Are you suggesting that beliefs must be binary? Either believed or not? E.g. if the probability of truth is over 50% then you believe it and don't believe if it's under 50%? Dispense with the binary and use the probability as your degree of belief. You can act with degrees of uncertainty. Hedge your bets, for example.
2CraigMichael
I'm not sure what you mean by “threshold for the probability of belief in A.”

Say A is “I currently have a nose on my face.” You could assign that .99 or .99999, and either expresses a lot of certainty that it’s true; there’s not really a threshold involved.

Say A is “It will snow in Denver on or before October 31st 2021.” Right now, I would assign that a .65 based on my history of living in Denver for 41 years (it seems like it usually does). But I could go back and look at weather data and see how often that actually happens. Maybe it’s been 39 out of the last 41 years, in which case I should update. Or maybe there’s an El Niño-like weather pattern this year or something like that… so I would adjust up or down accordingly.

The idea being that, over time, by encountering evidence and learning to evaluate the quality of the evidence, you would get closer to the “true probability” of whatever A is.

Maybe you’re more asking about how certain kinds of evidence should change the probability of a belief being true? Like how much to update based on evidence presented?
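The snow example above can be sketched numerically. This is only my own illustration, not something from the comment: it assumes a simple Beta-Binomial model, starting from a uniform prior and folding in the hypothetical 39-snowy-years-out-of-41 record.

```python
# Sketch: updating a probability estimate from historical frequency data
# using a Beta-Binomial model (an assumption; the comment only describes
# adjusting the estimate informally).

def posterior_mean(prior_a, prior_b, successes, failures):
    """Mean of the Beta posterior after observing the counts."""
    return (prior_a + successes) / (prior_a + prior_b + successes + failures)

# Uniform Beta(1, 1) prior, then 39 snowy years out of 41 from the
# hypothetical weather record in the comment.
p_snow = posterior_mean(1, 1, 39, 2)
print(round(p_snow, 3))  # 0.93
```

The posterior lands near 0.93, well above the initial gut estimate of 0.65, which matches the comment's point that the data should force an upward update.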
PeterL10

Hello, I would like to ask whether there is any summary/discussion of necessary/sufficient criteria according to which a reason for whatever (belief, action, goal, ...) is sufficient. If not, I would like to discuss it.

2CraigMichael
I'm sure there are people here who could give a better answer. My take, from the rationalist/Bayesian perspective, is that you have a probability assigned to each belief based on some rationale, which may be subjective and involve a lot of estimation. The important part is that when new relevant evidence about that belief is brought to your attention, you "update": in the Bayesian sense, thinking "given the new evidence B, and the probability of my old belief A, what is the probability of A given B?" But in practice that's really hard to do because we have all of these crazy biases. Scott's recent blog post was good on this point.
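The "probability of A given B" step described above is just Bayes' rule. A minimal sketch, with illustrative numbers of my own choosing (none appear in the comment):

```python
# Sketch of a single Bayesian update: compute P(A|B) from the prior P(A)
# and the likelihoods P(B|A) and P(B|not A). Numbers are illustrative only.

def bayes_update(prior, p_b_given_a, p_b_given_not_a):
    """Posterior P(A|B) via Bayes' rule."""
    p_b = p_b_given_a * prior + p_b_given_not_a * (1 - prior)
    return p_b_given_a * prior / p_b

# Prior belief P(A) = 0.30; the evidence B is four times as likely
# if A is true (0.8) than if it is false (0.2).
posterior = bayes_update(0.30, 0.8, 0.2)
print(round(posterior, 3))  # 0.632
```

Note how the posterior depends on the ratio of the likelihoods, which is one concrete way to answer "how much should this evidence move me?"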
PeterL10

Mr./Mrs. nim, thanks for asking.

By meaningfulness of existence, I mean a state relating to the whole of existing reality over the whole of (eternal) time that it would be good to prefer when choosing which actions to perform. Or, said differently, it would be good to maximize the probability of that meaningfulness by choosing the proper actions.

So potentially, the whole of existence can be either meaningful or not (independently of the time variable), and we should make it the former.

But the meaningfulness of a moment is a different term, which I have chosen to be auxiliary in the statements.

I ... (read more)

PeterL10

To simplify my consideration, let's assume it doesn't.

PeterL20

A riddle (maybe trivial for you, lesswrongsters, but I am still curious about your answers/guesses):

It is neither truth nor lie. What is it?

2gilch
1Vaughn Papenhausen
The sentence, "The present king of France is bald."

A rock. A question. A command. A mistake. A dream. "2+2=4" spoken by a zombie. The output of GPT-3. Simulacrum level 4. The fit of water to a puddle. The peacock's tail.

7gjm
A hamburger.
2lsusr
PeterL10

Yes, but it can happen that over the course of our individual existence, two "justified opinions" inconsistent with each other occur in our minds. (And if they didn't, we would be doomed to believe all the flawed opinions from our childhood, without the possibility of updating them, because we would reject new inconsistent opinions, etc.)

And moreover, we are born with some "priors" which are not completely true but relatively useful.

And there are some perceptual illusions. 

And Prof. Richard Dawkins claims that there are relatively frequent hallucinat... (read more)

2jsalvatier
> Thus, I think that the process is relatively reliable but not totally reliable.

Absolutely. That's exactly right.

> My Christian friend claimed that atheists/rationalists/skeptics/evolutionists cannot trust even their own reason (because it is the product of their imperfect brains, in their opinion).

It sounds like there's a conflation between 'trust' and 'absolute trust'. Clearly we have some useful notion of trust, because we can navigate potentially dangerous situations relatively safely. So using plain language, it's false to say that atheists can't trust their own judgement. Clearly they can in some situations. Are you saying atheists can't climb a ladder safely?

It sounds like he wants something to trust in absolutely. Has he faced the possibility that that might just not exist?
PeterL10

Ok, I will put it a little more straightforwardly.

My Christian friend claimed that atheists/rationalists/skeptics/evolutionists cannot trust even their own reason (because it is the product of their imperfect brains, in their opinion).

So I wanted to counter-argue reasonably, and my statement above seems to me relatively reasonable and relevant. I don't know whether it would convince my Christian friend, but it convinces at least me :) .

Thanks in advance for your opinions, etc.

2ChristianKl
Part of being a good skeptic is being skeptical of one's own reasoning. You need to be skeptical of your own thinking to be able to catch errors in it.
3Viliam
Well, I don't. But at the end of the day, some choices need to be made, and following my own reason seems better than... well, what is the alternative here... following someone else's reason, which is just as untrustworthy. Figuring out the truth for myself, and convincing other people are two different tasks. In general, truth should be communicable (believing something for mysterious inexplicable reasons is suspicious); the problem is rather that the other people cannot be trusted to be in a figuring-out-the-truth mode (and can be in defending-my-tribe or trying-to-score-cheap-debate-points mode instead).
2jsalvatier
Consider how justified trust can come into existence. You're traveling through the forest. You come to a moldy-looking bridge over a ravine. It looks a little sketchy, so naturally you feel distrustful of the bridge at first. So you look at it from different angles, and shake it a bit, and put a bit of weight on it. And eventually, some deep unconscious part of you will decide either that it's untrustworthy and you'll find another route, or that it's trustworthy and you'll cross the bridge. We don't understand that process, but it's reliable anyway.
PeterL30

Hello, lesswrongsters (if I may call you that),

What do you think about the following statement: "You should be relatively skeptical about each of your past impressions, but you should be absolutely non-skeptical about your most current one at a given moment. Not because it was definitely true, but because there is practically no other option."

Please, give me your opinions, criticism, etc. about this.

2hamnox
You should be skeptical about your most current one! It is likely better informed than previous ones, but that doesn't mean you're done processing. BUT, you need to exercise that skepticism by knowing what your best understanding strongly predicts and what discrepancies should surprise you, not by trying to make yourself give humbler answers.
1PeterL
Ok, I will put it a little more straightforwardly. My Christian friend claimed that atheists/rationalists/skeptics/evolutionists cannot trust even their own reason (because it is the product of their imperfect brains, in their opinion). So I wanted to counter-argue reasonably, and my statement above seems to me relatively reasonable and relevant. I don't know whether it would convince my Christian friend, but it convinces at least me :) . Thanks in advance for your opinions, etc.
2lsusr
The first sentence of the quote sounds like a mix of the Buddhist concept of the now plus the financial concept of how the current price of a security reflects all information about its price.
PeterL10

I would like to ask you whether there are some criteria (I am fine even with subjective ones) according to which you, experienced rationalists, would accept/consider a metaethics despite humankind's very bad experience with them.

I expect answers like: convincing; convincing after a very careful attempt to find its flaws; logical; convincing after a very careful attempt by 10 experienced rationalists to find its flaws; after careful questioning; useful; harmless; etc.

PeterL30

Hello, I would like to ask whether you think that some ideas can be dangerous to discuss publicly even if you are honest about them, even if you are doing your best to be logical/rational, even if you wish nothing bad on other people/beings, and even if you are open to discussing them in the sense of being prepared for their rejection for a justified reason.

At this stage, I will just tell you that I would like to discuss a specific moral issue, which might be original; therefore I am skeptical in this way and feel a little insecure about discussing it publicly.

2lsusr
Yes I do.
1seed
Maybe then discuss it privately with a few people first?
2Ben Pace
There is information that's dangerous to share. Private data, like your passwords. Information that can be used for damage, like how to build an atom bomb or smallpox. And there will be more ideas that are damaging in the future. (That said I don't expect your idea is one of these.)
PeterL*10

Hello, I would like to ask a straightforward question: Is there a logically valid way to call evolution science and creationism a pseudo-science? I am not a creationist; I just hope to strengthen my evolutionist opinion, and I would like to have it in a clear form. I must admit that because of a family member I visited a specific creationist site. There was the extraordinary claim that there is no such logically valid way, and I would like to disprove this for myself and my family member by stating at least one. And I want to stop visiting that site ... (read more)

2gilch
This sounds like an XY problem. What you're looking for isn't logic, but epistemology.

Science is, and always has been, Natural Philosophy. Now, philosophy can be done well or badly. But Natural Philosophy has a great advantage over other philosophical branches: better data. This makes such a difference that scientists often don't think of themselves as philosophers anymore (but they are).

What makes the modern Scientific Method especially effective compared to older approaches is the rejection of unreliable epistemology. The whole culture of modern science is founded on that. Sure, we have better tools and more developed mathematics now, but the principles of epistemology we continue to use in science were known anciently. The difference is in what methods we don't accept as valid.

So is Creationism a philosophy of nature? Sure is! But they're making embarrassing, obvious mistakes in epistemology that would make any real scientist facepalm, and yet call themselves "scientific". That's why we call it pseudoscience.

If you're trying to win an argument with a creationist family member, presenting them evidence isn't going to help as long as their epistemology remains broken. They'll be unable to process it. Socratic questioning can lead your interlocutor to discover the contradictions their faulty methods must lead them to. Look up "street epistemology" for examples of the technique.
4TAG
The question of what constitutes science is called the demarcation problem. It doesn't have a generally accepted solution, but it doesn't follow from that that creationism is science. Pragmatically, science is what scientists say it is, and very few scientists are creationists.
Raemon120

Hey Peter, (I'm a mod here)

I think this is a reasonable question to have, but is something that LessWrong has kinda tried to move on from as a whole, so you may not find many people interested in answering it here. (i.e. LessWrong doesn't focus much on traditional skeptic/woo debates, instead trying to settle those questions so we can move on to more interesting problems)

Just wanted to let you know. I do think reading through many of the core sequences may help. (Unfortunately I don't currently know which sub-sections will be most relevant to you offhand)

Is there a logically valid way to call evolution science and creationism a pseudo-science?

I'm not sure what's meant by "logically valid" here. Formal logic does have a notion of validity, but it's not particularly useful for these kinds of questions. If I just came up with a definition of science and said that evolution matches that definition, and therefore evolution is a science, it would be a logically valid argument... even if my argument made no sense.

For instance, if I said that "scientific fields are those which invol... (read more)

PeterL40

Hello, I wanted to post something, but when reading the guidelines I became a little confused. The source of my confusion is "Aim to explain, not persuade". English is not my native language, but when I googled "persuade" I found that it means "induce (someone) to do something through reasoning or argument". To me, this sounds a little ridiculous on a rationality forum.

4Ruby
Welcome! The dictionary definition of "persuade" misses some of the connotations. Persuading someone often means "getting them to agree with you", not "jointly arriving at what's true, which includes the possibility that others point out your mistakes and you change your mind." Explaining usually means something more like "laying out your reasoning and facts, which might lead someone to agree with you if they think your reasoning is good." The key difference might be something like: "persuade" is writing aimed at getting the reader to accept what is written regardless of whether it's true, while "explain" wants the reader to accept the conclusion only if it's true. It's the idea of symmetric/asymmetric weapons in this post. Sorry if that's still a bit unclear; I hope it helps.