in regard to: http://lesswrong.com/r/discussion/lw/nv8/do_you_want_to_be_like_kuro5hin_because_this_is/
While we are working on a solution, you can go to your preferences, and change the option:
Don't show me articles with a score less than:
to blank.
Here are my recent posts:
There are more, but you can also go to my (new) website - http://www.bearlamp.com.au and see them all.
An upvote post for those who have been hit by collateral damage in the Elo / Nier war, as happened to me. I promise to upvote everyone who writes something non-controversial under this.
If the problem is the sockpuppet army, then while the coders create a systematic solution, the community can help by showing that an army of well-meaning men and women is stronger than any puppet master.
We won the war against Eugene... for a brief instant.
I'm keeping score, tallying the downvotes and upvotes on the comment where I requested help against Eugene Nier's downvote campaign.
Well, there was a moment where 14 people upvoted and 20 puppets downvoted. Now we are at a point where 21 people upvoted and 30 puppets downvoted. This means that at least we forced Eugene to increase the count of his puppets to fight back. I count this as a point for LW :)
#makelwniceagain
I don't think that opposing strategic voting by strategic voting is an improvement. (Noise + more noise != signal.) I also don't see how forcing Eugine to increase the number of sockpuppets is a good thing, especially if the difference is between 20 and 30.
Thanks for trying! I just think this is a wrong direction.
Here's the problem with talking x-risk with cynics who believe humanity is a net negative, and also a couple possible solutions.
Frequently, when discussing the great filter, or averting nuclear war, someone will bring up the notion that it would be a good thing. Humanity has such a bad track record with environmental responsibility or human rights abuses toward less advanced civilizations, that the planet, and by extension the universe, would be better off without us. Or so the argument goes. I've even seen some countersignaling severe enough to argue, somewhat seriously, in favor of building more nukes and weapons, out of a vague but general hatred for our collective insanity, politics, pettiness, etc.
Obviously these aren't exactly careful, step by step arguments, where if I refute some point they'll reverse their decision and decide we should spread humanity to the stars. It's a very general, diffuse dissatisfaction, and if I were to refute any one part, the response would be "ok sure, but what about [lists a thousand other things that are wrong with the world]". It's like fighting fog, because it's not their true objection, at least not quite. It's not like either of u...
I think there's also a near/far thing going on. I can't find it now, but somewhere in the rationalist diaspora someone discussed a study showing that people will donate more to help a smaller number of injured birds. That's one reason why charity ads focus on one person's or one family's story, rather than faceless statistics.
Combining this with what you pointed out, maybe a fun place to take the discussion would be to suggest that we start with a specific one of our friends. "Exactly. Let's start with Bob. Alice next, then you. I'll volunteer to go last. After all, I wouldn't want you guys to have to suffer through the loss of all your friends, one by one. No need to thank me, it is its own reward."
EDIT: I was thinking of scope insensitivity, but couldn't remember the name. It's not just a LW concept, but also an empirically studied bias with a Wikipedia page and everything.
However, I misremembered it above. It's true that I could cherry-pick numbers and say that donations went down with scope in one case, but I'm guessing that's probably not statistically significant. People are probably willing to donate a little more, not less, to have an impact a hundred times as large. P...
Man burns down house remotely over the internet, for insurance, no accident.
Edit: this was only posited, but investigators rigged up the supposed instrument of doom, a network printer with a piece of string.
Anyone know where I can find melatonin tablets <300 mcg? Splitting 300 mcg into 75 mcg quarters still gives me morning sleepiness; I'm thinking a smaller dose will reduce the melatonin remaining at wake time. Thanks.
Software to measure preferences?
I have a set of questions in which a person faces a choice that changes the odds of two moderately-positive but mutually-exclusive outcomes. E.g., with Choice #1, there is a 10% chance of X and a 20% chance of Y, while with Choice #2, there is a 15% chance of X and a 10% chance of Y. I want to find out if there are any recognizable patterns in which options the agent will choose. Is there any software already freely available which can be used to help figure this out?
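I'm not aware of a dedicated off-the-shelf tool, but the analysis itself is small enough to script. A minimal sketch in Python (the trial data below is made up for illustration): fit a single parameter `w`, the relative weight the agent puts on outcome X versus Y, by checking which `w` best predicts the observed choices under expected-value maximization.

```python
# Hypothetical data: each trial pits option A = (pX_A, pY_A) against
# option B = (pX_B, pY_B); chose_A records which option the agent picked.
trials = [
    ((0.10, 0.20), (0.15, 0.10), True),
    ((0.30, 0.05), (0.10, 0.25), True),
    ((0.05, 0.40), (0.20, 0.10), False),
    ((0.25, 0.10), (0.10, 0.30), True),
]

def accuracy(w):
    """Fraction of choices explained by utility U = w*pX + (1-w)*pY."""
    correct = 0
    for a, b, chose_a in trials:
        u_a = w * a[0] + (1 - w) * a[1]
        u_b = w * b[0] + (1 - w) * b[1]
        if (u_a > u_b) == chose_a:
            correct += 1
    return correct / len(trials)

# Grid-search the relative weight on X that best explains the choices.
best_w = max((w / 100 for w in range(101)), key=accuracy)
print(f"best-fitting weight on X: {best_w:.2f}, "
      f"explains {accuracy(best_w):.0%} of choices")
```

A one-parameter grid search is crude; with more data you would fit a logistic (softmax) choice model instead, which also tolerates noisy responses. But this shows the general shape of the analysis: posit a utility model, then find the parameters that best reproduce the observed choices.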
"Why Should I Trust You?": Explaining the Predictions of Any Classifier
"various scenarios that require trust: deciding if one should trust a prediction, choosing between models, improving an untrustworthy classifier, and identifying why a classifier should not be trusted."
I think it would be interesting to weigh the benefits of human desire modification in all its forms (ranging from strategies like delayed gratification to brain pleasure-centre stimulation, covered very well in this fun theory sequence article) against the costs of continuous improvement.
Some of these costs:
Hi, I'm curious what rationalists (you) think of this video if you have time:
Why Rationality Is WRONG! - A Critique Of Rationalism https://www.youtube.com/watch?v=iaV6S45AD1w 1 h 22 min 47 s
Personally, I don't know much about all of the different obstacles in figuring out the truth so I can't do this myself. I simply bought it because it made sense to me, but if you can somehow go meta on the already meta, I would appreciate it.
I tried listening to the video at 1.5× speed. Even so, the density of ideas is horribly low. It's something like:
Science is successful, but that makes scientists overconfident. By 'rationalists' I mean people who believe they already understand everything.
Those fools don't understand that "what they understand" is just a tiny fraction of the universe. Also, they don't realize that the universe is not rational; for example the animals are not rational. Existence itself has nothing to do with rationality or logic. Rationalists believe that the universe is rational, but that's just their projection. Rationality is an emergent property. Existence doesn't need logic, but logic needs existence, therefore existence is primary.
You can't use logic to prove whether the sun is shining or not; you have to look out of the window. You can invent an explanation for empirical facts, but there are hundreds of other equally valid explanations.
That was the first 16 minutes, then I became too bored to continue.
My opinion?
Well, of course if you define a "rationalist" as a strawman, you can easily prove the strawman is foolish. You don't need more than one hour to convince me...
I agree with the other commenters about this.
So I thought "maybe it gets more interesting later on" and skipped to 50:00. At which point he isn't bothering to make any arguments, merely preening over how he understands the world so much more deeply than rationalists, who will come and bother him with their "arguments" and "contradictions" and he can just see that they "haven't got any awareness" and trying to engage with them would be like trying to teach calculus to a dog, and that the mechanism used to brainwash suicide bombers and fundamentalists are "the exact same mechanism that very intelligent scientists use to prove their theories of space and time and whatever else". OK, then.
Oh god. This is really bad.
Someone should tell him about the straw vulcan.
We (LWers) are too tied to the word "Rationality"; that should happen less. If you feel personally affected when someone says this part of your identity is wrong, then maybe it's time to be more fox and less hedgehog.
"Researchers discover machines can learn by simply observing, without being told what to look for"
Giving "rewards" for discovering rules, Turing Learning.
http://sciencebulletin.org/archives/4761.html
http://link.springer.com/article/10.1007%2Fs11721-016-0126-1
And China and Russia have the best coders for algorithms
https://arc.applause.com/2016/08/30/best-software-developers-in-the-world/
Can anyone get this page to open? It's a Stanford report on AI, all 2,800 pages...
My girlfriend and I disagreed about focussing on poor vs richer countries in terms of doing good. She made an argument along the lines of:
'In poorer countries, consumer goods are targeted to that class of poor people, so making a difference in inequality in places like Australia is more important than in poor countries, because there the poor are deprived of a supply of goods, since the consumer culture is targeted towards the wealthier middle class.'
What do you make of it?
But "rationalism" or "rationality" in, say, the sense commonly used on LW does not in fact mean denying any of that.
But that's what you're mostly doing in your post. I will bring this up below.
The guy in the video comes across (to me at least) as smugly superior even while uttering a succession of trivialities, which doesn't do much to encourage me to watch more.
I don't think everyone shares that view; at least I don't. I don't know if I am contradicting myself, though. Suppose someone was similar to me but differing in opinion; the contradiction would then lie in my telling you the world is your mirror.
So I thought "maybe it gets more interesting later on" and skipped to 50:00. At which point he isn't bothering to make any arguments, merely preening over how he understands the world so much more deeply than rationalists, who will come and bother him with their "arguments" and "contradictions" and he can just see that they "haven't got any awareness" and trying to engage with them would be like trying to teach calculus to a dog, and that the mechanism used to brainwash suicide bombers and fundamentalists are "the exact same mechanism that very intelligent scientists use to prove their theories of space and time and whatever else". OK, then.
That's what he said. Of course it's kind of harsh, but it's his way of going about these things, I think. I don't know why, or what's most effective, but for myself I am unaffected, or affected positively. That might be just because I agree.
Since I obviously wasn't enlightened enough for minute 50 of this thing, I went back to 40:00. He says it's important to connect with your emotions and not deny they're there (OK), and then he says that "rational people just assume that, well, we don't need any of that emotional stuff". OK, then. (And rational people like scientists get emotional when they argue with highly irrational people because they're attached to their rational models of the world and don't want to hear anything contrary to those models because of cognitive dissonance; they close their eyes and ears to the arational because they demonize it as irrational.)
By becoming aware of the emotions that you are suppressing, not the "feeling emotions" rationally because the reason of emotion is rational.
OK, clearly still too advanced for me. Back to 30:00. Apparently, if your "awareness" is low then you think thinking is great (OK...), you think thinking is all there is (huh?)
There is awareness of thoughts, not only thoughts, and the awareness is not a thought. That is a definitional game about what counts as a thought; consider it different from awareness.
Yes, you don't have a thought of a thought, you have awareness of thought. Otherwise, you're trapped in thinking and don't know that there is something else.
...you think thinking is a powerful tool for understanding reality (OK...), but as you gain in "awareness" you realise that thinking is a system of symbols, and "this gulf between the map and the territory just grows wider and wider and wider, until you see that the map is just a complete fiction, a complete illusion", and once you realise this you see "the gross limitations of thinking".
Einstein's theory of gravity isn't revealing anything deep about the world, it's just a set of sounds and symbols on paper. "That's what it literally is, except your awareness is too low to actually see that". And then he pulls an interesting move where he complains about people with "low" "awareness" getting "sucked into the content" of a theory because they don't see the "larger context".
Notice how he never mentions the larger context of an understanding of relativity itself? He means the context in which sounds and symbols make up our "reality".
You might think he's now going to explain what the larger context is and how it should affect our understanding of relativity. Ha, ha. What a silly idea. Only someone with low awareness would expect that. What he actually does is to tell us how when rationalists criticize him they're doing it "on the level of thoughts" while he is "on the level of awareness, which is a much higher level". Bleh.
You missed the point: nothing was said about affecting the understanding of relativity. You fell into the exact paradigm the video described.
The larger context of the symbols and sounds on the paper. Not the theory itself according to physicists. That's the matrix.
Oh, wait, he has something resembling an actual point somewhere around 35:00. Rationalists give too much credit to logic, he says, because logic "has no teeth", because it depends on its premises and the premises are doing the real work, and if your premises are dodgy then so are your conclusions, and "most of them are very very wrong". Cool, he's going to tell us what wrong premises we have. ... Oh, no, silly me, he isn't. He just says they're very wrong but gives no specifics.
He gave the specifics right after that: rationality itself, asking about the premises which make rationality possible.
Saying things that are true but elementary and not in fact denied by rationalists. For some of these, he actually gives some kind of justification.
It seems like you disagree on numerous points without being aware of it. Like that Einstein's equation is simply symbols and sounds (and pretty much everything else you attribute meaning to).
Saying that rationalists are wrong in various ways (giving too much weight to X, having wrong premises, ...). In every instance of this I heard (though I have not listened to the whole dreary thing) either the claim is flatly wrong, or he offers no sort of support for it, or both.
Let's say the rational mind cannot understand something, why continue to use the rational mind? Is there something else? Maybe awareness? There might be something worth pursuing there.
Now I know I am not responding to my quote of your text. Rationality is wrong because of rationality itself. It cannot be right without the right context: the context in which rationality exists, where thinking exists, which is "outside" subjective experience, according to you. That's the whole point. It's right under your nose if you'd bother to meditate and separate awareness from thoughts.
Saying smugly how much more "aware" he is than rationalists are, and how this puts him on a higher level than them. If there's anything actually useful there, I missed it. And now I've listened to enough of this without any sign that he has anything useful to teach me, and I'm going to go and do something else. My apologies for not sitting through all 82 minutes of it.
Well. You're capable of becoming aware as well. It's not a radical difference. :)
By becoming aware of the emotions that you are suppressing, not the "feeling emotions" rationally because the reason of emotion is rational.
Suppressing emotions has nothing to do with rationality as understood by this community. We aren't straw vulcans. A speech about why straw vulcanism is bad is not a good critique of what we consider rationality to be.
If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "