
In response to comment by metaman on LessWrong podcasts
Comment author: juanker52 06 October 2017 04:05:36PM 0 points [-]
In response to LessWrong podcasts
Comment author: metaman 05 October 2017 08:45:17PM 1 point [-]

Castify does not appear to have survived? Are the Sequences still out there someplace in audio format?

Comment author: Vaniver 05 October 2017 05:55:43PM 2 points [-]

Our current plan is to send an email with a vote link to everyone over the threshold; we're going to decide when to have the vote later in the open beta period.

Comment author: DragonGod 04 October 2017 09:14:04AM 0 points [-]

if the correct theory were more Kolmogorov-complex than SM+GR, then we would still be forced as rationalists to trust SM+GR over the correct theory, because there wouldn't be enough Bayesian evidence to discriminate the complex-but-correct theory from the countless complex-but-wrong theories.

I reject Solomonoff induction as the correct technical formulation of Occam's razor, and as an adequate foundation for Bayesian epistemology.
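(For reference, a rough sketch of the Solomonoff-style argument that the quoted passage relies on and that is being rejected here; this is my reconstruction in standard notation, not something taken from the quoted post. K denotes Kolmogorov complexity, E the available evidence, and the two hypotheses are labels for the "correct" theory and for SM+GR.)

    % Complexity prior: each hypothesis H is penalized by its description length K(H).
    \[
    P(H) \;\propto\; 2^{-K(H)}
    \]
    % So the posterior odds pick up a penalty for the extra bits of complexity:
    \[
    \frac{P(H_{\text{correct}} \mid E)}{P(H_{\text{SM+GR}} \mid E)}
      \;=\;
      \frac{P(E \mid H_{\text{correct}})}{P(E \mid H_{\text{SM+GR}})}
      \cdot 2^{-\,\bigl(K(H_{\text{correct}}) - K(H_{\text{SM+GR}})\bigr)}
    \]

On this reading, a theory that is k bits more complex than SM+GR starts with a prior handicap of 2^{-k}, so roughly k bits of net likelihood-ratio evidence are needed before the posterior favours it; and those same bits also have to single it out from the many other theories of similar length, which is the "countless complex-but-wrong theories" point in the quote.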

Comment author: DragonGod 04 October 2017 09:07:59AM 0 points [-]

Counterexample: I changed my epistemology from Aristotelian to Aristotle + Bayes + frequentism.

Comment author: DragonGod 04 October 2017 09:06:54AM 1 point [-]

"Politics is the mindkiller" is an argument for why people should avoid getting into political discussion on Lesswrong; it is not an argument against political involvement in general. Rationalists completely retreating from Politics would likely lower the sanity waterline as far as politics is concerned. Rationalists should get more involved in politics (but outside Lesswrong) of course.

Comment author: TheAncientGeek 03 October 2017 11:37:19AM *  0 points [-]

Well, we don't know if they work magically, because we don't know that they work at all. They are just unavoidable.

It's not that philosophers weirdly and unreasonably prefer intuition to empirical facts and mathematical/logical reasoning; it is that they have reasoned that they can't do without them: that (the whole history of) empiricism and maths as foundations themselves rest on no further foundation except their intuitive appeal. That is the essence of the Inconvenient Ineradicability of Intuition. An unfounded foundation is what philosophers mean by "intuition". Philosophers talk about intuition a lot because that is where arguments and trains of thought ground out; it is a way of cutting to the chase. Most arguers and arguments are able to work out the consequences of basic intuitions correctly, so disagreements are likely to arise from differences in the basic intuitions themselves.

Philosophers therefore appeal to intuitions because they can't see how to avoid them: whatever a line of thought grounds out in is, definitionally, an intuition. It is not a case of using intuitions when there are better alternatives, epistemologically speaking. And the critics of their use of intuitions tend to be people who haven't seen the problem of unfounded foundations because they have never thought deeply enough, not people who have solved the problem of finding sub-foundations for their foundational assumptions.

Scientists are typically taught that the basic principles of maths, logic and empiricism are their foundations, and take that uncritically, without digging deeper. Empiricism is presented as a black box that produces the goods... somehow. Their subculture encourages using basic principles to move forward, not turning back to critically reflect on the validity of those basic principles. That does not mean the foundational principles are not "there". Considering the foundational principles of science is a major part of the philosophy of science, and philosophy of science is a philosophy-like enterprise, not a science-like enterprise, in the sense that it consists of problems that have been open for a long time and that do not have straightforward empirical solutions.

Does the use of empiricism shortcut the need for intuitions, in the sense of unfounded foundations?

For one thing, epistemology in general needs foundational assumptions as much as anything else. Which is to say that epistemology needs epistemology as much as anything else: to judge the validity of one system of epistemology, you need another one. There is no way of judging an epistemology starting from zero, from a complete blank. Since epistemology is inescapable, and since every epistemology has its basic assumptions, there are basic assumptions involved in empiricism.

Empiricism specifically has the problem of needing an ontological foundation. Philosophy illustrates this point with sceptical scenarios about how you are being systematically deceived by an evil genie. Scientific thinkers have closely parallel scenarios in which you cannot be sure that you are not in the Matrix or some other virtual reality. Either way, these hypotheses illustrate the point that empiricists are running on the assumption that if you can see something, it is there.

Comment author: Elo 03 October 2017 07:48:49AM 0 points [-]

I think it matters insofar as it assists your present trajectory. Otherwise it might as well be an unfeeling entity.

Comment author: AFinerGrain 03 October 2017 01:54:37AM 0 points [-]

I always wonder how I should treat my future self if I reject the continuity of self. Should I think of him like a son? A spouse? A stranger? Should I let him get fat? Not get him a degree? Invest in stock for him? Give him another child?

Comment author: AFinerGrain 03 October 2017 12:05:59AM 0 points [-]

People say, "no pun intended" because they don't want to be held responsible for the terrible pain puns cause.

Comment author: AFinerGrain 02 October 2017 11:52:39PM 0 points [-]

I originally learned about these ideas from Thinking, Fast and Slow, but I love hearing them rephrased and repeated again and again. Thinking clearly often means getting in the cognitive habit of questioning every knee-jerk intuition.

On the other hand, coming from a Bryan Caplan / Michael Huemer perspective, aren't we kind of stuck with some set of base intuitions? Intuitions like: I exist, the universe exists, other people exist, effects have causes, I'm not replaced by a new person with memory implants every time I go to sleep...

You might even call these base intuitions "magic", in the sense that you have to have faith in them in order to do anything like rationality.

Comment author: Tripitaka 02 October 2017 10:14:54PM 1 point [-]

The reading experience is rather abysmal. Having just borderless text on white is bad, the font could use some work, and there are no structures for the eye to catch on, etc.

Comment author: gwern 02 October 2017 04:00:43PM 0 points [-]
In response to comment by gwern on Magical Categories
Comment author: gwern 01 October 2017 07:30:26PM *  1 point [-]

Another version is provided by Ed Fredkin via Eliezer Yudkowsky in http://lesswrong.com/lw/7qz/machine_learning_and_unintended_consequences/

At the end of the talk I stood up and made the comment that it was obvious that the picture with the tanks was made on a sunny day while the other picture (of the same field without the tanks) was made on a cloudy day. I suggested that the "neural net" had merely trained itself to recognize the difference between a bright picture and a dim picture.

This is still not a source because it's a recollection 50 years later and so highly unreliable, and even at face value, all Fredkin did was suggest that the NN might have picked up on a lighting difference; this is not proof that it did, much less all the extraneous details of how they had 50 photos in this set and 50 in that and then the Pentagon deployed it and it failed in the field (and what happened to it being set in the 1980s?). Classic urban legend/myth behavior: accreting plausible entertaining details in the retelling.
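(As an aside, here is a minimal sketch, not the original experiment, of the failure mode Fredkin suggests: if every "tank" photo happens to be brighter than every "no tank" photo, even a trivial classifier scores perfectly by keying on overall brightness alone. The data sizes, brightness values, and model below are illustrative assumptions only.)

    # Toy illustration of the "sunny vs. cloudy" shortcut: the classifier
    # succeeds without ever learning anything about tanks.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_photos(n, brightness):
        # 16x16 grayscale "photos": random texture plus a global brightness offset
        return rng.normal(loc=brightness, scale=0.1, size=(n, 16 * 16))

    sunny_with_tanks = make_photos(50, brightness=0.8)   # hypothetical sunny-day set
    cloudy_no_tanks  = make_photos(50, brightness=0.4)   # hypothetical cloudy-day set

    X = np.vstack([sunny_with_tanks, cloudy_no_tanks])
    y = np.array([1] * 50 + [0] * 50)                    # 1 = "tank", 0 = "no tank"

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("training accuracy:", clf.score(X, y))         # ~1.0, from brightness alone

    # The learned weights are broadly positive: the model is effectively summing
    # pixel intensities, i.e. measuring brightness, not detecting tanks.
    print("mean weight:", clf.coef_.mean())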

Comment author: wallowinmaya 01 October 2017 11:24:17AM 2 points [-]

The open beta will end with a vote of users with over a thousand karma on whether we should switch the lesswrong.com URL to point to the new code and database

How will you alert these users? (I'm asking because I have over 1000 karma but I don't know where I should vote.)

In response to comment by Elo on LW 2.0 Open Beta Live
Comment author: John_Maxwell_IV 28 September 2017 03:22:53AM 2 points [-]

If this happens, be sure to mark "not spam" so your email provider (Gmail/Yahoo/etc.) will count that as a point of positive reputation for the lesserwrong.com domain.

(For the team behind lesserwrong, it might be wise to send emails from lesswrong.com for the time being, since lesswrong.com presumably already has a good domain reputation. Feel free to talk to me if you have more questions, I used to work in email marketing.)
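(A small, hedged illustration of one way to inspect a domain's email-authentication records, which feed into the deliverability picture described above; the dnspython library, the helper function, and the exact domains queried are assumptions for the example, and the live records may differ.)

    # Compare the SPF/DMARC TXT records of the two domains mentioned above.
    # Requires the third-party dnspython package.
    import dns.resolver

    def txt_records(name):
        try:
            return [r.to_text() for r in dns.resolver.resolve(name, "TXT")]
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return []

    for domain in ("lesswrong.com", "lesserwrong.com"):
        print(domain, txt_records(domain))                       # SPF lives in TXT records
        print("_dmarc." + domain, txt_records("_dmarc." + domain))  # DMARC policy record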

Comment author: lionhearted 27 September 2017 10:27:44PM 3 points [-]

HUGE kudos and tons of love and respect for everyone behind this. Looks great so far, I'll dig in closer and report anything I find.

Comment author: Vaniver 27 September 2017 09:55:25PM *  2 points [-]

Late May, if I recall correctly. We'll be able to merge accounts if you made it more recently or there was some trouble with the import.

Comment author: trickster 27 September 2017 09:33:53AM 0 points [-]

I think this has to do with bounded rationality. Perfect knowledge would require an endless amount of time, and every human has only a limited lifetime, so the amount of time for each decision is limited even further. Therefore we cannot explore every argument. I think it is a good strategy to throw away some arguments right at the beginning and not waste time on them; instead you can pay more attention to the more plausible ones. This gives you the opportunity to build a relatively accurate model of the world in a relatively short time. If you don't agree, consider this argument: how can you argue against communism if you haven't read all the works of Marx/Engels/Lenin/Mao/Trotsky/Rosa Luxemburg/Bukharin/Zinoviev/Stalin/Kautsky/Saint-Simon... and so on; the list can be endless. I think such arguments are actually used as a demand that you must slavishly agree with the person making them. Clearly this is unacceptable, and we have the right to disagree even if we haven't read all those volumes, using only the limited amount of data we already possess. So if we throw away some idea after a first glance, the stupidity of its followers is not the worst criterion for doing so.

In response to 9/26 is Petrov Day
Comment author: ahiskali 26 September 2017 03:20:30PM 2 points [-]

Stanislav Yevgrafovich died four months ago; here's the NPR article. I wish I had known a little sooner that he lived near me.
