
Comment author: gjm 18 November 2015 12:06:58PM 6 points [-]

I don't spend enough of my time reading the results of studies for you to necessarily pay much attention to what I think. But: you want to know how much information it gives you that the study found (say) a trend with p=0.1, given that the authors may have been looking for such a trend, may (deliberately or not) have been data-mining/p-hacking, and that publication filters out most studies that don't find interesting results.

So here's a crude heuristic:

  • There's assorted evidence suggesting that (in softish-science fields like psychology) somewhere on the order of half of published results hold up on closer inspection. Maybe it's really 25%, maybe 75%, but that's the order of magnitude.
  • How likely is a typical study result ahead of time? Maybe a prior probability of 1/4 is typical.
  • In that case, getting a result significant at p=0.05 should be giving you about 4.3 bits of evidence (log2 of 20), but is actually giving you more like 1 bit.
  • So just discount every result you see in such a study by 3 bits or so. Crudely, multiply all the p-values by 10.
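In code, the heuristic works out like this. This is just a sketch using the illustrative numbers above (a 1/4 prior, roughly half of published results holding up), not a rigorous calibration:

```python
import math

# A p = 0.05 result naively carries log2(1/0.05) ~= 4.3 bits of surprise.
naive_bits = math.log2(1 / 0.05)

# But if ~1/4 of such claims are true a priori and ~1/2 of published
# significant results hold up, the likelihood ratio of "published
# significant result" is (0.5/0.5) / (0.25/0.75) = 3, i.e. ~1.6 bits.
prior_odds = 0.25 / 0.75
posterior_odds = 0.5 / 0.5
actual_bits = math.log2(posterior_odds / prior_odds)

# The gap is the discount: roughly 3 bits, i.e. crudely
# "multiply the reported p-value by about 10".
discount = naive_bits - actual_bits
```

With these numbers the discount comes out around 2.7 bits; "3 bits or so" is the right crude order of magnitude.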

You might (might!) want to apply a bit less discounting in cases where the result doesn't seem like one the researchers would have been expecting or wanting, and/or doesn't substantially enhance the publishability of the paper, because such results are less likely to be produced by the usual biases. E.g., if that p=0.1 trend is an incidental thing they happen to have found while looking at something else, you maybe don't need to treat it as zero evidence.

This is likely to leave you with lots of little updates. How do you handle that given your limited human brain? What I do is to file them away as "there's some reason to suspect that X might be true" and otherwise ignore it until other evidence comes along. At some point there may be enough evidence that it's worth looking properly, so then go back and find the individual bits of evidence and make an explicit attempt to combine them. Until then, you don't have enough evidence to affect your behaviour much so you should try to ignore it. (In practice it will probably have some influence, and that's probably OK. Unless it's making you treat other people badly, in which case I suggest that the benefits of niceness probably outweigh those of correctness until the evidence gets really quite strong.)

Comment author: TylerJay 18 November 2015 06:30:26PM 2 points [-]

Thank you! This is exactly what I was looking for. Thinking in terms of bits of information is still not quite intuitive to me, but it seems the right way to go. I've been away from LW for quite a while and I forgot how nice it is to get answers like this to questions.

Comment author: TylerJay 18 November 2015 02:29:27AM *  3 points [-]

I'm curious about how others here process study results, specifically in psychology and the social sciences.

The (p < 0.05) threshold for statistical significance is, of course, completely arbitrary. So when I get to the end of a paper and the result that came in at, for example, (p < 0.1) is described as "a non-significant trend favoring A over B," part of me wants to just go ahead and update a little bit, treating it as weak evidence, but I obviously don't want to do even that if there isn't a real effect and the evidence is unreliable.

I've found that study authors are often inconsistent with this—they'll "follow the rules" and report no "main effect" detected when walking you through the results, but turn around and argue for the presence of a real effect in the discussion/analysis based on non-individually-significant trends in the data.

The question of how to update is further compounded by (1) the general irreproducibility of these kinds of studies, which may indicate the need to apply some kind of global discount factor to the weight of any such study, and (2) the general difficulty of properly making micro-adjustments to belief models as a human.

This is exactly the situation where heuristics are useful, but I don't have a good one. What heuristics do you all use for interpreting results of studies in the social sciences? Do you have a cutoff p-value (or a method of generating one for a situation) above which you just ignore a result outright? Do you have some other way of updating your beliefs about the subject matter? If so, what is it?

Comment author: James_Miller 18 July 2015 09:54:41PM 6 points [-]

The iterated "but how do you know that?" also works as an FGCA, or gets your opponent to admit he is relying on an unjustified assumption.

Comment author: TylerJay 19 July 2015 10:46:32AM *  3 points [-]

This is a good one. More generally, it's sometimes called the "Why" Regress. Not just about how you know something, but about how something happened or came to be. It applies equally to science and religion.

Edit: "...know you know" => "...how you know"

Comment author: TylerJay 19 July 2015 10:43:49AM *  4 points [-]

I like this idea, and I'd like to see it developed further. I don't see any reason why FGCAs shouldn't be catalogued and learned alongside logical fallacies for the same reasons.

I guess the important distinction would be that certain FGCAs can be used non-fallaciously, and some of these seem to have valid use-cases, like pointing out confirmation bias and mind-projection fallacy. Others are fallacious in their fully-general form, but have valid uses in their non-fully-general forms, so it is important to distinguish these. (e.g. pointing out vagueness or that something is too complicated and has too many dependencies for a given argument to have much weight.)

Great post!


I apologize for mentioning this, but there were a lot of typos in this, which made it a bit hard to read. I want to link this to a few friends who are not LWers, but when I am not familiar with the source of something, typos make me question the credibility of the author (they also provide an easy excuse to discount things people don't want to hear). I don't want that to happen when I show people, so I figured I'd help you out if you feel like cleaning it up a bit. Here's a quick list I put together for you:

  • Add comma after "But if (s)he is not aware of that"
  • Change "prone of" to "prone to"
  • "counter measures" should be "countermeasures"
  • "against which" should be "against whom" in "...clever arguer against which"
  • Add comma after "humble stance"
  • Change "FCGA" to "FGCA" in first bullet of The List and in first sentence under headings of both Self-Sealing Belief and Preventative Action
  • The third bullet is empty, and the fourth bullet seems like it is supposed to contain sub-cases of the missing third bullet
  • Under Nihilism, "Live" should be "Life" and the "-" needs to be closed after "including arguments"
  • "I don't like your opinion but I you are may have your own."
  • "Yur" => "Your" after "Humans are different"
  • In "the thing you are arguing about has evolved and exists just because of that, not because it is true or a valid argument." Just because of what? Evolution?
  • Opened paren but no contents or close-paren: "...more likely and more stable ("
Comment author: [deleted] 27 May 2015 11:02:27AM 1 point [-]

Old topic, but what is your stance on IF and coffee? Some people need coffee for their morning bowel movement. Ideally it would be black coffee but that is really bitter. Berkhan and Pilon agree that artificial sweeteners are probably okay, although I suspect the sweet taste in itself may create an insulin response. Berkhan wrote somewhere one teaspoon of milk is okay as it should not yet launch an insulin response. What do you think?

In response to comment by [deleted] on [question] Recommendations for fasting
Comment author: TylerJay 29 May 2015 02:12:34AM *  0 points [-]

I used to drink coffee every day, but I don't anymore. I just drink green tea in the mornings if I want something hot. I definitely don't think it's worth risking the benefits of your fast by using sugar or milk in your coffee. If I recall correctly, Berkhan's assertion that half a teaspoon (or whatever it was) of milk wouldn't cause a problem wasn't really supported by any science, so I would avoid it if possible. I think his reasoning was that your body would metabolize it super quickly and then return to a fasted state, but it's not clear if you'll retain the benefits of a 16-hour fast that way. I suspect it would also increase food cravings during the rest of your fast. And the increase in taste of your coffee is such a minor benefit that it's just totally not worth the risk as far as I'm concerned. There are better ways of making black coffee taste good if it's that important to you (see below).

I agree that the sweet taste of artificial sweeteners probably does something counterproductive. Overweight soda-drinkers who switch to diet soda have been shown not to lose weight. That's proof enough for me that they work some sort of mischief on your metabolism. All in all, it's probably not going to really cause a noticeable difference, but I feel like they're worth avoiding for general health reasons anyway. Starting a daily artificial sweetener habit as a part of trying to get healthier with an IF protocol seems counterproductive to me. I'd avoid them.

As I mentioned above, it's totally possible to make great-tasting black coffee. If you want to make your coffee less bitter, you might want to invest in an Aeropress. Bitter coffee is usually a result of the water being too hot and/or in contact with the coffee grounds for too long. Both cause too much tannic acid to leach into the coffee. The Aeropress solves that problem. Also, switch to a medium-dark or dark roast. That will let you get all the darkness, flavor, and caffeine you want without having to use water that is so hot, and without having it in contact with the grounds for too long. Doing these two things will make a world of difference. You probably won't need sweetener if you do that. (I drink my coffee black whenever I have it and it's delicious.)

The last thing you could look into, if you really just can't stand black coffee, is buying caffeine pills and using those in place of coffee. I've used caffeine pills a lot and they're actually really convenient. Caffeine is the main component of coffee that stimulates your bowels, so it should serve that purpose as well as provide the normal energy boost.

Comment author: sixes_and_sevens 21 May 2015 12:42:24PM 3 points [-]

Who should I talk to in a group? I have a bunch of existing "social senses" for navigating this, but they're not very reliable. If a clear You-Should-Talk-To-This-Person sense went off whenever I encountered someone appropriate, that would be nice.

Comment author: TylerJay 24 May 2015 01:54:25AM 0 points [-]

What would you imagine the criteria would be?

Comment author: TylerJay 24 May 2015 01:39:29AM 2 points [-]

I had never heard of any of these except people putting magnets in their fingertips. Thanks for the post!

Minor typo I noticed:

"...and it is unique in that it is not implanted but instead." (instead what?)

Comment author: [deleted] 19 May 2015 07:21:23AM 4 points [-]

If you relax the rules, I'll bring up David Attenborough. (I haven't read any of his books other than Life on Earth and The Living Planet, both on the same subject (a zoomed-out view of the plant and animal life on the whole planet), but they absolutely dominated this type of market in the 1980s; I don't think there were even many alternatives published. Later on my interests turned away from zoology and botany.)

In response to comment by [deleted] on The Best Popular Books on Every Subject
Comment author: TylerJay 19 May 2015 11:09:04PM 3 points [-]

I'll second the recommendation to relax this rule. I think the ability to gauge the quality of a popular book is a lot more cross-domain than with textbooks. I've read good books and I've read bad books. I can tell pretty quickly if a book is bad, even if I'm relatively new to the subject area.

Also, I feel like a lot of people would tend to only read one or two pop books in a particular area. Any more knowledge beyond that often comes from the internet or a textbook or elsewhere. I mean, I can count on one hand the number of specific subjects about which I've read more than two actual published books, but I've spent hundreds of hours each reading about many more subjects than that.

And since pop books aren't typically comprehensive accounts of an entire field or subject, the most important things really are clarity, engagingness, and worth, and not necessarily completeness. If what is there is valuable, accurate, and it's presented well, then it's Good, even if it doesn't cover some things that are covered by other books.

Comment author: jacob_cannell 05 May 2015 07:40:38PM 1 point [-]

The info on wikipedia is ok. This MIRI interview with Mike Frank provides a good high level overview. Frank's various publications go into more details. "Physical Limits of Computing" by M Frank in particular is pretty good.

There have been a few discussions here on LW about some of the implications of reversible computing for the far future. Not all algorithms can take advantage of reversibility, but it looks like reversible simulations in general are feasible if they unwind time, and in particular monte carlo simulation algorithms could recycle entropy bits without unwinding time.

Comment author: TylerJay 06 May 2015 02:29:28AM 0 points [-]

Thanks, I'll check it out.

Comment author: jacob_cannell 30 April 2015 09:53:32PM *  10 points [-]

I've done some rather extensive investigations into the physical limits of computation and the future of Moore's Law style progress. Here's the general lowdown/predictions:

Moore's law for conventional computers is just running into some key new asymptotic limits. The big constraint is energy, which is now entirely dominated by interconnect (and to a lesser degree, passive leakage). For example, on a modern GPU it costs only about 10pJ for a flop, but it costs 30pJ just to read a float from a register, and it goes up orders of magnitude to read a float from local cache, remote cache, off-chip RAM, etc. The second constraint is the economics of shrinkage. We may already be hitting a wall around 20nm to 28nm. We can continue to make transistors smaller, but the cost per transistor is not going down much (this affects logic transistors more than memory).

3D is the next big thing that can reduce interconnect distances, and using that plus optics for longer distances we can probably squeeze out another 10x to 30x improvement in ops/J. Nvidia and Intel are both going to use 3D RAM and optics in their next HPC parts. At that point we are getting close to the brain in terms of a limit of around 10^12 flops/J, which is a sort of natural limit for conventional computing. Low precision ops don't actually help much unless we are willing to run at much lower clockrates, because the energy cost comes from moving data (lower clock rates reduce latency pressure, which reduces register/interconnect pressure). Alternate materials (graphene etc.) are a red herring and not anywhere near as important as the interconnect issue, which is completely dominant at this point.
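As a rough sanity check on those numbers (a sketch only, using the per-op energy figures quoted above):

```python
# Energy per operation on a modern GPU: compute is cheap, data movement isn't.
flop_pj = 10           # ~10 pJ for the flop itself
register_read_pj = 30  # ~30 pJ just to read one float from a register

# Efficiency implied by ~10 pJ/flop of total energy cost:
flops_per_joule_now = 1 / (flop_pj * 1e-12)    # ~1e11 flops/J

# A 10x improvement from 3D stacking + optics lands right at the
# ~1e12 flops/J "natural limit" for conventional (irreversible) hardware.
flops_per_joule_3d = flops_per_joule_now * 10  # ~1e12 flops/J
```

Note that since a register read already costs 3x the flop itself, and cache/DRAM reads cost orders of magnitude more, the achievable flops/J in practice depends almost entirely on data movement, which is the point about interconnect dominating.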

The next big improvement would be transitioning to a superconducting circuit basis which in theory allows for moving bits across the interconnect fabric for zero energy cost. That appears to be decades away, and it would probably only make sense for cloud/supercomputer deployment where large scale cryocooling is feasible. That could get us up to 10^14 flops/J, and up to 10^18 ops/J for low precision analog ops. This tech could beat the brain in terms of energy efficiency by a factor of about 100x to 1000x or so. At that point you are at the Landauer limit.

The next steps past that will probably involve reversible computing and quantum computing. Reversible computing can reduce the energy of some types of operations arbitrarily close to zero. Quantum computing can allow for huge speedups for some specific algorithms and computations. Both of these techs appear to also require cryocooling (as reversible computing without a superconducting interconnect just doesn't make much sense, and QC coherence works best near absolute zero). It is difficult to translate those concepts into a hard speedup figure, but it could eventually be very large - on the order of 10^6 or more.

For information storage density, DNA is close to the molecular packing limit of ~1 bit/nm^3. A typical hard drive has a volume of around 30 cm^3, so using DNA-level tech would result in roughly 10^21 bytes for an ultimate hard drive, so say 10^20 bytes to leave room for the non-storage elements.
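The back-of-envelope arithmetic behind that figure:

```python
# "Ultimate hard drive" estimate from the DNA packing density above.
bits_per_nm3 = 1          # DNA stores roughly 1 bit per cubic nanometer
drive_volume_cm3 = 30     # typical 3.5" hard drive volume
nm3_per_cm3 = (1e7) ** 3  # 1 cm = 1e7 nm, so 1 cm^3 = 1e21 nm^3

total_bits = bits_per_nm3 * drive_volume_cm3 * nm3_per_cm3  # 3e22 bits
total_bytes = total_bits / 8  # ~3.75e21 bytes, i.e. order 10^21
```

That gives roughly 10^21 bytes of raw capacity, consistent with backing off to ~10^20 bytes once the non-storage machinery is accounted for.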

Comment author: TylerJay 04 May 2015 11:11:46PM 0 points [-]

Very informative. Thanks. I've heard reversible computing mentioned a few times, but have never looked into it. Any recommendations for a quick primer, or is wikipedia going to be good enough?
