Plasma Ballin'

Comments
I was going to comment this as well. It probably is the case that the waste-efficiency and safety of nuclear reactors are positively correlated in the real world, for exactly that reason. Of course, reasoning to this point by, "Reactor A produces less waste than Reactor B. Therefore, Reactor A is better than Reactor B. Therefore, Reactor A is less likely to melt down than Reactor B," is invalid, so the main point of EY's post still stands. The correct reasoning is more like, "Technology improves and reactor design is refined over time. This happens fast enough that reactors built later are likely to be better than earlier ones on all fronts. If Reactor A is more waste-efficient than Reactor B, it was probably built later and is therefore also likely to be safer and more cost-effective." Unlike the naive "A is better than B" model, this one no longer predicts that A will be safer than B if I learn the additional fact that A and B were built in the same year. In that case, I predict the opposite, based on the trade-offs that probably had to be made.
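A toy simulation makes the sign flip concrete. This is purely my own sketch, not anything from the post: the model (a shared "technology level" that rises with build year, plus a fixed within-year design trade-off) and all the numbers are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Assumed model: each reactor has a build year, and overall
# technology level improves with the year.
year = rng.integers(1960, 2020, size=n)
tech = (year - 1960) / 60.0  # normalized technology level

# Within a given year, a fixed "design budget" forces a trade-off:
# whatever is spent on waste-efficiency is unavailable for safety.
budget_split = rng.uniform(0, 1, size=n)
waste_efficiency = tech + 0.3 * budget_split + rng.normal(0, 0.05, n)
safety = tech + 0.3 * (1 - budget_split) + rng.normal(0, 0.05, n)

# Marginally, both metrics track technology, so they correlate positively.
print("overall corr:", np.corrcoef(waste_efficiency, safety)[0, 1])

# Conditional on build year, only the trade-off remains, so the
# correlation flips negative.
same_year = year == 2000
print("same-year corr:",
      np.corrcoef(waste_efficiency[same_year], safety[same_year])[0, 1])
```

Under these assumptions, the marginal correlation between waste-efficiency and safety is positive, but conditioning on the build year removes the common cause and leaves only the trade-off, flipping the sign.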

I don't think it changes the conclusion by much, but isn't, "The q, which doesn't appear in that paper, is a constant, so has M_q = 1," incorrect? All of the values we're looking at are constants, but we don't know what the constants are, so we have nonzero variance, leading to M_q > 1. We certainly don't know the exact value of q, so it shouldn't have M_q = 1.
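To spell the objection out (this is my own sketch, on the assumption that M_q denotes the mean-square-to-squared-mean ratio of our subjective distribution over q):

$$M_q \;=\; \frac{\mathbb{E}[q^2]}{\mathbb{E}[q]^2} \;=\; 1 + \frac{\operatorname{Var}(q)}{\mathbb{E}[q]^2},$$

so M_q = 1 exactly when Var(q) = 0, i.e., when our distribution over q is a point mass. Under that assumed definition, q being a constant in reality isn't enough; we would also have to know its value.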

I don't know of any real-life analogue, though I would bet that some exist. I recall a fictional example in Adventures of Huckleberry Finn, though: Huck believes that helping Jim escape slavery would be stealing, since that is what he was taught growing up in the antebellum South, and he concludes that he will go to Hell for doing it. But he decides to help Jim anyway, even if it's the "wrong" thing to do.

Seems relevant in the wake of the FTX scandal. I've seen people blaming effective altruism for the scandal, as if FTX's fraudulent practices prove that the philosophy of giving to the charities that demonstrably do the most good is flawed. Even if the entire EA movement were cult-like and misguided, that wouldn't mean that the principle it's based on is wrong. I think the modern EA movement is misguided to some extent, but only because it has misjudged which causes are the most effective, and this shouldn't stop anyone else from donating to the causes they believe are more effective.

I actually think it is possible for someone's beliefs to anti-correlate with reality without their being smart enough to figure out what is true and then reverse it. I can think of at least three ways this could happen, beyond extremely unlikely coincidences. The first is that a person could be systematically deceived by someone else until they have more false beliefs than true ones. The second is that systematic cognitive biases could reliably distort their beliefs. The third is the most interesting: if someone has a belief that many of their other beliefs depend on, and that belief is wrong, all of those other beliefs are likely to be wrong as well. Plenty of people base a large portion of their beliefs on a single belief or cluster of beliefs, the most obvious example being the devoutly religious, especially if they belong to a cult or fundamentalist group. Basically, since beliefs are not independent, people can have large sets of connected beliefs that stand or fall together.

Of course, this still wouldn't affect the probability that any of their beliefs outside those clusters are true, so it doesn't change the conclusion of this essay by much, but I think it is interesting nonetheless. At the very least, it is a warning against having too many beliefs that all depend on a single idea.
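Here is a quick toy model of that third mechanism (my own sketch, not from the essay; all the probabilities are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(1)
trials, n_beliefs = 100_000, 20

# Independent case: each belief is true with probability 0.7 on its own.
independent = rng.random((trials, n_beliefs)) < 0.7

# Dependent case: 20 beliefs all rest on one foundational belief that is
# true with probability 0.7. If the foundation is false, each dependent
# belief is only true by accident (probability 0.1).
foundation = rng.random(trials) < 0.7
accidental = rng.random((trials, n_beliefs)) < 0.1
dependent = np.where(foundation[:, None], True, accidental)

# Both cases have similar average accuracy...
print("mean accuracy:", independent.mean(), dependent.mean())

# ...but the dependent beliefs stand or fall together: getting nearly
# everything wrong is negligible in the independent case and common
# in the dependent one.
print("P(<= 4 of 20 true), independent:", (independent.sum(1) <= 4).mean())
print("P(<= 4 of 20 true), dependent:  ", (dependent.sum(1) <= 4).mean())
```

Average accuracy looks similar in both cases, but the dependent cluster's outcomes are bimodal: conditional on the foundational belief being false, nearly everything downstream fails at once.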