The example is a bit unfortunate - in this hypothetical world, smoking doesn't cause cancer at all (at least in the first example).
It refers back to the tobacco companies' lie that the correlation between smoking and cancer could be explained by a third factor (i.e., a gene).
I agree with both you and Kelly most of the time, with you more often than with him. I did think this part required a nitpick:
If I became superintelligent tomorrow, I probably wouldn't significantly change the world. Not on a Singularity scale, not right away, and not just because I could. Would you? My point is that you can't assume that just because the first superintelligence *could* construct nanobots and take over the world, it therefore *will*.