Doesn't this all imply that the set of meaningfully different laws is finite? Also, what if there is a smallest possible level of resolution?
It doesn't, because the reals are infinite in two ways: any given interval is infinite, but the number of intervals is also infinite.
Also, this only applies to changing the constants while keeping the general structure the same; you can create further laws by changing the structure of the laws themselves. There are a lot of degrees of freedom.
Paperclip maximizers are a specific absurdity with probability near zero, and I find that discussing them sucks insight out of the discussion.
Strongly disagree, and in general it's dangerous to dismiss an argument by asserting that it's stupid and that merely discussing it is bad.
This full set [of all possible laws of physics under which sentient life can exist] is infinite,
How can you be so sure of this?
Presumably there is some level of resolution at which changes to the fundamental constants no longer have appreciable effects. Maybe it's the thousandth decimal place; maybe it's the googolth decimal place; but it seems extremely unlikely to me that there isn't such a level. Given this assumption, the set of meaningfully different laws (at least over any bounded range of the constants) would be finite, not infinite.
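The resolution argument can be made concrete with a toy sketch (the function name and the 1e-3 resolution are made up for illustration; the point is just the counting):

```python
def distinguishable_values(lo, hi, res):
    """Count values in [lo, hi] that differ from each other by at least `res`.

    If changes to a constant smaller than `res` have no appreciable effect,
    this is the number of meaningfully different settings of that constant
    within the interval.
    """
    return int(round((hi - lo) / res)) + 1

resolution = 1e-3  # assumed smallest "meaningful" change to a constant

# Any single bounded interval contains only finitely many distinguishable values:
print(distinguishable_values(0.0, 1.0, resolution))   # 1001

# ...but widening (or shifting) the interval keeps adding more, which is the
# other sense in which the reals are infinite:
print(distinguishable_values(0.0, 10.0, resolution))  # 10001
```

So with a resolution floor, finiteness turns entirely on whether the constants are confined to bounded ranges; the per-interval count is always finite.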
Is there, or will there be, an RSS feed for this? I didn't see one anywhere.
Minor nit, but I don't think anyone has ever died from being struck by thunder.
A question that has been asked before, and so may be stupid: What concrete examples are there of gains from CfAR training (or self-study based on LessWrong)? These would have to come in the form of very specific examples, preferably quantitative.
E.g. "I was $100,000 in debt and unemployed for 2 years, and now I have employment earning twice what I ever have before and am out of debt."
"I never had a relationship that lasted more than 2 months, but now am happily married."
"My grade point average went up from 2.2 to 3.8."
"After struggling to diet and exercise for years, I finally got on track and am now in the best shape of my life."
etc.
Within 3 months of attending a CFAR workshop I had left my job for one that I preferred and that paid 15% more. Within 6 months I had started exercising daily (previously once every few weeks), waking up consistently at 6am (previously varied anywhere from 8-10am), and eating significantly healthier (I also started eating vegetarian meals 2-3 times a week, previously 0). Independent of any of these concrete behaviors, I now have a very strong belief that I can intentionally construct/change my life and behavior in ways that will actually work.
Post hoc ergo propter hoc, etc. I have a fairly strong belief that attending a CFAR workshop and interacting with the alumni community has been at least partially causal in me improving my life, but I don't think any of what I've said constitutes good evidence of that claim.
Well, it's a bit worrying if the "main cluster" of the LessWrong/rationalist/MIRI nebula, i.e. the Bay Area rationalists, is propagating crackpotty ideas, or is even just as susceptible to them as the general Bay Area population. I don't know if that's actually the case, though. Maybe it's more of a problem for the Effective Altruism movement (i.e. it attracts both rationalists and crackpots who share their ideas, but there's no overlap between the two groups).
There have been numerous critiques of Connection Theory already, and I encounter people disavowing it far more frequently than people endorsing it, in both the rationalist and EA communities. So I don't think we have anything to worry about in that direction. I'm more worried by the zeal with which people criticize it, given that Leverage rarely seems to mention it, all of the online material about it is quite dated, and many of the people whose criticism of it I question seem to know hardly anything about it.
To be extra clear: I'm not a proponent of CT; I'm very skeptical of it. It's just distressing to me how quick the LW community is to politicize the issue.
Would you consider a post criticizing homeopathy in similar language a "political attack"?
I would. I would think that such a post was quite silly, in the context of being posted to LessWrong. I would hope that, if there were any question about the subject, someone would simply post a list of evidence or neutral well-reasoned arguments. Homeopathy is easily enough dispatched with "there is no explanation for this that conforms to our understanding of physical laws, and there is no evidence for it, therefore it is very unlikely to be true." Bringing political speech and in-group/out-group dynamics into it is detrimental to the conversation.
I have to say that I am extremely disappointed in the response to this post. I have no stake on either side of the issue, but if this is the best we can do then I can't tell that the Sequences have had any effect at all.
This post is a political attack that doesn't add anything new or substantive to a discussion of Leverage or of Connection Theory. Much of the language used in the post is used primarily for negative connotational value, which is characteristic of political speech. Politics is hard mode. A discussion about Leverage or Connection Theory may be valuable, but this post is not a good way to start one.
This is a different but related project by someone loosely affiliated with the Bay Area community, who is definitely worth talking to if you want to go further with this.