Just want to echo: thanks for doing this. This is awesome.
Your post got me thinking about some stuff I've been dealing with, and I think helped me make some progress on it almost instantly. I don't think the mechanisms are quite the same, but thinking about your experience induced me to have useful realizations about myself. I'll share in case it's useful to someone else:
It sounds like your self-concept issue was rooted in "having a negative model of yourself deeply ingrained", which induced neuroses in your thoughts/behaviors that attempted to compensate for it / search around for ways to convince your...
Yeah, that's the exact same conclusion I'm pushing here. That, and "you should feel equipped to come to this conclusion even if you're not an expert." I know several people, and have seen more online (including in this comment section), who seem okay with "yeah, it's negative one twelfth, isn't that crazy?" and I think that's really not okay.
My friend who's in a physics grad program promised me that it does eventually show up in QFT, and apparently also in nonlinear dynamics. Good enough for me, for now.
The assumed opinions I'm talking about are not the substance of your argument; they're things like "I think that most of these reactions are not only stupid, but they also show that American liberals inhabit a parallel universe", and what is implied in the use of phrases like 'completely hysterical', 'ridiculous', 'nonsensical', 'proposterous', 'deranged', 'which any moron could have done', 'basically a religion', 'disconnected from reality', 'save the pillar of their faith', etc. You're clearly not interested in discussion of your condemnation ...
It's true that politics is generally discouraged around here. But, also -- I'm the person who commented negatively on your post, and I want to point out that it wasn't going to be well-received even if politics were okay here. You wrote in a style that assumed a lot of opinions were held by your readers, without justification, and that tends to alienate anyone who disagrees with you. Moreover, you write about those opinions as if they are not just true but obviously true, which tends to additionally infuriate anyone who disagrees with you. So I think your p...
With an opening like
The idea that liberal elites are disconnected from reality has been a major theme of post-election reflections. Nowhere is this more obvious than in academia, where Trump’s victory resulted in completely hysterical reactions.
It's clear that this is written for people who already believe these things. The rest, unsurprisingly, confirms that. I thought LW tried to avoid politics? And, especially, pointless politically-motivated negativity. "liberal-bashing" isn't very interesting, and I don't think there's a point in linki...
It doesn't count in the discussions of coloring graphs, such as in the four color map theorem, and that's the kind of math this is most similar to. So you really need to specify.
Are you just wondering what 'pushing' means in this context? Or speculating about the existence of anti-gravity?
I'm pretty sure that this is just interpreting a region of low density as 'pushing' because it 'pulls less' than a region of average density would.
This is similar to how electron 'holes' in a metal's atomic lattice can be treated as positive particles.
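A toy Newtonian sketch of that 'pulls less' picture (the numbers here are arbitrary placeholders, not physical values): by superposition, a uniform background with a spherical void is the same as the uniform background plus a sphere of *negative* density, and the field of that negative sphere points away from the void.

```python
import math

# Superposition: uniform density with a spherical void
#   = uniform density + a sphere of *negative* density at the void's location.
# The negative sphere's "gravity" points away from it, so the void seems to push.
G = 6.674e-11        # gravitational constant
rho = 1.0            # background density (arbitrary units)
R = 1.0              # void radius (arbitrary units)
d = 5.0              # test particle's distance from the void's centre
M_void = -(4 / 3) * math.pi * R ** 3 * rho   # effective (negative) mass of the void
g_toward_void = G * M_void / d ** 2          # field component toward the void
print(g_toward_void)  # negative, i.e. the acceleration points away from the void
```

The sign is the whole point: nothing actually pushes, but the bookkeeping works exactly as if something did, just like treating holes as positive carriers.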
Don't you think there's some value of doing a more controlled study of it?
No, because it's not a possibility that when you thought you were doing math in the reals this whole time, you were actually doing math in the surreals. Using a system other than the normal one would need to be stated explicitly.
You had written
"I really want a group of people that I can trust to be truth seeking and also truth saying. LW had an emphasis for that and rationalists seem to be slipping away from it with "rationality is about winning"."
And I'm saying that LW is about rationality, and rationality is how you optimally do things, and truth-seeking is a side effect. And the truth-seeking stuff in the rationality community that you like is because "a community about rationality" is naturally compelled to participate in truth-seeking, because it...
Interleaving isn't really the right way of getting consistent results for summations. Formal methods like Cesaro summation are the better way of doing things, and give the result 1/2 for that series. There's a pretty good overview in this wiki article about summing 1-2+3-4....
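For what it's worth, here's a minimal sketch of what Cesaro summation does, using Grandi's series 1-1+1-1+... as the simplest example: instead of interleaving terms, you average the partial sums, and those averages converge even though the partial sums don't.

```python
from fractions import Fraction

def cesaro_means(terms):
    """Return the running averages of the partial sums of `terms`."""
    partial, total, means = Fraction(0), Fraction(0), []
    for n, t in enumerate(terms, start=1):
        partial += t           # n-th partial sum
        total += partial       # sum of the first n partial sums
        means.append(total / n)
    return means

# Grandi's series 1 - 1 + 1 - 1 + ...: the partial sums oscillate (1, 0, 1, 0, ...),
# but their averages converge to 1/2, the Cesaro sum.
grandi = [(-1) ** k for k in range(1000)]
means = cesaro_means(grandi)
print(means[-1])  # -> 1/2 exactly (for an even number of terms)
```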
I know about Cesaro and Abel summation and vaguely understand analytic continuation and regularization techniques for deriving results from divergent series. And... I strongly disagree with that last sentence. As explained in this post, I think statements like "1+2+3+...=-1/12" are criminally deceptive.
Valid statements that eliminate the confusion are things like "1+2+3...=-1/12+O(infinity)", or "analytic_continuation(1+2+3+)=-1/12", or "1#2#3=-1/12", where # is a different operation that implies "addit...
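To make the analytic_continuation(...) reading concrete, here's one numerical route (a sketch, not the only formalism): Abel-sum the alternating series 1-2+3-4+... to 1/4 by evaluating it just below x=1, then use the identity zeta(s) = eta(s)/(1 - 2^(1-s)) relating the alternating (eta) and plain (zeta) series to recover -1/12.

```python
# Abel summation: evaluate sum of (-1)^(n-1) * n * x^n for x slightly below 1.
def abel_sum_alternating(x, terms=30000):
    return sum((-1) ** (n - 1) * n * x ** n for n in range(1, terms + 1))

x = 0.999
eta_at_minus_1 = abel_sum_alternating(x)   # approx 1/4, the Abel sum of 1-2+3-4+...
# Dirichlet eta and Riemann zeta are related by zeta(s) = eta(s) / (1 - 2^(1-s)),
# so zeta(-1) = (1/4) / (1 - 2^2) = -1/12.
zeta_at_minus_1 = eta_at_minus_1 / (1 - 2 ** 2)
print(zeta_at_minus_1)  # approx -0.0833 (= -1/12)
```

Note that at no point does a literal sum of positive integers equal a negative number; every step is a statement about limits of a *different*, convergent expression, which is exactly the qualification the bare "=" hides.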
Well. We should probably distinguish between what rationality is about and what LW/rationalist communities are about. Rationality-the-mental-art is, I think, about "making optimal plays" at whatever you're doing, which leads to winning (I prefer the former because it avoids the problem where you might only win probabilistically, which may mean you never actually win). But the community is definitely not based around "we're each trying to win on our own and maximize our own utility functions" or anything like that. The community is inter...
Interesting, I've never looked closely at these infinitely-long numbers before.
In the first example, it looks like you've described the infinite series 9(1+10+10^2+10^3+...), which, if you ignore radii of convergence, is 9*1/(1-x) evaluated at x=10, giving 9/(-9)=-1. I assume without checking that this is what Cesaro or Abel summation of that series would give (which is the technical way to get to 1+2+3+4...=-1/12, though I still reject that that's a fair use of the symbols '+' and '=' without qualification).
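The same conclusion also falls out of plain modular arithmetic, which is the cleaner way to see the claim: a 10-adic number is pinned down by its residues mod 10^k for every k, and a string of k nines plus 1 is 0 mod 10^k, so ...999 acts like -1.

```python
# A 10-adic number is determined by its value mod 10^k for every k.
# The number written as k nines is 10^k - 1; adding 1 gives 0 mod 10^k,
# which is why ...999 (nines forever) behaves like -1.
for k in (1, 5, 20):
    all_nines = 10 ** k - 1
    print(k, (all_nines + 1) % 10 ** k)  # -> 0 every time
```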
Re the second part: interesting. Nothing is immediately coming to mind.
Fixed the typo. Also changed the argument there entirely: I think that the easy reason to assume we're talking about real numbers instead of rationals is just that that's the default when doing math, not because 0.999... looks like a real number due to the decimal representation. Skips the problem entirely.
Well - I'm still getting the impression that you're misunderstanding the point of the virtues, so I'm not sure I agree that we're talking past each other. The virtues, as I read them, are describing characteristics of rational thought. It is not required that rational thinkers appear to behave rationally to others, or act according to the virtues, at all. Lying very well may be a good, or the best, play in a social situation.
Appearing rational may be a good play. Demonstrating rationality can cause people to trust you and your ability to make good decision...
Ah, of course, my mistake. I was trying to hand-wave an argument that we should be looking at reals instead of rationals (which isn't inherently true once you already know that 0.999...=1, but seems like it should be before you've determined that). I foolishly didn't think twice about what I had written to see if it made sense.
I still think it's true that "0.999..." compels you to look at the definition of real numbers, not rationals. Just need to figure out a plausible sounding justification for that.
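Either way, the partial-sum computation itself works in exact rational arithmetic, so it doesn't by itself force you into the reals; a minimal sketch:

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ...: after k digits the sum is 1 - 10^(-k),
# all exact rationals, and the gap to 1 shrinks below any positive bound.
partial = Fraction(0)
for k in range(1, 8):
    partial += Fraction(9, 10 ** k)
    print(k, partial, 1 - partial)   # gap is exactly 1/10^k
```

What the reals buy you is the *definition* of "0.999..." as the limit of that sequence, which is the part the rationals-vs-reals question is really about.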
This reminds me of an effect I've noticed a few times:
I observe that in debates, having two (or more) arguments for your case is usually less effective than having one.
For example, if you're trying to convince someone (for some reason) that "yes, global warming is real", you might have two arguments that seem good to you:
But if you actually cite both of these arguments, you start to sound weaker than if you picked one and stu...
Thanks! Validation really, really helps with making more. I hope to, though I'm not sure I can churn them out that quickly since I have to wait for an idea to come along.
That's a good approach for things where there's a 'real answer' out there somewhere. I think it's often the case that there's no good answer. There might be a group of people saying they found a solution, and since there are no other solutions, they think you should fully buy into theirs and accept whatever nonsense comes packaged with it (for instance, consider how you'd approach the 1+2+3+4+5...=-1/12 proof if you were doing math before calculus existed). I think it's very important to reject seemingly good answers on their own merits even if there isn't a better answer around. Indeed, this is one of the processes that can lead to finding a better answer.
Well, Numberphile says they appear all over physics. That's not actually true. They appear in like two places in physics, both deep inside QFT, mentioned here.
QFT uses a concept called renormalization to drop infinities all over the place, but it's quite sketchy and will probably not appear in whatever final form physics takes when humanity figures it all out. It's advanced stuff and not, imo, worth trying to understand as a layperson (unless you already know quantum mechanics in which case knock yourself out).
If it helps -- I don't understand what the second half (from the part about Youtube videos onwards) has to do with fighting or optimizing styles.
I also didn't glean what an 'optimizing style' is, so I think the point is lost on me.
Regardless of your laundry list of reasons not to edit your post, if you believe "I'm confused about what you wrote" comments to be legitimate criticisms, you should read them as a sign that your personal filter on your own writing is not catching certain problems, and you might benefit greatly from taking it as an o...
tbh I haven't figured out how to use Arbital yet. I think it's lacking in the UX department. I wish the front page discriminated by categories or something, because I find myself not caring about anything I'm reading.
I think you've subtly misinterpreted each of the virtues (not that I think the twelve-virtue list is special; they're just twelve good aspects of rational thought).
The virtues apply to your mental process for parsing and making predictions about the world. They don't exactly match the real-world usages of these terms.
Consider these in the context of winning a game. Let's talk about a real-world game with social elements, to make it harder, rather than something like chess. How about "Suppose you're a small business owner. How do you beat the ...
I strongly encourage you to do it. I'm typing up a post right now specifically encouraging people to summarize fields in LW discussion threads as a useful way to contribute, and I think I'm just gonna use this as an example since it's on my mind.
This is helpful, thanks.
In the "Rationality is about winning" train of thought, I'd guess that anything materially different in post-rationality (tm) would be eventually subsumed into the 'rationality' umbrella if it works, since it would, well, win. The model of it as a social divide seems immediately appealing for making sense of the ecosystem.
Any chance you could be bothered to write a post explaining what you're talking about, at a survey/overview level?
I disagree. The point is that most comments are comments we want to have around, and so we should encourage them. I know that personally I'm unmotivated to comment, and especially to put more than a couple minutes of work into a comment, because I get the impression that no one cares if I do or not.
One general suggestion to everyone: upvote more.
It feels a lot more fun to be involved in this kind of community when participating is rewarded. I think we'd benefit by upvoting good posts and comments a lot more often (based on the "do I want this around?" metric, not the "do I agree with this poster" metric). I know that personally, if I got 10-20 upvotes on a decent post or comment, I'd be a lot more motivated to put more time in to make a good one.
I think the appropriate behavior is, when reading a comment thread, to upvote almost e...
I only heard this phrase "postrationality" for the first time a few days ago, maybe because I don't keep up with the rationality-blog-metaverse that well, and I really don't understand it.
All the descriptions I come across when I look for them seem to describe "rationality, plus being willing to talk about human experience too", but I thought the LW-sphere was already into talking about human experience and whatnot. So is it just "we're not comfortable talking about human experience in the rationalist sphere, so we made our own s...
Why do you think there is nothing wrong with your delivery? Multiple people have told you that there was. Is that not evidence that there was? Especially because it's the community's opinions that count, not yours?
Rude refers to your method of communicating, not the content of what you said. "I mean that you do not know of the subject, and I do. I can explain it, and you might understand" is very rude, and pointlessly so.
Why do you think you know how much game theory I know?
edit: I edited out the "Is English your first language" bit. That was unnecessarily rude.
I'm not trying to welcome you, I'm trying to explain why your posts were moved to drafts against your will.
I'm not arguing with or talking about Nash's theory. I'm telling you that your posts are low quality and you need to fix that if you want a good response.
My point in the last paragraph is that you are treating everyone like dirt and coming across as repulsive and egotistical.
"You are incorrect" was referring to "No, you can't give me feedback." Yes, we can. If you're not receptive to feedback, you should probably leave this site....
How could you possibly know what a random person knows of? Why are you so rude?
Re this post: http://lesswrong.com/lw/ogp/a_proposal_for_a_simpler_solution_to_all_these/
You wrote something provocative but provided no arguments or explanations or examples or anything. That's why it's low-quality. It doesn't matter how good your idea is if you don't bother to do any legwork to show anyone else. I for one have no idea why your idea would work, and I don't care to do the work to figure it out, because the only reason I have to do that work is that you said so.
Also, you might want to tackle something more concrete than "all these difficult observations and ...
I'm not asking for people not to talk about problems they have. I'm just criticizing the specifically extra-insensitive way of doing it in the comment I replied to. There are nicer, less intentionally hurtful ways to say the exact same thing.
While I think it's fine to call someone out by name if nothing else is working, I think the way you're doing it is unnecessarily antagonistic and seemingly intentionally spiteful or at least utterly un-empathetic, and what you're doing can (and in my opinion ought to) be done empathetically, for cohesion and not hurting people excessively and whatnot.
Giving an excuse about why it's okay that you, specifically, are doing it, and declaring that you're "naming and shaming" on purpose, makes it worse. It's already shaming the person without saying th...
No, markets only work for services whose costs to participants are high enough that they care and model their behavior accordingly. In my observation, specifically, these people behave this way for reasons other than their personal comfort, and the costs aren't high enough (or they're not aware that they're high enough) to influence their behavior.
The 'reason to speculate' is that it's interesting to talk about it. That's all.
I think you get more of that in Texas and the southeast. It (by my observation - very much a stereotype) correlates with driving big trucks, eating big meals, liking steak dinners and soda and big desserts, obesity, not caring about the environment, and taking strong unwavering opinions on things. And with conservatism, but not exclusively.
I distinctly remember driving in my high school band director's car once, maybe a decade ago, and he was blasting the AC at max when it maybe needed to be on the lowest setting, tops -- it seemed to reflect a mindset tha...
Is there an index of everything I ought to read to be 'up-to-date' in the rationalist community? I keep finding new stuff: new ancient LW posts, new bloggers, etc. There's also this on the Wiki, which is useful (but is curiously not what you find when you click on 'all pages' on the wiki; that instead gets a page with 3 articles on it?). But I think that list is probably more than I want - a lot of it is filler/fluff (though I plan to at least skim everything, if I don't burn out).
I just want to be able to make sure, if I try to post something I think is new on here, that it hasn't been talked to death about already.
Thanks, this is useful.
I've been thinking about doing this - I'm trying to learn math (real/complex analysis, abstract algebra) for 'long term retention' as I'm not really using it right now but want to get ahead of learning it later, and struggling with retention of concepts and core proofs.
Do you think it's going to be useful to share decks for this purpose? I feel like there are many benefits to making my own cards and adding them as I progress through the material, and being handed a deck for the whole subject at once will be overwhelming.
Here's an opinion on this that I haven't seen voiced yet:
I have trouble being excited about the 'rationalist community' because it turns out it's actually the "AI doomsday cult", and never seems to get very far away from that.
As a person who thinks we have far bigger fish to fry than impending existential AI risk - like problems with how irrational most people everywhere (including us) are, or how divorced rationality is from our political discussions / collective decision-making process, or how climate change or war might destroy our relatively...
Being a member of this community seems to require buying into the AI-thing, and I don't, so I don't feel like a member.
I don't think that it's true that you need to buy into the AI-thing to be a member of the community, and so I think that it seems that way is a problem.
But I think you do need to be able to buy into the non-weirdness of caring about the AI-thing, and that we may need to be somewhat explicit about the difference between those two things.
[This isn't specific to AI; I think this holds for lots of positions. Cryonics is probably an easy one to point at that disproportionately many LWers endorse but is seen as deeply weird by society at large.]
This is interesting, but I don't understand your questions at end. What simulation theory are you talking about?
By the way, one of your links is broken and should be http://file.scirp.org/pdf/OPJ_2016063013301299.pdf .
Keep in mind that there is a significant seasonal variation in emissions from the sun, such as neutrinos, which can easily penetrate any experimental apparatus on earth. This is simple to rationalize: the sun emits massive numbers of neutrinos, and since the Earth-Sun distance varies over the year, the flux reaching Earth is higher near perihelion (early January) and lower near aphelion.
By far the...
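If the seasonal variation comes from the changing Earth-Sun distance (my assumption here; the eccentricity value is approximate), its size is easy to estimate, since any flux from the sun falls off as 1/r^2:

```python
# Earth's orbital eccentricity (approximate value, assumed for this estimate)
e = 0.0167
r_perihelion = 1 - e   # AU, closest approach (early January)
r_aphelion = 1 + e     # AU, farthest point (early July)
# Flux scales as 1/r^2, so the peak-to-peak seasonal variation is:
peak_to_peak = (r_aphelion / r_perihelion) ** 2 - 1
print(f"{peak_to_peak:.1%}")   # about 6.9%
```

A ~7% annual modulation is large compared to many claimed decay-rate effects, which is why it matters as a confound.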
I double majored in physics and computer science as an undergrad at a pretty good school.
My observation is this:
The computer science students had a much easier time getting jobs, because getting a job with mediocre software engineering experience is pretty easy (in the US in today's market). I did this with undeservedly little effort.
The physics students were, in general, completely capable of putting in 6 months of work to become as employable as the computer science students. I have several friends who majored in things completely non-technical, but by s...
I watched #3 again and I'm pretty convinced you're right. It is strange, seeing it totally differently once I have a theory to match.
I strongly disagree with the approaches usually recommended online, which involve some mixture of sites like Codecademy, looking into open source projects, and lots of other hard-to-motivate things. Maybe my brain works differently, but those never appealed to me. I can't do book learning, and I can't make myself sit down and dedicate time to something I'm not drawn to already. If you're similar, try this instead:
Now, when I say "try"... new programmers often envision just sitting down and...
Yeah, it can definitely be done for cheaper. In my case, going through college and such, I got new frames every year or two (between breaking them and starting to hate the style...). The bigger expense was contacts, which we either didn't have insurance for or it didn't cover, coming out to $100-150/year depending on how often I lost or damaged them.
I don't think this is quite right. In my experience, the sensation that someone is higher status than me induces a desperate desire to be validated by them, abstractly. It's not the same as 'gratitude' or anything like that; it's the desire to associate with them in order to acquire a specific pleasurable sensation -- one of group membership, acceptance, and worth.