e^(pi*i) = -1
Anything else: lame.
Uh, how is e^(2*pi*i) = 1 lame?
You may as well argue for base four arithmetic.
Huh. Would that actually be easier? I always figured ten fingers...
I don't see myself with ten fingers as a posthuman anyway.
Has anybody else wished that the value of the symbol pi were doubled? Trigonometry becomes far more intuitive that way--it might even improve uptake of trigonometry in school. This ranks up there with the decision to declare the electron's charge negative rather than positive.
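A quick sketch of the intuition: if angles are measured in units of 2*pi (which Python actually exposes as `math.tau`), the radian measure of an angle is literally its fraction of a full turn, so "a quarter turn" is tau/4 rather than the less transparent pi/2.

```python
import math

# With tau = 2*pi, an angle's radian measure equals its fraction of a full turn.
quarter_turn = math.tau / 4               # 90 degrees, i.e. 1/4 of a turn
assert math.isclose(math.sin(quarter_turn), 1.0)

# The same angle written with pi hides the "quarter" inside a factor of 1/2.
assert math.isclose(math.pi / 2, quarter_turn)
```

Nothing here changes any mathematics, of course--it is purely a question of which constant makes formulas easier to read.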
More to the point, 1337 is the popular convention. If the numbers were not meant as a reference to the convention then they would be more silly than they already are.
Well, yes, it is meant to mean "elite". There are several variations of it, though.
Humor will be more relevant post-singularity.
That is a significant claim. Not least because it implies that given that a singularity occurs it will be a singularity that doesn't suck. (Better as a goal than it is as a prediction.)
Good point. I admit to being confused by the use of "suck" and singularity. But certainly--a dystopian singularity will leave perhaps nothing to laugh at.
Voted down for spelling 1337 wrong in your username.
Perhaps it is wrong to use the same digit to refer to separate letters. 133+ is OK, 1337 is OK, but 7337 seems to break that criterion. Point noted.
That's a fair question.
I know that a number of people here are programmers, for one thing. For another, I deliberately aimed very low with the anecdote about "clueless users", so that even LW readers who are not programmers would feel, with respect to someone that clueless, the same way I feel with respect to my dad, whose confusions about computers are quite a bit more sophisticated.
The implication isn't that the reader is a computer expert, but that they have some area of expertise in which they feel as clued in as I feel in computers, and I'm inviting them to identify with me when I tell the anecdote about my dad.
Would it make more sense for you if I amended the sentence starting "If you are a software developer", so that it read "I'm a software developer, so for me that tends to be..."?
It looks like you say:
"To that end, I want to start off by considering some comfortable examples, where someone else is the butt of the joke, and then consider examples which might make you more uneasy."
This is before you mention software development. Software developers are over-represented in general on the internet, and I sort of glazed over and wondered if I was back on Reddit/digg/slashdot/... .
I'm curious--I was under the impression that lesswrong.com was a community dedicated to rationality. Some may be like your father--good at math, but bad at navigating computers. Why did you assume that people on this site are computer oriented?
Yeah, it will be recorded. I'll add a link to the post when the video is up.
How was it? Did you manage to extend your talk with some questions?
Would love to see the video. It would be great to see what can be done.
Pi is well-defined, yes, and that's not going to change. But some notation is better than others. It would be better notation if we had a symbol that meant 2pi, and not necessarily any symbol that meant pi, because the number 2pi is just usually more relevant. There's all sorts of notation we have that is perfectly well-defined, purely mathematical, not dependent on any system of units, but is not optimal for making things intuitive and easy to read, write and generally process. The gamma function is another good example.
I really fail to see why metric vs. English units is a much more serious issue; neither system is particularly suggestive of anything these days, and neither is more natural. The quantities being measured with them aren't going to be nice clean numbers like pi/2; they're going to be messy no matter what system of units you measure them in.
What about the gamma function is bad? Is it the offset relation to the factorial?
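For readers unfamiliar with the complaint: the standard Gamma function is offset from the factorial by one, so Gamma(n) = (n-1)! rather than n!. A minimal illustration using Python's `math.gamma`:

```python
import math

# The off-by-one: Gamma(n) equals (n-1)!, not n!,
# which is the notational wart being referred to above.
for n in range(1, 7):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

print(math.gamma(5))  # 24.0, i.e. 4! rather than 5!
```

The function itself is perfectly well-defined; the argument is only that the shifted convention makes formulas involving it slightly harder to read than they need to be.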