I would normally visit even a Score: -22 post with 200+ comments, because I've found that even a particularly awful post may be worth opening just to hunt for the few most excellent clarifications or rebuttals it elicited.
A warning to others: my heuristic was wrong in this case. Few comments here even hint at what the hell is going on, and those that do suggest nothing more interesting than some extremely unlikely theological or parapsychological beliefs that Will might have latched onto and desired to "protect" us from. You could find more interesting and plausible basilisks in Lovecraft's stories or Stross' Laundry novels.
Now I will never get those 20 minutes of my life back, and if I happen to die exactly 20 minutes before Omega invents immortality, it is all my own stupid fault.
Not wasting the 20 minutes wouldn't have helped you survive till Omega invented immortality. (You didn't shorten your life in an absolute temporal sense, you just wasted some of the middle.)
WHY WOULDN'T YOU JUST ...
Must have had a good reason. It's a pity we mere mortals cannot fathom that reason, but we should at least recognise that it's the reasoning of God and so our being unable to fathom it is a fault with our meat brains, not with the reasoning.
Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information.
So? You say crazy (and wrong) shit a lot and have no credibility.
Whence our disagreement, if one exists?
Try explaining your reasoning and we might see. The whole "I have mysterious reasons why this crazy idea is true" thing is just annoying. (Whether done by you or Eliezer.)
Anyone else finding themselves in the awkward position of wondering if they are a child among adults who may or may not be using innuendo? Where you think you understand a few of them, but aren't sure you do? To summarize my current state: Will Newsome is hitting some of my "take him seriously" heuristics pretty hard. At their center is the fact that he is taken far more seriously than most average posters think he should be, by some pretty big names who have been with this Eliezer-centred rationality project since its start and have accrued quite a reputation for excellent judgement. He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.
I have several theories on this which I have constructed over the past few months but don't feel comfortable sharing right here, because I've stumbled on several caches of dangerous thinking. I have to keep squishing some ugh fields and bolstering others when exploring these ideas. Yet I also can't just come out and ask the right people to check my reasoning on any of them; their time is valuable and I'm not in their social circles anyway. I find myself blinking in confusion, unsure if I'm bei...
He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.
To defend the repute of the visiting fellows program, please note that his crackpot score has skyrocketed since that time, and he would almost certainly not have been accepted had he applied as the person he is today.
Also, I think his crackpot score skyrocketed mostly after he left - so if it was something we did, it was a delayed effect.
I get the impression that he's often more concerned with signaling interestingness, intelligence, and contrarianism than figuring out what's true.
Note: I also get that impression from Michael Vassar. But I have lots of respect for the current Singularity Institute director.
I don't get that impression from Michael Vassar, possibly because I've talked with him in person. Asking repeatedly for examples usually makes it possible to find out what he means.
I have no such hope with Will Newsome.
Will is pretty weird and I don't believe the way he thinks is, ya' know, normative. But I still find his writing to be extremely valuable relative to most Less Wrong commenters because for the most part Less Wrong commenters come in three different flavors: vanilla (what I would say if I weren't as smart or 3-4 years less educated), chocolate (what I would say now) and strawberry (what I would say if I were smarter or 3-4 years more educated). Will is lobster ice cream with rainbow jimmies. I will never think like him and I wouldn't want to. But I'm glad there is someone extending hypotheses as far as they will go and never looking back. I find novel explorations of hypothesis space to be both useful and interesting. He is pursuing a train of thought I don't have a lot of time for and no reason to prioritize. But I'm still looking forward to finding out where it ends up.
Will is like a musician on a hallucinogen. You wouldn't want to have his brain and you probably don't trust his judgment. But before he burns out at 27 he's gonna produce some really interesting ideas, some of which will simply be absurd but a few of which might have real staying power and influence a generation.
What is your position on Will Newsome?
I frequently find Will's contributions obscurantist.
In general, I find obscurantism at best tedious, and more often actively upsetting, so I mostly ignore it when I encounter it. Occasionally I engage with it, in a spirit of personal social training.
That said, I accept that one reader's obscurantism is another reader's appropriate level of indirection. If it's valuable to other people, great... the cost to me is low.
At this point the complaining about it by various frustrated people has cost me more than the behavior itself, by about an order of magnitude.
I didn't meet Will until April 2011, but most people who have been around longer seem to share Carl's opinion. For myself, I also find many of Will's contributions obscurantist, and I agree with John Maxwell that they seem to want to signal interestingness, intelligence, and contrarianism. Finally: Will offered good, substantive feedback on two of my papers.
My sensation about Will Newsome is that of a celebrity I haven't heard of. Most of the comments that I notice authored by Will Newsome appear to be about Will Newsome, but I don't understand their content beyond that. They seem to attract a lot of attention.
There is this strange current of, well, insight and reasonableness in his comment history and ideas.
I would be interested in reading some of these ideas, if you could point some out.
In addition to already mentioned obscurantist tendencies, he awards himself intellectual credit for "going meta," even when this does not lead to actually smarter behavior or better results.
He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.
What did he actually do, though?
For half the time, with Anna, I was an intern, not a Fellow. During that time I did a lot of intern stuff like driving people around. Part of my job was to befriend people and make the atmosphere more cohesive. Sometimes I planned dinners and trips, but I wasn't very good at that. I was very charismatic and increasingly smart, and most importantly I was cheap. I was less cheap as a Fellow in the Berkeley apartments and accomplished less. I wrote, and helped people occasionally. There weren't clear expectations for Fellows. Also, people like Eliezer, who had power, never asked for any signs of accomplishment. Eliezer is also very bad at reading. Nonetheless I think I should have accomplished more somehow, e.g. gotten experience writing papers from scratch.
I believe I almost always turned down credit for contributions to papers, but I didn't make too many substantive contributions; I did a fair bit of editing, which I'm good at.
You could get a decent idea by looking at what the average Visiting Fellow did, then remember that I often couldn't remember things I did -- cognitive quirk -- and that I tried to avoid credit when possible at least half the time.
Part of my job was to befriend people and make the atmosphere more cohesive.
You were good at that, as I recall. As was (especially) Alicorn. Also, at the time I thought it was just super-cool that SI had its mundane tasks done by such brilliant people.
The aneurysm itself was at the edge of my thalamus. The resulting swelling caused damage kind of all over the place.
The functional damage at first was pretty severe, but I don't remember specifics; I mostly don't remember that week at all and much of what I do remember I'm fairly certain didn't actually happen. I spent it in an ICU. I come back to a coherent narrative about a week later; at that point, the bulk of the functional damage was general pain and fatigue, right-side hemiplegia (my right arm and leg were not quite paralyzed, but I lost control over them), mild aphasia which most often manifested as anomia (difficulty retrieving words) and occasionally in other ways (I remember losing the ability to conjugate sentences for a few hours; that was freaky), and (most significantly) a loss of short-term memory with all the associated knock-on effects to various kinds of cognitive processing.
There was also a lot of concern at various points that there may have been damage to my emotional centers. I never noticed any such effect, but, well, I wasn't necessarily the most reliable witness. Most memorably, this led to one doctor asking me if my emotional state was at all unusual. ...
For what it's worth, it's already my opinion that you're completely insane and ought to have no credibility whatsoever. In fact I'm confused that anyone takes you seriously at all.
This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can't tell people about - a basilisk, some conspiracy-theory-type information, something. Being a relative newcomer unwilling to go through large numbers of his previous posts, I'd like to know if anyone who's seen him longer has any more information.
Also, this whole thing is absolutely hilarious to read.
I have a few ideas:
1) It's a "basilisk", i.e. an imaginary Lovecraftian threat that doesn't even make sense outside of some highly particular and probably wrong belief system. (That's not my definition of basilisk, but it is what I think of such claims.)
2) Some mundane fact about the difficulty or danger of actually trying to save the world (in the specific sense of shaping a singularity) has made his blood run cold. It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.
3) I remember when I was 22 and realized (according to the plans I had at the time) that it might take me eight years to save the world. That was very daunting, because at the time it looked like it would be a joyless, stressful, solitary existence, for an unimaginably long period of time. And as it turned out, I didn't even get it done... Will could be fleeing the responsibilities of his "position" - I mean his existential position, which surely includes the perception that he has the potential to make a difference in a huge way.
ETA 4) He wants to create a barrier (a "credibility barrier") between himself and his former associates in SI, so as to develop his own thinking, because there's a systematic deficiency in their outlook and he must avoid the temptation of working within that paradigm.
Can you explain clearly why you have gone all crazy? Why do you have to drop these esoteric hints and do this stupid troll business?
My understanding is that you delved too deeply into simulation arguments and met Cthulhu or something, had a religious experience and determined that there is a god or something, and that the people who are in the know are all in the Catholic Church somewhere.
And then for some reason you can't just explain this clearly and lay out your reasons. Or maybe you've tried explaining it clearly, but that was before my time and now you assume that everyone either already knows what you are on about, or is interested enough to scour the internet for your posting.
???
If Will won't cooperate, can someone else explain the best model we have of his weirdness?
It may be relevant that Will has talked elsewhere about certain important physical phenomena being evasive, in the sense that they become significantly less likely to occur when someone is trying to prove or demonstrate them.
When I value my interactions with an evasive phenomenon (the beliefs of shy people, the social rules of Guess cultures, etc.), one consequence is often that I can't actually talk about my real reasons for things; everything has to be indirect and roundabout and sometimes actively deceptive.
I am generally happier when I don't value my interactions with evasive phenomena, but that's not always an option.
Upvoted for giving two examples of real evasive phenomena. I'd previously only encountered that idea in anti-epistemological contexts, wherein "the universe evades attempts to seek the truth about X" was always clearly a desperate after-the-fact attempt to justify "so despite attempts to seek the truth about X which keep appearing to contradict my claims, you should still believe my claims instead".
But I suppose it's just common sense that you can't properly investigate much psychology or sociology unless you avoid letting the subjects understand that they're being investigated. That's a huge difference from e.g. evasive cosmologies, in which investigating a subject without alerting Him is often presumed impossible.
I endorse maximizing the degree to which people consider my saying X is true to be evidence that I believe X is true.
I don't worry too much about the degree to which people consider my belief that X is true to be evidence that X is true. I expect that depends a whole lot on specifics of X.
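To spell out the distinction, assuming (just for illustration) that my asserting X bears on whether X is true only through whether I believe X:

$$P(X \mid \text{assert } X) = P(X \mid \text{believe } X)\,P(\text{believe } X \mid \text{assert } X) + P(X \mid \text{disbelieve } X)\,P(\text{disbelieve } X \mid \text{assert } X)$$

The second factor in each term is the honesty component I endorse maximizing; the first is the calibration component that, as I said, depends a whole lot on the specifics of X.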
I resent questions that wrap themselves in framings like "don't just consider the obvious points."
I endorse you having private conversations with the folks you consider worthy, rather than having obscure public ones like this. The rest of us might not have whatever attributes would allow us to penetrate the cloud of obfuscation and thereby receive your insights, but that doesn't mean we deserve to have our time wasted.
I have found Will_Newsome to be annoying for a long time now, because he does things like this and because he strikes me as irrational. But he used to get upvoted, so I figured he just rubbed me the wrong way, and didn't talk to/about him to avoid conflict. Now other people are downvoting him too. What changed?
Retracted because I have come to understand things that made the question moot, and because I no longer find Will as annoying as I did. I no longer think he's acting out of malice, though I still have serious doubts about his rationality.
If your goal is to lower your credibility, why do that in the context of talking about credibility?
Separate comment: Some of your remarks like this look almost like you are engaging in intellectual exhibitionism. This one fits into that and is a potential source of irritation.
Now to more substantially answer the question: people should pay attention to my ideas and thoughts exactly to the degree that they are credible. Trying to deliberately modify how credible I am in a general context will interfere with people making their most informed decisions about whether or not to listen to anything I have to say.
Great post: I like your style. The first observation to make is that individuals who make extraordinary contributions are often extremely eccentric, and the quality of their pronouncements usually has high variance. So you've succeeded in increasing my probability estimate that you will say something very worthwhile, though maybe at the price of decreasing my expected value of your average statement.
I don't get it. I'm guessing that Will edited the post? And it had something to do with the simulation argument?
Edit: I forgot to include, if someone who knows him better could explain will_newsome's motivations here, that would be appreciated. (I enjoy internet drama).
Because he's the hero LessWrong deserves, but not the one it needs right now. So we'll hunt him. Because he can take it. Because he's not our hero. He's a silent guardian, a watchful protector. A dark knight.
Credibility. Should you maximize it, or minimize it? Have I made an error?
Depends entirely on your goals.
Most people drop out before doctorates; something like 97-99% of the US population never gets one. And getting a doctorate in many fields is a terrible idea these days: I looked very hard at continuing on for a doctorate in philosophy, and concluded that even if the grad school offered a full ride, it was still probably a bad idea and almost anything else was better.
seems a distinguishing mark of the core SIAI community
Your examples being Will and Eliezer? I didn't realize the core SIAI community was so small.
...Is SIAI to serve as poster boy for the libertarian cause
Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.
Makes allusions in that direction.
Paranoid schizophrenia (the most likely form because Will is high functioning) is incurable--although partial remissions often occur.
Incurable but fortunately treatable to a significant degree---especially the highly visible paranoid side of things. Unfortunately those with the negative symptoms are pretty much just screwed.
...Will often posts in the obscure, mysterious fashion often typical of intelligent paranoid schizophrenics
Will,
Please consider undergoing neurofeedback therapy. I'm doing it, and I believe there is a reasonable chance it would benefit you far more than it would the average human.
Let me take a guess:
You believe in some form of Christianity and enjoy discussing it on LessWrong but think that your comments harm the perception of Christianity on LessWrong due to readers not having privileged information.
You believe you can mitigate this negative effect by lowering your own reputation.
This is a poll. Is Will Newsome sufficiently noisy (in both senses of the word) that mod intervention is called for? Permalink to karma sink.
This poll is BROKEN! Abandon it and do it properly!
The karma sink comment is brilliant (and harmless fun), but the extra comments on the "Yes" and "No" answers don't just bias perception; they outright make the poll unanswerable in its current form.
No. He's entertaining even when at his trolliest.
I would vote for a plain "No." but he is most decidedly not entertaining even when at his trolliest. He is boring, repetitive and banal when at his trolliest. It shouldn't be assumed that people who oppose mod influence believe Will's trolliest crap is entertaining - or vice versa.
Because Will had explicitly threatened to use sockpuppets for various purposes, he could have used them to manipulate the poll, too. Therefore I vote by means of this comment. The vote: ban him. Reasons:
By the way, this is the first time I endorse banning someone from an internet discussion forum.
It depends what mod intervention consists of. If you mean banning him, I do not think that is called for at this time. If you mean telling him to stop his antics and warning him that he's headed towards a ban if he continues, that sounds like a good idea. Posts (and comments) that are intentionally obscure, made merely for one's own entertainment, or otherwise trollish are not welcome here, and since the community's downvotes and critical comments haven't gotten through to him it would be good to have a mod convey that message.
CLARIFICATION: I do not have ACCOUNT DELETION powers. As far as I know, those powers don't exist. I have comment/post banning powers and post editing powers. If I started moderating Will, I would be banning excess downvoted comments, not shooing him away wholesale.
I'm in favor of mod intervention lest anyone else waste as much time as I have scratching their head trying to figure out what this thread is about.
Yes, but moderation is about making the site what it should be for a variety of people, not just me and people who are unshy enough to talk to me directly, or just mods. So I want information. I wield the ban button, but I'm not going to use it as a site customization tool for Alicorn in particular.
Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.
I'm schizotypal I suppose, but not schizophrenic given the standard definition. I don't think I have any trouble interpreting mundane coincidences as mundane.
On Will_Newsome's profile, one sees a link to his blog, Computational Theology, where it is possible to get an idea of how he thinks, and of what kind of reasoning is behind this whole business. I wasn't impressed, although I would not be able to do better myself (at least at this point).
The comments on this post have significantly influenced my opinion on a number of people. Thanks, Will.
Someone who proclaims to openly sacrifice their credibility, in a mysterious way, while making a lot of vague suggestions, can succeed in causing people to actually listen and to speculate whether there might be more to it than meets the eye.
Something else that has the same effect is censorship and secrecy.
What also works well is to claim that there exists some technical research but that it has to be kept secret.
What all of it has in common is that there is nothing but a balloon full of hot air.
Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?
Disagreement about what? What exactly is your opinion on credibility?
Whence our disagreement, if one exists?
Credibility moves, like status moves, cannot be self-recognising and still effective. I believe this, you don't, there's our disagreement.
Thank you for your cooperation and understanding. Don't worry, there won't be future posts like this, so you don't have to delete my LessWrong account, and anyway I could make another, and another.
But since you've dared to read this far:
Credibility. Should you maximize it, or minimize it? Have I made an error?
Discuss.
Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?