The key to avoiding rivalries is to introduce a new pole, which mediates your relationship to the antagonist. For me this pole is often Scripture. I renounce my claim to be thoroughly aligned with the pole of Scripture and refocus my attention on it, using it to mediate my relationship with the antagonistic party. Alternatively, I focus on a non-aggressive third party. You may notice that this same pattern is observed in the UK parliamentary system of the House of Commons, for instance. MPs don’t directly address each other: all of their interactions are mediated by and addressed to a non-aggressive, non-partisan third party – the Speaker. This serves to dampen antagonisms and decrease the tendency to fall into rivalry. In a conversation where such a ‘Speaker’ figure is lacking, you need mentally to establish and situate yourself relative to one. For me, the peaceful lurker or eavesdropper, Christ, or the Scripture can all serve in such a role. As I engage directly with this peaceful party and my relationship with the aggressive party becomes mediated by this party, I find it so much easier to retain my calm.
Having recently watched a few of these discussions/debates in the Commons on YouTube, it is noticeable how the Speaker is able to temper the mood and add a little levity.
There is one popular political YouTube account called 'Incorrigible Delinquent', and he begins each of his uploads with the Speaker quite humorously saying "You are an incorrigible delinquent!"
I single-handedly organised a half-day workshop for 80k, including doing the room bookings, the tech setup, and the refreshments (they couldn't be bought cheaply, so I bought the crockery and food myself), and got feedback from the attendees of the sort "best catered event I've been to in 3 years of being at Oxford uni".
I've also completed my first term of university, and learned loads (of computer science, and also about my abilities in general).
I think the important facts about the mirror are that it is a not-quite-right AI that fails to give you what you want, and that it is Harry's future big problem. There was no way EY was going to let intelligence not be the final problem, so it is this that lies in the future of the current story.
As a general rule, please ask a doctor before you ask the internet.
Well... I think the general rule is to ask your doctor before doing something that's different to what everyone else is doing. I think asking the internet is fine iff you ask the doctor before effecting a plan.
(Unless you have personal expertise or otherwise a well-evidenced model about why your doctor is inaccurate in this particular situation)
Luke is not qualified to shit on academic philosophy. He simply doesn't have the background or the overview. And it's a terrible idea for social reasons, it just makes people not take LW seriously. I would be happy to accept critiques of the philosophy establishment from e.g. Clark Glymour, not from Luke. There is a ton of value in philosophy you are leaving on the table if you shit on philosophy.
My other big annoyance is the "LW Bayesians" (who similarly are generally not qualified to have strong opinions about these issues, and should instead read the stats/ML literature). Although I should say very sophisticated stats folks occasionally post here (but I don't count them among the "LW Bayesians", as they understand the issues with Bayes very well).
Could you provide an object-level counterargument, please? A strong one would make me assign a lot more credence to the claim that Luke's work was not an accurate portrayal of academic philosophy.
(Three would be preferred)
(Object-level might look like "philosophers are making useful progress by metric X", or "I expect philosophers' work to be very useful in area of science a because b", or "doing a PhD in philosophy has lots of value in the world for reasons p, q and r")
I don't write top level posts, but I took issue w/ Luke taking a shit on academic philosophy, for instance.
I don't see that the above post refutes any arguments Luke made about academic philosophy. What were the basics of your disagreements with his arguments?
I have been talking about this very issue for ages here on LW. "Rationalists" (the tribe, not the ideal platonic type) share a ton of EY's biases, including anti-academic sentiment.
Question: Did you make a post of this nature before?
There is no starvation in Western countries.
Well, there is some. A better way to put this is something like "there is no starvation left that could be treated by government programs."
Not perfectly true in Britain, as far as I can tell. Families are using food banks en masse, and one kid got scurvy, as I recall.
I'm new to the subject, so I'm sorry if the following is obvious or completely wrong, but the comment left by Eliezer doesn't seem like something that would be written by a smart person who is trying to suppress information. I seriously doubt that EY didn't know about the Streisand effect.
However the comment does seem like something that would be written by a smart person who is trying to create a meme or promote his blog.
In HPMOR, characters give each other the advice "to understand a plot, assume that what happened was the intended result, and look at who benefits." The idea of Roko's basilisk went viral and lesswrong.com got a lot of traffic from popular news sites (I'm assuming).
I also don't think that there's anything wrong with it, I'm just sayin'.
the comment left by Eliezer doesn't seem like something that would be written by a smart person who is trying to suppress information. I seriously doubt that EY didn't know about Streisand effect.
No worries about being wrong. But I definitely think you're overestimating Eliezer, and humanity in general. That calling someone an idiot for doing something stupid, and then deleting their post, would cause a massive blow-up of epic proportions is something you can really only predict in hindsight.
And perhaps even the answer?
I think I'd prefer just having the answer, and then a guess-the-question thread.