Harlan 0:17:13
A recurring theme is the inevitability thing. It’s pretty frustrating to hear, as someone who’s spending effort trying to help with this stuff in whatever way we can, to have someone characterize your camp as thinking doom is inevitable. If I thought it was inevitable, I would just be relaxing. I wouldn’t bother doing anything about it. There’s some sense in which if it was inevitable, that would be worse, but it would also mean that we didn’t really have to do anything about it.
Liron 0:17:42
Just to repeat your point in case viewers don’t get the connection: Dario is saying that doomerism is unproductive because the Yudkowskys of the world (he doesn’t explicitly name Yudkowsky, but he’s basically describing our type) think we’re so doomed that we’re just fear-mongering, and it’s pointless.
Eliezer played directly into this with his Death With Dignity "joke". The past is past, but if you guys haven't yet said openly and plainly "fuck that, it was the product of personal exhaustion and being too cute by half, and whatever level of irony it was on, it doesn't represent us", maybe that would be worth doing.
("It was just an April Fool's joke" wouldn't count, because the post obviously had an element of "ha ha only serious". By design, the serious meaning was impossible to pin down, but to pretend the whole thing was simple first-level irony would be insulting.)
But you are implicitly assuming that you already know this process is in fact going to continue. So it's rather as if you asked Fred, and he told you, "yeah, there's always a big rush at the end of the day, few people get here as early as you."
I didn't mean to imply certainty, just uncertain expectation based on observation. Maybe I asked Fred, or the other customers, but I didn't receive any information about 'the end of the day' -- only confirmation of the trend so far.
(I'm not trying to be difficult for the sake of it, by the way! I just want to think these things through carefully and genuinely understand what you're saying, which requires pedantry sometimes.)
edit in response to your edit:
But if you know for a fact that all the customers (including you) are only 10 minutes old, and so decided to come here less than 10 minutes ago, then the only reasonable conclusion is that there's a very fast population explosion going on, and you have absolutely no idea how much longer it will last, or how soon Fred will run out of chili and close the shop. In that situation, your predictive horizon is just short, and you don't know what's going to happen beyond it — and clearly neither does Fred, so you can't just ask him.
I think I'm not quite understanding the distinction here. Why is there an important difference between "this trend is based on mechanisms of which I'm ignorant, such as the other customers' work hours or their expectations about chili quality over time" and "this trend is based on different mechanisms of which I'm also ignorant, i.e. birth rates and chili inventory"?
But that's not how I'm thinking of it in the first place -- I'm not positing any random selection process. I just don't see an immediately obvious flaw here:
And I still don't quite understand your response to this formulation of the argument. I think you're saying 'people who have ever lived and will ever live' is obviously the wrong reference class, but your arguments mostly target beliefs that I don't hold (and that I don't think I am implicitly assuming).
Sorry about the double reply, and it's been a while since I thought seriously about these topics, so I may well be making a silly mistake here, but --
There's a shop that uses a sequential ticketing system for queueing: each customer takes a numbered ticket when they enter, starting with ticket #1 for the first customer of the day. When I enter the shop, I know that it has been open for a couple of hours, but I have absolutely no idea when it closes (or even whether its 'day' is a mere 24 hours). I take my ticket and see that it's #20. I have also noticed that the customer flow seems to be increasing more than linearly, such that if the shop is open for another hour there will probably be another 20 customers, and if it's open for a few more hours there will be hundreds. Should I update towards the shop closing soon, on the grounds that otherwise my ticket number is atypically low? If so, wtf, and if not, what are the key differences between this and the doomsday argument?
"viewed from an achronous perspective, a low probability event has occurred — just like they always do right at the start of anything"
What's the 'low probability event'? I think this is the kind of framing I was disagreeing with in my original reply; there seems to be an implicit dualism here. So your reply isn't, from my perspective, addressing my reasons for finding anthropic reasoning difficult to completely dismiss.
If I roll a million-sided die, then no individual number rolled on it is more surprising than any other, not even a roll of 1 or 1,000,000 — UNLESS I'm playing an adversarial game where me rolling a 1 is uniquely good for my opponent. Then if I roll a 1 I should wonder if the die was fixed.
Yes, but if you haven't looked at the die yet, and the question of whether it's showing a number lower than 100 is relevant for some reason, you're going to strongly favour 'no'.
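Worked out, under a fair-die assumption:

$$P(X < 100) = \frac{99}{10^6} \approx 10^{-4},$$

so 'no' is favoured at roughly 10,000:1.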
(That's not quite how I think about anthropic problems, though, because I don't think there's anything analogous to the dice roll -- hence my original complaint about smuggled dualism.)
I don't think you need completely specious reasoning to get to a kind of puzzling position, though. For us to be in the first <relatively small n>% of people, we don't need humanity to spread to the stars -- just to survive for a while longer without a population crash. And I think we do need some principled reason to be able to say "yes, 'I am in the first <relatively small n>% of people' is going to be false for the majority of people, but that's irrelevant to whether it's true or false for me".
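To put rough numbers on 'a while longer' (taking the common ballpark of ~117 billion humans ever born and ~130 million births per year; both are loose estimates): for us to be in the first 10% of people,

$$N_{\text{total}} \approx \frac{1.17 \times 10^{11}}{0.1} \approx 1.2 \times 10^{12},$$

i.e. about another trillion births, which at current rates takes on the order of 8,000 years -- a long time, but no stellar expansion required.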
Oh yeah, I should have made this clear in my reply to you (I'd written it in a different comment just a moment before):
I do find anthropic problems puzzling. What I find nonsensical are framings of those problems that treat indexical information as evidence -- e.g. in a scenario where person X (i.e. me) exists on both hypothesis A and hypothesis B, but hypothesis A implies that many more other people exist, I'm supposed to favour hypothesis B because I happen to be person X and that would be very unlikely given hypothesis A.
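To spell the objectionable move out with made-up numbers: suppose hypothesis A implies 100 people exist and hypothesis B implies 10, with equal priors, and I treat 'I am person X' as evidence drawn uniformly from whoever exists. Then

$$\frac{P(A \mid \text{I am } X)}{P(B \mid \text{I am } X)} = \frac{1/100}{1/10} = \frac{1}{10},$$

a 10:1 shift towards B purely from my own identity -- which is the step that seems nonsensical to me.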
Yep (assuming I don't have a prior that heavily favours the red door case for some reason), but in this case I think I'm just applying ordinary bayesian reasoning to ordinary, non-identity-related evidence. The information I'm learning is not "I am this person", but "this person is still alive". That evidence is 99 times more likely in the green door case than the red door case, so I update strongly in favour of the green door case.
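Numerically -- and this is just one way the '99 times more likely' could cash out, assuming P(alive | green) = 0.99, P(alive | red) = 0.01, and 50/50 priors:

$$P(\text{green} \mid \text{alive}) = \frac{0.99 \times 0.5}{0.99 \times 0.5 + 0.01 \times 0.5} = 0.99.$$

Ordinary conditioning on ordinary evidence; no indexical step required.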
I think that's a reasonable interpretation of the actual serious content of the post, and my understanding of Eliezer's position basically matches yours. But the post starts like this:
And it sticks with the 'death with dignity' framing, talking about doubling our chances of survival from 0% to 0%, striving to earn 'dignity points' to take to our graves, and so on. It's also explicitly presented as a MIRI thing, not just a personal Eliezer thing.
Underneath all this, of course, he is talking about doing actually useful things to increase our survival odds (albeit from 'negligible' to 'still basically negligible'). But both the surface-level framing and the actual emotional content are drenched in despair and fatalism.
It's probably clear that I think the 'death with dignity' framing is bad and unhelpful, but obviously I can't be sure that Eliezer's post did more harm than good. You and Harlan evidently hate being cast as inevitable-doomers, though, and if this was an unfair and harmful move on the part of Amodei, I think communications like the Death With Dignity post are partly to blame for making it a viable one.