All of Leonhart's Comments + Replies

Apologies for no response; I vaguely assumed I would get a notification if anyone commented. I think we'll start in the Shakespeare's Head as it's a bit cloudy. There will be a sign up. Otherwise, climb the nerd gradient until you find us; we're usually in the back third past the bar.

If you intend to try again in the current open thread, feel free to transfer the examples.

Trying to clarify my intuitions re. B:

Consider Paul Atreides undergoing the gom jabbar; he will die unless he keeps his hand in the box. Given that he knows this, I count his success as a freely willed action; if (counterfactually) the pain had been sufficient to overcome him, withdrawing his hand would not have been freely willed, because it is counter to his consciously endorsed values (and, in this case, not subtle or confused values).

However, if (also counterfac... (read more)

Pretty sure I'm misparsing you somehow, but here are some things I might consider nonfree action:

A) an action is rewarded with a heroin fix; the actor is in withdrawal

B) an action will relieve extreme and urgent pain

C) an action is demanded by reflex (e.g. withdrawal from heat)

D) an action is demanded by an irresistibly salient emotional appeal that the agent does not reflectively endorse (release the country-slaying neurotoxin, or I shall shoot your child)

0Shmi
I think these are very good examples; I would agree with C), disagree with D), require clarification on B) and have no strong opinion on A). Others might have different opinions. I further think that amassing a wealth of examples like this and selecting a subset where there is general agreement on which side of the fence they lie is necessary for a productive discussion of the issue.

Are you asking for a procedure for identifying acts of free will (the doable kind of extensional definition) or a set of in-out exemplars (ostensive definition)?

-1Shmi
By extensional definition I mean fencing off the notion of free will with a set of reasonably sharp (close to the free will/not free will boundary) examples of not having free will. A rock not having free will is uncontroversial, but not sharp (very far from the boundary). I am looking for a set of examples where most people would agree that 1. It is an example of not having free will (uncontroversial) 2. It is hard to move it toward the "definitely free will" case without major disagreements from others (reasonably sharp).

Confused. What's incoherent about caring equally about copies of myself, and less about everyone else?

1Manfred
I don't think I said it was incoherent. Where are you getting that from? To expand on a point that may be confusing: indexically-selfish preferences (valuing yourself over copies of you) will get precommitted away if you are given the chance to precommit before being copied. Ordinary selfish preferences would also get precommitted away, but only if you had the chance to precommit sometime like before you came into existence (this is where Rawls comes in). So if you have a decision theory that says "do what you would have precommitted to do," well, you end up with different results depending on when people get to precommit. If we start from a completely ignorant agent and then add information, precommitting at each step, you end up with a Rawlsian altruist. If we just start from yesterday, then if you got copied two days ago you can be indexically selfish but if you got copied this morning you can't.

I've just finished marathoning the first 1.5 seasons (to the current cliffhanger/hiatus) of Gravity Falls, and strongly recommend it. Supernatural mystery/horror/comedy, significantly darker than Disney usually gets. High levels of continuity; very strong art direction; near-HPMOR levels of foreshadowing/conservation of detail (I advise not reading about it beforehand as there was a similar hivemind-predictive-success of the biggest twist). Secret codes, cryptic Reddit AMAs, trolling creators with hand puppets, all the good stuff.

Don't follow. You see "making an actually binding promise" as equivalent to dying?

0NancyLebovitz
I interpreted it to mean that people could no longer kill in self-defense, and there was no guarantee that they could be safe without ever killing in self-defense.
-2DanielLC
No. I'm saying Unbreakable Vows kill people who break them.
3Nornagest
I suspect the Unbreakable Vow is being parsed here as adding high-level terms to someone's utility function, and that that's being interpreted as equivalent to erasing the previous personality. I'm not so convinced, myself, neither that that's the right way to look at the spell nor that values are that tightly linked to... not sure what I want to call it. Personhood? Unique agency? Whatever we actually care about when we object to murder, anyway. EDIT: Never mind, after looking through a page or so of DanielLC's comments I think that sentence actually expands to "[presumptively] giving people the death penalty [for breaking their Vow] with no way to ask for [...] exemption[s..., etc]." Pretty sure that's not how the Vow works in Eliezer's world, though, after reading the bit where Harry undergoes it.

This seems odd to me, though I'm not saying you're wrong. From the inside, my values seem far more akin to habits or reflexes than to time-indexed memories.

I imagine Obliviated!me still having a NO DON'T reaction when asked to support a purpose opposed to my previous goals, because verbalised goals flow from wordless moral habits; not the other way around. (assuming a possibly inconsistent scenario where I retain enough language for someone to expect to manipulate me)

Quite a bit. I have a very bad memory for personal history anyway - I have a vague timeline of significant dates in my head, and a handful of random "vivid" memories, maybe one per year, that have been nailed down by neural happenstance. But if you asked me what I was doing yesterday evening, I think I would end up randomly selecting an evening from the last three or so - unless I painstakingly solved it in the manner of a logic puzzle ("I go to the gym on Wednesdays, and yesterday was Thursday, so I guess I was at the gym").

Rathanel's The Empty Cage (previously recommended on LW) and OmgImPwned's In Fire Forged. Can't remember if the first is finished; the second certainly isn't.

Waves Arisen is in a class by itself as regards sweet sweet ingroup jargon, however :)

Every rational!Naruto fic I encounter keeps topping the preceding ones - I suspect my head will implode if I ever attempt to read the canon story at this point.

The best one yet is The Waves Arisen. Everyone is very sensible, shadow cloning is more broken than ever, and patiently listening to giant slugs pays off in the end.

1Baughn
The only other one that springs to mind is the one with the Nine-Brained Kyuubi. Got any more?

Disagree; you're prematurely optimising. LW is full of dully worthy explanatory articles using the blueprint you describe; an attempt to communicate technical concepts by redundant array of overlapping metaphors is novel and fun even if it doesn't end up working well.

(Sure, it's summoned a few bizarre commenting entities, but never mind!)

2gjm
An interesting perspective -- but I don't get the impression that economy's actual goal is to make LW more entertaining by posting bizarrely unhelpful metaphorical explanations of elementary concepts in economics. So maybe my advice is good for achieving economy's goals but bad for making LW a more interesting place overall. [EDITED to add:] I infer from your other comments that you don't find economy's explanations bizarrely unhelpful; so of course we may disagree on what the tradeoffs are here.

I have high confidence, based on style, that I have read work you have published elsewhere; but on the default assumption that you don't want that context connected to this, I'll say no more.

0[anonymous]
Don't know what this could be, so am mildly curious. If it is also about economics, message me with guess....

Splendid. As inexplicably haunting as the rest of your work. Looking forward to more.

0[anonymous]
What other work...?

I just tried to (using the form at the bottom of the hpmor.com chapter) and it appeared to accept it, but I can't see it showing up on the FF.net reviews page. Is this the wrong way to do it? Is there a significant lag time?

EDIT: Never mind, there it is!

Leonhart
450

Here is my best attempt at a delaying tactic, after sleeping on it. Please tear apart/suggest better ways in which LV might tear apart, to replace the poor placeholder responses he has here.

--

"Agree that I musst die, if it ssavess world. But thiss iss not besst way to kill me. Ssee how you can benefit more, given your goalss."

"Explain."

"Believe power you know not doess refer to power to desstroy life-eaterss. Life-eaterss will find you eventually, teacher. Know you. Will hunt you down, ssomeday. Eat all of you, all of world and mag... (read more)

1LEmma
Considering Harry might destroy the world, and this might be the very way he does it, why not let Hermione take care of them?

Really like that one. My first reaction was "and yet the Gatekeeper can still say no and kill you". After all, Voldemort's trying to prevent untold destruction, a prophecy whose exact paths to possible fulfilment are a mystery. Killing a limited number of Dementors is less important.

But my understanding of the AI box experiment is that it was never just about finding an argument that will look persuasive to someone armchair-thinking about it. It's about finding an opening to the psyche, an emotional vulnerability specific to your current target. ... (read more)

6Jack_LaSota
If a Confundus can fool the Mirror, it can fool the true Patronus charm. If Hermione can eventually kill any Dementors, she can eventually kill all of them. Finding more people who can cast the true Patronus, and letting them handle an eventual end of the world scenario is a much smaller problem than a prophecy of doom.
3Alsadius
That's not a win, but I think it's the best loss possible.
8JoshuaZ
Please post this one as a review.

I saw the thread title and assumed "Maletopia" was a Disney AU fanfic about a perfect society run by rational!Maleficent. Disappointed now.

I don't like to frustrate the poor database's telos; it is not at fault for the use humans put its data to.

(Yes, I realise this is silly. It's still an actual weight in the mess I call a morality; just a small one.)

4Lumifer
The database is only techne in that context, its own telos lies in maintaining nice tables and properly responding to queries -- things I do not mess with :-)

I misremembered, you are correct. I was possibly instead frustrated with finding a temporary email that it would accept (they block the most common disposables I think).

Not speaking for above poster: because that's not actually trivial - you need a real fake phone number to receive validation on, etc. Also, putting fake data into a computer system feels disvirtuous enough to put me off doing it further.

0Lumifer
Interesting. I consider poisoning big surveillance/marketing databases to be virtuous X-D
4Elo
facebook still does not have my phone number. Not sure what you did to need a phone number verification...

Yes, for a copy close enough that he will do everything that I will do and nothing that I won't. In simple resource-gain scenarios like the OP's, I'm selfish relative to my value system, not relative to my locus of consciousness.

0ike
So we have different models of selfishness, then. My model doesn't care about anything but "me", which doesn't include clones.

Delicious reinforcement! Thank you, friend.

Ah, I see. We may not disagree, then. My angle was simply that "continuing to agree on all decisions" might be quite robust versus environmental noise, assuming the decision is felt to be impacted by my values (i.e. not chocolate versus vanilla, which I might settle with a coinflip anyway!)

In the OP's scenario, yes, I cooperate without bothering to reflect. It's clearly, obviously, the thing to do, says my brain.

I don't understand the relevance of the TPD. How can I possibly be in a True Prisoner's Dilemma against myself, when I can't even be in a TPD against a randomly chosen human?

0ike
OP is assuming selfishness, which makes this True. Any PD is TPD for a selfish person. Is it still the obvious thing to do if you're selfish?

Do you really think your own nature that fragile?

(Please don't read that line in a judgemental tone. I'm simply curious.)

I would automatically cooperate with a me-fork for quite a while if the only "divergence" that took place was on the order of raising a different hand, or seeing the same room from a different angle. It doesn't seem like value divergence would come of that.

I'd probably start getting suspicious in the event that "he" read an emotionally compelling novel or work of moral philosophy I hadn't read.

0ike
If we raised different hands, I do think it would quickly cause us to completely diverge in terms of how many body movements are equal. That doesn't mean we would be very different, or that I'm fragile. I'm pretty much the same as I was a week ago, but my movements now are different. I was just pointing out that "decisions" isn't that much more well defined than what it was coming to define (divergent). In a True Prisoner's Dilemma, or even in situations like the OP? The divergence there is that one person knows they are "A" and the other "B", in ways relevant to their actions.

Assuming we substitute something I actually want to do for hang-gliding...

("Not the most fun way to lose 1/116,000th of my measure, thanks!" say both copies, in stereo)

...and that I don't specifically want to avoid non-shared experiences, which I probably do...

("Why would we want to diverge faster, anyway?" say the copies, raising simultaneous eyebrows at Manfred)

...that's what coinflips are for!

(I take your point about non-transferability, but I claim that B-me would press the button even if it was impossible to share the profits.)

0Manfred
I think that's a totally okay preference structure to have (or to prefer with metapreferences or whatever).

I am confident that, in this experiment, my B-copy would push the button, my A-copy would walk away with 60 candies, and shortly thereafter, if allowed to confer, they would both have 30. And that this would happen with almost no angst.

I'm puzzled as to why you think this is difficult. Are people being primed by fiction where they invariably struggle against their clones to create drama?

4Manfred
Hm, this points out to me that I could have made this post more stand-alone. The idea was that you eat the candy and experience a non-transferrable reward. But let me give an example of what I mean by selfish preferences. If someone made a copy of me and said they could either take me hang-gliding, or take my copy, I'd prefer that I go hang-gliding. Selfishly :P

You're thinking of this one, and he cited Carrier, and we have this argument after every survey. At this point it's a Tradition, and putting "ARGH LOOK JUST USE CARRIER'S DEFINITION" on the survey itself would just spoil it :)

0[anonymous]
Oh yeah, that one. I'd probably just get annoyed if they said to use Carrier since I hate that definition, so I guess the status quo works for me.

Ah, yes. I read that page and scrunchyfaced, back when Scott posted the map. (Although I seem to remember reading other things on the same blog that were better thought out, so maybe the author was having an off day.)

I hope that something more rigorous and interesting comes along. The defensible heart of the position, it seems to me, could be something along the lines of "Yes, we must be ready to relinquish our beliefs with the slightest breath of the winds of evidence. But exactly so long as we do believe A, let's really believe it. Let's not deny ourselves the legitimate Fun that can reside in savouring a belief, including any combination of robes and chanting that seems appropriate."

Upvoted for informing me that "straight and narrow" was a malformation. Also, yes.

I want to be friends with the write-in worshiper of CelestAI mentioned :) PM if you like!

3Luke_A_Somers
I'd be on board with Celestia, but CelestAI? No.
0MathiasZaman
I think I know who that would be, but would like my suspicions confirmed. I should probably ask them.
Leonhart
190

Data point: I picked this option, because of a grab-bag of vaguely related positions in my head that make me feel dissatisfied with the flat "atheist" option, including:

  • I enjoy and endorse rituals such as the Solstice celebration, as opposed to the set here who are triggered by them (ETA: not in any way claiming they are wrong to be so triggered, or don't have reasons)
  • I find the Virtues, and other parts of the Sequences with similar styling, to be deeply moving and uplifting, and consider this element of our house style to be a strength rather
... (read more)
4covaithe
I quite like this formulation, and if I had thought of it at survey time I might well have answered 'atheist(spiritual)' instead of 'atheist(nonspiritual)'. Regarding emotional benefits: I sing in moderately serious classical choirs, where inevitably much of the music is set to religious texts. I get some but not all of the emotional benefits from this that I used to get from religious worship, back when I was a committed theist. I think I would get more benefits if the texts were not religious, and still more if the texts were humanist / rationalist / expressed beliefs that I actively profess.
2Kaj_Sotala
There's Postrationality, Table of Contents, though the author hasn't written any follow-up posts yet.

I checked regs, seems we're all good: http://www.food.gov.uk/business-industry/imports/want_to_import/personalimports

"Providing the food parcel you wish to send is to a private, named individual and contains no meat and meat products, dairy products or any particular restricted products (for example Kava kava, which is not permitted either as a personal import or a commercial import) you may send a reasonable amount for personal consumption."

I'd love to try them, but am in the UK. Happy to cover the additional postage cost!

0John_Maxwell
Hi everyone, just FYI we aren't currently shipping MealSquares internationally. So unfortunately even if you like them we aren't set up to send regular shipments to the UK :( (also we're working on individual packaging + smaller sample packs so this sort of negotiation isn't required for people who just want a taste, and we read all the feedback in this form, criticism is very welcome)
1Princess_Stargirl
Unless you two live VERY close I would prefer to just ship it to you individually. I don't want you to have to drive around or anything. So yeah Leonhart and philh just send me your addresses via pm.
4philh
Ditto. Possibly cheaper if Stargirl would send two batches to one of us, we split the cost and exchange IRL? (Do you know if there are any relevant rules about shipping foodstuffs overseas?)

It's pony time, I'm afraid.

My Little Economy: Economics is Science and its sequelae.

"It's the NGDP Targeting Festival in Ponyville," Twilight said. "I'll have a miserable time trying to explain monetary theory to a bunch of hicks and then come home. What's the worst that could happen?"

Really good - perhaps the best compromise between the needs of characterisation, parable, and comedy I've ever seen. Seems like it should be accessible to people who haven't seen MLP.

ETA: The author seems to have randomly deleted all hir blog posts, made... (read more)

I believe it doesn't work like this; you need the circulatory system in order to perfuse the head, and in doing so the other organs are compromised. This could probably be avoided, but not without more surgical expertise/equipment than today's perfusion teams have, I think.

0skeptical_lurker
Oh, because the cryoprotectant is toxic. I forgot about that. I suppose other internal organs apart from the heart could be removed before perfusion starts, but the Alcor people are not qualified to officially do this. All in all it seems like the sort of problem which would be solved if cryonics ever became big enough that it created a sufficient shortage of organs that hospitals actually dedicated some resources to solving the problem.

Smiles, laughter, hugging, the humming or whistling of melodies in a major key, skipping, high-fiving and/or brofisting, loud utterance of "Huzzah" or "Best thing EVER!!!", airborne nanoparticles of cake, streamers, balloons, accordion music? On the assumption that the AI was not explicitly asked to produce these things, of course.

I think the intuitive surface reading of that post (supernatural objects are black boxes; they have state, but are denied to have internal structure that implements the state) at least makes it clear that simulators are not "supernatural" under this definition. Which is the actual query people were blocking on. But evidently many people read the post differently.

Leonhart
250

Man, I'm late this year. Taken. To save my index finger, just upvoted everyone who took it in November :)

Next time, the "supernatural" question really needs to just link to the Sequence post defining the word.

6Azathoth123
I've read the sequences. I still don't think the concept is clearly defined.

The first option reads "Moral statements don't express propositions and can neither be true nor false." I'm curious what else you wanted. The second clause without the first?

I was mostly irked that "the position from the Sequences" wasn't an option (although I quite understand why you'd want to avoid parochial signalling), as neither your definition of subjectivist nor substantive realist seemed to capture it adequately. I eventually opted for the latter.

Exactly the same misreading here.

Leonhart
-10

viewer-hostile

Wow, you're still sore over Endless Eight? I thought it one of the finer pieces of trolling ever indulged in by a commercial product. :)

0lmm
Oh, I appreciated it, I just feel a certain amount of warning is in order. (What a shame the gg-commie joint fell apart before then. What a troll confluence that could've been)

Sora no Woto. The K-On! archetypes are traumatised child soldiers in an uneasy interwar period in bizarro alternate Switzerland, and they have a pet owl. Scenery is amazing.

I would naturally say ikes-hee, but I believe it's supposed to be ay-eye-zy (or maybe ay-eye-kzy)?

I am generally in favour of a long-term ruler AI; though I don't think I'm the one you heard it from before. As you say, though, this is an area where we should have unusually low confidence that we know what we want.

Leonhart
170

Learning to lucid dream, from everything I've read on the subject, involves progressively defeating whatever mechanism usually provides amnesia on waking. Having too much access to memories of nonexistent events seems an epistemically unsafe thing. I have one or two memories from a lifetime of dreaming, and I cannot distinguish them from life memories by any individual texture or quality; only by the fact that they don't cohere with my other memories. This scared me greatly.

3KaceyNow
Improving dream recall isn't necessarily important for lucid dreaming -- I practiced lucid dreaming for some years without any explicit attention to it. I can imagine ways it would be helpful: analyzing your dreams will help you recognize when you are dreaming, plus there's not much point to a lucid dream if you don't remember it. My fears are more on the opposite side of things; some people advocate lucid dreaming methods where you slip directly from wake to lucid dream, but this requires passing through some rather terrifying states of consciousness I can't bring myself to intentionally experience.

No such accusation intended! In all honesty, my thought process was "Guvf fgbel erpncvghyngrf gur svany gevyrzzn (nf lbh fnl, ybbc/tebj/qvr) bs Pnryrz rfg Pbagreeraf, juvpu vf nyernql xabja gb cbffrff RL-puvyyvat cebcregvrf; lbh pbaqrafr vg irel rssrpgviryl, naq gura lbh unir Pryrfgvn rpub bar bs gur zber ubcrshy Sha Gurbel cbfgf jvgu 'Vg znl jryy or gung n zber pbagebyyrq pyvzo hc gur vagryyvtrapr gerr vf cbffvoyr'; naq gura Gjvyvtug erwrpgf vg." I just read it as very pointed, which clearly was not the intended reading.

I can't dispute your cla... (read more)

Haven't had time to read it; but from the story description, it seems to be a comic affair where Twilight decides to monetise her teleportation skillz, and picks the wrong word to advertise with. Hilarity presumably prevails?

1Vaniver
Yep.
5Richard_Kennaway
Pretty much. I stopped reading at the point where her first "client" showed up, with supposed "hilarity" about to begin, as I can't stand comedy based on misunderstanding and embarrassment.

These make me sad, but not in an objectionable way. Liked and Follow'd. Good Night seems specifically optimised to chill EY; was that your goal?

I am a bit puzzled by one aspect of Good Night, but that may be because I don't understand the tech level that the characters are operating at. In Twilight's place, it seems that the obvious thing to do would be to znxr n pbcl bs urefrys jvgu gur nccebcevngr oberqbz-erqhpgvba arhebzbqvsvpngvba, naq yrnir vg gb xrrc Pryrfgvn pbzcnal. Vs guvf vf cbffvoyr va gur frggvat, V qba'g frr jul guvf vfa'g n pyrne jva; fvapr Gjv... (read more)

3jaime2000
Oh, good heavens no! The thought that Mr. Yudkowsky would ever read the story did not even occur to me until long after it was finished. At the level of magitek I envisioned the characters having, your solution should definitely be possible. The realistic answer is that the prompt gave us twelve hours of prep time and one hour of writing time; I did not think of your idea during the allotted time, and if I had I would have mercilessly cut it at the planning stage so that I could fit the whole story into one hour. Even disregarding the time limit, rnpu nethzrag V unq Pryrfgvn naq Gjvyvtug qvfphff jnf n fvatyr, ovt, eryngviryl fvzcyr pbaprcg; vzzbegnyf zhfg zbqvsl fb gung gurl pna rgreanyyl ybbc, be gurl zhfg tebj, be gurl zhfg qvr. Your idea is more complex, and it doesn't fit the theme. If you had handed me a beautifully written section which covered the whole issue in three paragraphs while I was writing, I would have had no choice but to murder it for the sake of the story as a whole. Literary concerns aside, my Twilight would disagree with the notion that lbh pna pubbfr juvpu vafgnaprf bs lbh lbh fhowrpgviryl rkcrevrapr onfrq ba jurgure lbh vqragvsl jvgu gurz be abg.