Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: NancyLebovitz 01 February 2015 06:04:19PM 1 point [-]

Warnings would make sense to me if the work is marketed or recommended in ways which would make rationalists likely to pick it up.

Comment author: gwern 01 February 2015 05:55:52PM 1 point [-]

Everything is heritable:

Politics/religion:

Statistics/AI/meta-science:

Psychology/biology:

Technology:

Economics:

Philosophy:

Comment author: gwern 01 February 2015 05:55:37PM 0 points [-]
Comment author: gwern 01 February 2015 05:55:20PM 0 points [-]
In response to comment by dxu on My Skepticism
Comment author: G0W51 01 February 2015 05:55:02PM 0 points [-]

I used incorrect terminology. I should have asked why I should have axioms.

Comment author: gwern 01 February 2015 05:54:37PM 0 points [-]
Comment author: gwern 01 February 2015 05:54:20PM 0 points [-]

Misc:

Touhou:

Doujin:

Kantai Collection:

Vocaloid:

In response to comment by Eitan_Zohar on My Skepticism
Comment author: G0W51 01 February 2015 05:51:34PM 0 points [-]

Why? I realize that Yudkowsky isn't the most coherent writer in the universe, but how the heck did you get from here to there?

I'm afraid we're not on the same page. From where to where?

A simple qualia-based argument against skepticism (i.e. percepts are simply there and can't be argued with) is problematic - even if you conceded direct knowledge of percepts, you couldn't really know that you had such knowledge. Percepts do not deal with rationality, and there aren't any premises you could derive from them. It seems less a foundational tree of justification than a collection of meaningless smells, sounds and colors.

I understand that believing in qualia is not sufficient to form sizable beliefs, but it is necessary, is it not?

Comment author: G0W51 01 February 2015 05:48:27PM 0 points [-]

Because I act as if I'm not skeptical. (Of course, I can't actually know that the last sentence is true, or this statement, or this, and so on).

Comment author: LawrenceC 01 February 2015 05:45:36PM *  0 points [-]

I tried making one just for the math behind rationality/decision theory back in October, but I never got around to finishing it. The main problems I ran into were:

  • Where should the skill tree start? I'm sure that basic math like algebra, geometry, trig, etc. is all really useful, but I'm not sure about the dependencies between them. I ended up lumping them all into "basic mathematics".

  • How should the skill tree split subjects? Many subjects are best learned iteratively - for example, it's probably best to get a rudimentary understanding of probability theory, then learn more probability theory later on once you've picked up other related subjects (Linear Algebra, Multivariate Calculus, etc) and then again after more subjects (Measure theory). The complication is that these other subjects are often split into different "levels". I found that I didn't have enough familiarity with math to split subjects naturally.

One method that seems promising is taking a bunch of textbooks/courses, and trying to figure out the dependencies between them.
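That last method can be prototyped: write down a guessed prerequisite map between textbooks/subjects and let a topological sort produce a study order. The dependency entries below are illustrative assumptions on my part, not a vetted curriculum; the sketch uses Python's standard-library `graphlib` (3.9+):

```python
from graphlib import TopologicalSorter

# Hypothetical prerequisite map: each subject maps to the set of
# subjects it depends on. Entries are illustrative, not authoritative.
prereqs = {
    "basic mathematics": set(),
    "linear algebra": {"basic mathematics"},
    "multivariate calculus": {"basic mathematics"},
    "probability (intro)": {"basic mathematics"},
    "probability (intermediate)": {"probability (intro)", "linear algebra",
                                   "multivariate calculus"},
    "measure theory": {"multivariate calculus"},
    "probability (measure-theoretic)": {"probability (intermediate)",
                                        "measure theory"},
    "decision theory": {"probability (intermediate)"},
}

# static_order() yields each subject only after all its prerequisites,
# which captures the "iterative levels" of probability theory directly.
order = list(TopologicalSorter(prereqs).static_order())
print(order)
```

Splitting one subject into several "level" nodes, as above with probability, sidesteps the problem of deciding where a subject belongs in a single linear ordering.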

In response to comment by ChristianKl on My Skepticism
Comment author: G0W51 01 February 2015 05:45:29PM 0 points [-]

By realizing that the aforementioned premises seem necessary to prove anything.

In response to comment by g_pepper on My Skepticism
Comment author: G0W51 01 February 2015 05:44:46PM 0 points [-]

But how can I know that I'm acting?

Comment author: dxu 01 February 2015 05:44:44PM *  0 points [-]

So you're saying that taking a few background axioms that are pretty much required to reason... is equivalent to theism.

I think you may benefit from reading The Fallacy of Grey, as well as The Relativity of Wrong.

Comment author: JoshuaZ 01 February 2015 05:43:01PM *  0 points [-]

Question: Does it make sense to post in these threads "I read X, don't bother for the following reasons" or should we only restrict to recommendations of what to read/listen/watch?

Comment author: James_Miller 01 February 2015 05:42:57PM *  1 point [-]

I think Putin wants to rebuild the Soviet Empire and is going to keep taking more territory until he encounters serious resistance, and this resistance could easily turn into a war in which nuclear weapons are used. Putin might rationally calculate that if he tried to conquer Finland (which used to be part of the Czarist Empire) there is only a 10% chance that the U.S. would put up serious resistance, and this was a gamble he would be willing to take. But if the U.S. did decide to fight it would easily beat back Russia if the war stayed conventional, and this might cause Putin to use atomic weapons.

Putin probably calculates that Obama is much less likely to use military force against him than the next U.S. President will be, so we might be entering a period of great danger.

Comment author: dxu 01 February 2015 05:41:34PM 0 points [-]

In what meaningful sense are those two phrasings different?

Comment author: JoshuaZ 01 February 2015 05:41:26PM 0 points [-]

Just read Brandon Sanderson's "Firefight" which is the sequel to Steelheart. Sanderson is as amazing as usual. The books are a very novel take on the idea of superpowers.

Comment author: JoshuaZ 01 February 2015 05:40:28PM 0 points [-]

Read Christopher Nuttall's "Schooled in Magic" series. I'd describe it as HPMoR but with a main character who is a) slightly more mature and b) not nearly as smart or educated. Overall, while I've had a mixed view of a lot of Nuttall's other works I have a high opinion of this one.

Comment author: JoshuaZ 01 February 2015 05:40:23PM 0 points [-]

Currently reading Annie Bellet's "The 20-Sided Sorceress" series which is an urban fantasy setting where the main character grew up thinking of magic in terms of Dungeons and Dragons and uses a D20 talisman to focus her magic. Not too surprisingly it is full of geek-culture references. Overall, amusing.

In response to comment by ike on My Skepticism
Comment author: G0W51 01 February 2015 05:38:02PM 0 points [-]

The linked article stated that one only needs to believe that induction has a non-super-exponentially small chance of working and that a single large ordinal is well-ordered, but it didn't really justify this. It said nothing about why belief in one's percepts and reasoning skills is needed.

Comment author: Vaniver 01 February 2015 04:54:44PM 0 points [-]

I feel like the sibling comment gives some idea of that, but I'll try to explain it more. If you have a collection of worlds, in order to get their probabilistic expectations to line up with experiment you need conditional fractions to hold: conditioned on having been in world A, I am in world B after t time with probability .5 and in world C after t time with probability .5. But the number of worlds that look like B is not constrained by the model, and whether the worlds are stored as "A" or the group of ("AB", "AC") also seems unconstrained (the nonexistence of local variables is different; it just constrains what a "world" can mean).

And so given the freedom over the number of worlds and how they're stored, you can come up with a number of different interpretations that look mathematically equivalent to me, which hopefully also means they're psychologically equivalent.

In response to comment by G0W51 on My Skepticism
Comment author: g_pepper 01 February 2015 04:52:07PM 0 points [-]

I am instead acting as if I thought I was thinking

It seems to me that this statement implies your existence; after all, the first two words are an existential declaration.

Furthermore, just as (per Descartes) cognition implies existence, so it would seem that action implies existence, so the fact that you are acting in a certain way implies your existence. Actio, ergo sum.

Comment author: Alsadius 01 February 2015 04:39:14PM 0 points [-]

Joachim has the relevant bits. I came up with a new term for it, though, because I don't know of a better way to refer to it.

Comment author: Ishaan 01 February 2015 04:30:13PM *  0 points [-]

I'm persuaded of it, but is it really true? I approve of it, but is it really good?

Yes, that is precisely the relevant question - and my answer is that there's no non-circular justification for a mind's thoughts and preferences (moral or otherwise), and so for both practical and theoretical purposes we must operate on a sort of "faith" that there is some validity to at least some of our faculties, admitting that it is ultimately just simple faith that lies at the bottom of it all. (Not faith as in "believing without evidence", but faith as in "there shall never be any further evidence or justification for this, yet I find that I am persuaded of it / do believe it is really good.")

The simple version of internalising truth and goodness by rubber stamping prevailing attitudes is not satisfactory; the complex version......is complex.

It's not really that bad or complex - all you have to believe in is the concept of justification and evidence itself. To attempt to go deeper is to ask justification for the concept of justification, and evidence for what is or is not evidence, and that's nonsense.

Comment author: UnrequitedHope 01 February 2015 04:28:18PM 0 points [-]

My grand plan is starting now - with an extreme epistemic update I was in denial about. Hah. And I thought I left those things behind me!

Comment author: UnrequitedHope 01 February 2015 04:11:36PM *  0 points [-]

Interesting isn't the correct way to describe it - it's simply functional, and in terms of bandwidth, more economical. Serves the machine and the people. Give your AI a shot of that!

I could honestly try to implement it but I'm not sure I have the right skills to make it work beautifully - I place an emphasis on a job-well-done and I feel like I'd just make the site worse overall than someone who does have the technical aptitude to actually implement it.

I hate being the UX guy and hope I can get better this year.

An honest question - has nobody ever thought of this before? Heh. Optimize everything except the site you learned rationality from? MIRI could make an AI paper about that!

EDIT: I will do this anyway - a wise person who's also a programmer told me that if you have the right mindset, interesting problems will find you, so I'm definitely going to pull some hair in an attempt to do it. I just hope I'm not going to run into licensing issues; I'm going to release my heck of a hack under a freedom-respecting license, so if there's a problem, I'll just say Reddit sucks.

Comment author: adamzerner 01 February 2015 04:08:16PM *  1 point [-]

I recently watched the first season of the show Homeland and really liked it. The drama and action were good, but what I liked in particular is that the situations had some moral depth to them, and it isn't perfectly clear who to like/what to root for. I don't want to give anything away, but basically the story line revolves around a recently rescued POW who may have been "turned". If you've seen the show and want to talk about it, message me!

Comment author: Vaniver 01 February 2015 04:01:57PM 0 points [-]

That does seem like an interesting feature! There are resources for making changes to the LW codebase, which are much more likely to result in an actual change than submitting a feature request.

Comment author: polymathwannabe 01 February 2015 04:00:15PM *  0 points [-]

I only watched the episode last weekend, and I enjoyed it very much, except for the part where they're discussing the ways AIs can conclude it's in their best interest to kill us and they're having that conversation in front of Bella, which struck me as a particularly stupid thing to do.

In response to comment by Ishaan on My Skepticism
Comment author: TheAncientGeek 01 February 2015 03:58:00PM *  0 points [-]

However, our concepts of truth and goodness allow us to pose the questions: I'm persuaded of it, but is it really true? I approve of it, but is it really good?

The simple version of internalising truth and goodness by rubber stamping prevailing attitudes is not satisfactory; the complex version......is complex.

Comment author: JoshuaZ 01 February 2015 03:51:53PM *  1 point [-]

Essentially that large-scale, complicated alliances can result in small-scale wars unexpectedly spiraling out of control. There were multiple small conflagrations in the Balkans before WWI, but it took just the right one to set it off. In a similar vein, one wouldn't be surprised if one of the similar small conflagrations currently happening around Russia leads to a Russia v. NATO war with little warning. Similarly, one could expect a comparable situation in the Pacific given the many border conflicts there.

Comment author: JoachimSchipper 01 February 2015 03:51:17PM *  2 points [-]
  • Batman is a murderer no less than the Joker, for all the lives the Joker took that Batman could've saved by killing him. ch. 85
  • "It's not fair to the innocent bystanders to play at being Batman if you can't actually protect everyone under that code." ch. 91
  • Harry had no intention of saying it out loud, of course, but now that he'd failed decisively to prevent any deaths during his quest, he had no further intention of being restrained by the law or even the code of Batman. ch. 97
Comment author: UnrequitedHope 01 February 2015 03:49:59PM *  1 point [-]

Tried making a blog and it wouldn't let me because "karma". Drafts can't be publicly read either so this is the best I can do.

Can we please have a feature where, instead of going through all of user XYZ's posts, I can just see the titles and choose the one I want (or was looking for)?

So it'll be like, instead of:

XYZ's posts

[Title]

[TEXT]

[TEXT]

[TEXT]


[REPEAT]

It'll be

[TITLE WITH LINK]

[TITLE WITH LINK]

[TITLE WITH LINK]

[REPEAT AD EXHAUSTIUM]

Basically just like the sequences, where you have links to the posts themselves rather than the whole damn thing in one page.

And in the case of blogs, make it so that once you have 20 positive karma you can post, regardless of your negative karma. I guess you can sharpen this better than me because I'm probably not going to make a serious post (or one that will be taken seriously) ever. In the case of drafts, make them unlisted and simply give people the ability to link to their own draft and let others view and comment on it.

Comment author: NancyLebovitz 01 February 2015 03:45:38PM 1 point [-]

Could you two go into some specifics of what you're expecting and why?

Comment author: TheAncientGeek 01 February 2015 03:44:31PM *  0 points [-]

That's one of the standard responses to scepticism:

Tu quoque, or performative contradiction.

In response to comment by Eitan_Zohar on My Skepticism
Comment author: Ishaan 01 February 2015 03:33:59PM *  0 points [-]

At first I just made it up, feeling that it was an appropriate name due to the many parallels with moral nihilism; then I googled it, and the description that came up roughly matched what I was talking about, so I just went on using it after that. I'm guessing everyone goes roughly through that process. Normally I add a little disclaimer about not being sure that it is the correct term, but I didn't this time.

I didn't know the term "philosophical skepticism", thanks for giving me the correct one. In philosophy I feel there is generally a problem where figuring out the names that other people - who separately came up with your concept before you did - use to describe it ends up involving more work and reading than just re-doing everything... and at the end of the day others who read your text (as if anyone is reading that closely!) won't understand what you meant unless they too go back and read the citations. So I think it's often better to just throw aside the clutter and start fresh for everything, doing your best with plain English, and it's okay if you redundantly rederive things (many strongly disagree with me here).

I feel that the definition of "Epistemic nihilism" is self evident as long as one knows the words "epistemic" and "nihilism". The term "Skepticism" implies the view that one is asking "how do you know", whereas nihilism implies that one is claiming that there is no fundamental justification of the chosen principles. If indeed I'm describing the same thing, I kinda think "epistemic nihilism" is a more descriptive term from a "plain english" perspective overall.

(Also, re: everyone - I haven't actually seen that term used in the wild by people who are not me unless explicitly googling it. Maybe your impression results from reading my comments somewhere else?)

Comment author: James_Miller 01 February 2015 03:30:58PM 1 point [-]

Reading it has made me strongly update to the probability that there will be a large-scale war in the next few years

Studying WWI has done the same for me.

Comment author: JoshuaZ 01 February 2015 03:21:13PM *  0 points [-]

Currently reading Christopher Clark's "The Sleepwalkers: How Europe Went to War in 1914". So far the book has two upshots. First, nationalism makes people really irrational. No, you think you already know that, but you don't realize how far it can go. No, more irrational than that. Second, it is really easy for things to spiral out of control. Reading it has made me strongly update to the probability that there will be a large-scale war in the next few years, and in general that war might be a major aspect of the Great Filter. A relevant, somewhat critical review of the book is here.

Comment author: JoshuaZ 01 February 2015 03:14:03PM *  1 point [-]

Istvan's entire approach is so ethically disturbing and PR-toxic, it really doesn't help the transhumanist movement to promote him. The first and second of his "Transhumanist Laws" basically amount to just screaming "I defect!" repeatedly. This is unfortunate because he's a decent writer and he also does on occasion make interesting points that I haven't seen elsewhere.

In response to comment by dxu on My Skepticism
Comment author: TheAncientGeek 01 February 2015 03:00:51PM 0 points [-]

It may be unacceptable to ask for justification of axioms, but that does not make it acceptable to assume axioms without justification.

Comment author: ChristianKl 01 February 2015 02:50:10PM 0 points [-]

I don't think academic research has to focus on diseases in the first place. I would appreciate it if more money were invested into finding better ways to measure drug toxicity levels. That's not a disease, and therefore it's underfunded.

DNA sequencing is a success story of the last decades. It's not a disease itself but was a worthwhile investment.

It makes sense to spend money researching AIDS not only because curing AIDS is a good idea. AIDS patients are a population in which you can ethically try a lot of high-risk interventions for interacting with the human immune system.

Comment author: ChristianKl 01 February 2015 02:48:10PM *  0 points [-]

But I have no good explanation for the overfunding of HIV which is a completely preventable disease on the personal level by using a condom and refraining from using IV drugs.

Condom usage reduces the chances of getting infected via sex by ~90%, not 99.9%.

Comment author: advancedatheist 01 February 2015 02:31:11PM 0 points [-]

What Can Supporters Do for Transhumanism? by Zoltan Istvan:

http://www.huffingtonpost.com/zoltan-istvan/what-can-supporters-do-fo_b_6564536.html

Zoltan suggests changing careers, assuming that you can find ones which allow for feasible transhumanist-sounding projects to work on.

But I would add the fallback one of just figuring out how to make and save a lot more money to give you the resources to do some of these things.

Comment author: polymathwannabe 01 February 2015 02:28:28PM 0 points [-]

And this is a terribly important topic how... ?

Comment author: advancedatheist 01 February 2015 02:14:18PM 0 points [-]
In response to comment by Eitan_Zohar on My Skepticism
Comment author: RichardKennaway 01 February 2015 01:51:06PM 0 points [-]

"Epistemic nihilism" is not a name, but a description. Philosophical skepticism covers a range of things, of which this is one.

In response to My Skepticism
Comment author: ChristianKl 01 February 2015 01:36:34PM 0 points [-]

Standard methods of inferring knowledge about the world are based off premises that I don’t know the justifications for.

How do you come to that conclusion?

Comment author: Plasmon 01 February 2015 01:27:27PM 1 point [-]
Comment author: ChristianKl 01 February 2015 12:58:04PM 0 points [-]

No, I didn't say that it's his fault. The main point is that if your goal is raising productivity of intellectuals it's not clear that getting them girls is helpful.

There's the xkcd comic about Debian developer productivity: http://xkcd.com/306/

Comment author: skeptical_lurker 01 February 2015 12:17:47PM 0 points [-]

EY, for better or worse, is generally quite verbose, although out of the whole of HPMOR the only part I felt was superfluous was Hermione vs. the bullies.

In response to comment by Eitan_Zohar on My Skepticism
Comment author: gjm 01 February 2015 12:13:29PM 0 points [-]

Maybe I wasn't clear: I'm questioning whether the premise of your question

Why does everyone refer to it as "epistemic nihilism"?

is correct. I don't think everyone does refer to it that way, whether "everyone" means "everyone globally", "everyone on LW", "everyone in the comments to this post", or in fact anything beyond "one or two people who are making terminology up on the fly or who happen to want to draw a parallel with some other kind of nihilism".

Comment author: skeptical_lurker 01 February 2015 12:13:20PM 0 points [-]

Sure, if you think you have a really good read of the author. But as I said, all Horcruxes are accounted for, and as gjm said, there is a simpler explanation, so I'm sticking by my opinion that Carrow probably isn't a Horcrux, even if he does show up later.

In response to comment by patchClamp on My Skepticism
Comment author: TheAncientGeek 01 February 2015 11:56:48AM *  0 points [-]

Addresses rather than resolves. There are many responses, but no universally satisfactory ones.

Comment author: DanielVarga 01 February 2015 11:55:58AM 0 points [-]

Nice. If we analyze the game using Vitalik's 2x2 payoff matrix, defection is a dominant strategy. But now I see that's not how game theorists would use this phrase. They would work with the full 99-dimensional matrix, and there defection is not a dominant strategy, because as you say, it's a bad strategy if we know that 49 other people are cooperating, and 49 other people are defecting.

There's a sleight of hand going on in Vitalik's analysis, and it is located at the phrase "regardless of one's epistemic beliefs [one is better off defecting]". If my epistemic belief is that 49 other people are cooperating and 49 other people are defecting, then it's not true that defection is my best strategy. Of course, Vitalik's 2x2 matrix just does not allow me to have such refined epistemic beliefs: I have to get by with "attack succeeds" versus "attack fails".

Which kind of makes sense, because it's true that I probably won't find myself in a situation where I know for sure that 49 other people are cooperating, and 49 other people are defecting, so the correct game theoretic definition of dominant strategy is probably less relevant here than something like Vitalik's "aggregate" version. Still, there are assumptions here that are not clear from the original analysis.
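The pivotal-voter point above can be sketched with a toy payoff function. All the numbers here (the base payoff P, the bribe EPS, and the 50-defector success threshold) are illustrative assumptions, not figures from Vitalik's analysis:

```python
# Toy model: 99 players each cooperate or defect; the attack succeeds
# iff at least THRESHOLD players defect. A defector earns a small bribe
# EPS when the attack fails, but everyone's base payoff P is destroyed
# when the attack succeeds. All numbers are illustrative assumptions.
P, EPS = 10.0, 1.0
THRESHOLD = 50  # defectors needed for the attack to succeed

def payoff(my_defect: bool, other_defectors: int) -> float:
    defectors = other_defectors + (1 if my_defect else 0)
    if defectors >= THRESHOLD:               # attack succeeds
        return 0.0
    return P + (EPS if my_defect else 0.0)   # attack fails

# In the coarse 2x2 view ("attack succeeds" vs "attack fails"),
# defecting never pays less, so it looks dominant. With refined beliefs
# about the other 98 players, the pivotal case flips the best response:
assert payoff(True, 10) > payoff(False, 10)  # far from threshold: defect pays
assert payoff(True, 49) < payoff(False, 49)  # pivotal at 49: defecting kills P
```

The second assertion is exactly the 49/49 belief from the comment: my defection is the 50th vote, so defecting is strictly worse, and "defect" fails the game-theoretic definition of a dominant strategy in the full game.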

In response to comment by dxu on My Skepticism
Comment author: TheAncientGeek 01 February 2015 11:46:49AM *  -1 points [-]

Your guesses are about right.

The significance is that if rationalists respond to sceptical challenges by assuming what they can't prove, then they are in the same position as reformed epistemology. That is, they can't say why their axioms are rational, and can't say why theists are irrational, because theists who follow RE are likewise taking the existence of God as something they assume because they can't prove it: rationalism becomes a label with little meaning.

In response to comment by gjm on My Skepticism
Comment author: Eitan_Zohar 01 February 2015 11:21:17AM *  0 points [-]

In general. I hear the word used but I haven't ever encountered it in literature (which isn't very surprising since I haven't read much literature). Seriously, Google 'epistemic nihilism' right now and all you get are some cursory references and blogs.

Comment author: ArisKatsaris 01 February 2015 11:04:02AM 0 points [-]

Short Online Texts Thread

Comment author: ArisKatsaris 01 February 2015 11:03:59AM 0 points [-]

Online Videos Thread

Comment author: ArisKatsaris 01 February 2015 11:03:55AM 0 points [-]

Fanfiction Thread

Comment author: ArisKatsaris 01 February 2015 11:03:52AM 0 points [-]

Nonfiction Books Thread

Comment author: ArisKatsaris 01 February 2015 11:03:48AM 0 points [-]

Fiction Books Thread

Comment author: ArisKatsaris 01 February 2015 11:03:44AM 0 points [-]

TV and Movies (Animation) Thread

Comment author: ArisKatsaris 01 February 2015 11:03:41AM 0 points [-]

TV and Movies (Live Action) Thread

Comment author: ArisKatsaris 01 February 2015 11:03:36AM 0 points [-]

Music Thread

Comment author: ArisKatsaris 01 February 2015 11:03:33AM 0 points [-]

Podcasts Thread

Comment author: ArisKatsaris 01 February 2015 11:03:27AM 0 points [-]

Other Media Thread

Comment author: ArisKatsaris 01 February 2015 11:03:22AM 0 points [-]

Meta Thread

In response to My Skepticism
Comment author: RichardKennaway 01 February 2015 10:27:33AM 0 points [-]
In response to My Skepticism
Comment author: RichardKennaway 01 February 2015 09:55:17AM 0 points [-]

With such skepticism, how are you even able to write anything, or understand the replies? Or do anything at all?

In response to comment by Eitan_Zohar on My Skepticism
Comment author: gjm 01 February 2015 09:33:45AM 0 points [-]

Everyone? In this discussion right here, the only occurrences of the word "nihilism" are in Ishaan's comment and your reply?

Comment author: Manfred 01 February 2015 09:31:59AM *  0 points [-]

Yeah, and if you have hair clippers, they're pretty foolproof - except for trimming around ears, smoothing transitions between different lengths, and dealing with the back of the neck. Which all just seem like garden-variety acquirable skills. It's plausible that I should already own hair clippers and be using them on myself, and that I'm only not doing this because of trivial inconveniences. Hm.

Comment author: passive_fist 01 February 2015 09:24:18AM 0 points [-]

But it's definitely not known, at least not to me, that "people find low status repulsive." At the very least, I'd appreciate some evidence backing this up.

Comment author: RobbBB 01 February 2015 08:28:55AM 2 points [-]

Yay!!!!

Comment author: AnnaSalamon 01 February 2015 08:23:39AM 6 points [-]

Closed for the year at $119,269, which is basically awesome. Thanks, everyone!

Comment author: DanielLC 01 February 2015 08:22:07AM 2 points [-]

Professor Quirrell believes Hermione's death is final. Harry intends to make sure it is not.

Comment author: AnnaSalamon 01 February 2015 07:42:36AM 1 point [-]

18 minutes remaining; $16k of matching funds left unfilled (out of $120k).

Comment author: MarkusRamikin 01 February 2015 07:24:14AM 1 point [-]

Joker Oath? Remind me?

Comment author: MotivationalAppeal 01 February 2015 06:26:42AM 3 points [-]

Sometimes I feel uncomfortable talking to strangers, and will put off scheduling appointments. Today, after a few days of trying to beat myself into getting a haircut at a barber's shop or salon, I decided to cut my own hair instead. I'm very pleased with the results, and I will probably make a habit of cutting my own hair from now on. I know this solution doesn't generalize to other appointments, such as medical examinations, but I'm very glad to have put that one source of distress to rest.

In response to comment by G0W51 on My Skepticism
Comment author: dxu 01 February 2015 06:24:13AM 0 points [-]

Believing that the world is stagnant and that the memories one is currently thinking of are false, and that the memory of having more memories is false, seems to be a simple explanation to the universe.

Not in the sense that I have in mind.

"more likely to result in true beliefs."

Unfortunately, this still doesn't solve the problem. You're trying to doubt everything, even logic itself. What makes you think the concept of "truth" is even meaningful?

Comment author: Desrtopa 01 February 2015 06:17:52AM 0 points [-]

That sounds a lot more like a Rowling type twist than an Eliezer type twist. There are elements that could be interpreted as vague and oblique hints, but it doesn't suggest particularly clever or well-considered behavior on anyone's part.
