I have significantly decreased my participation in LW discussions recently, partly for reasons unrelated to whatever is going on here, but I do have a few issues with the present state of this site, and perhaps they are relevant:
I'm not trying to spawn new contrarians for the sake of having more contrarians, nor do I want to encourage debate for the sake of having more disagreements. What I care about is (me personally as well as this community as a whole) having correct beliefs on the topics that I think are most important, namely the core rationality and Singularity-related topics, and I think having more contrarians who disagree about these core topics would help with that. Your suggestion doesn't seem to help with my goals, or at least it's not obvious to me how it would.
(BTW, I note that you've personally made 2 meta/community posts out of 7, whereas I've only made about 3 out of 58 (plus or minus a few counting errors). So maybe you can give me a pass on this one? :)
LW seems to be slowly becoming self-obsessed.
It waxes and wanes. Try looking at all articles labeled "meta"; there were 10(!) in April of 2009 that fit your description of meta-debates (arguing about the karma system, the proper use of the wiki, the first survey, and an Eliezer post about getting less meta).
Granted, that was near the beginning of Less Wrong... but then there was another burst with 5 such articles in April 2010 as well. (I don't know what it is about springtime...) Starting the Discussion area in September 2010 seems to have siphoned most of it off of Main; there have been 3-5 meta-ish posts per month since then (except for April 2011, in which there were 9... seriously, what the hell is going on here?)
LW seems to be slowly becoming self-obsessed.
I don't see how you could possibly be observing that trend. The earliest active comment threads on Less Wrong were voting / karma debates. Going meta is not only what we love best, it's what we're best at, and that's always been so.
Yes, but the real question is why we love going meta. What is it about going meta that makes it worthwhile to us? Some have postulated that people here are actually addicted to going meta because it is easier to go meta than to actually do stuff; despite the lack of real effort, you can tell yourself that going meta adds significant value, because it changes some insight or process only once yet seems to deliver recurring payoffs every time that insight or process is used again in the future...
...but I have a sneaking suspicion that this theory was just a pat answer offered as a status move, because going meta on going meta puts one in a position of objective examination of mere object-level meta-ness. Understanding something well helps one control the thing understood, and the understanding may have required power over the thing to learn the lessons in the first place. Clearly, ther...
Having more contrarians would be bad for the signal to noise ratio on LW, which is already not as high as I'd like it to be. Can we obtain contrarian ideas more cheaply? For example, one could ask Carl Shulman for a list of promising counterarguments to X, rated by strength, and start digging from there. I'd be pretty interested to hear his responses for X=utilitarianism, the Singularity, FAI, or UDT.
Or designing a mechanism or environment that makes it easier for existent LW contrarians to express their ideas.
(My personal experience is that trying to defend a contrarian position on LW results in a lot of personal cheap shots, unnecessarily-aggressively-phrased counter-affirmations, or needless re-affirmations of the LW consensus. (E.g., I remember one LWer said he was trying to "tar and feather [me] with low-status associations". He was probably exaggerating, but still.) This stresses me out a lot and causes me to make errors in presentation and communication, and needlessly causes me to become adversarial. Now when discussing contrarian topics I start out adversarial in anticipation of personal cheap shots et cetera. Most of the onus is on me, but still, I think higher general standards or some sideways change in the epistemic environment could make constructive contrarianism a less stressful role for LWers to take up.)
(FWIW Vassar, Carl, and Rayhawk (in ascending order of apparent neuroticism) are traditionally most associated with constructing steel men. (Or as I think Vassar put it, "steel men, adamantium men, magnetic monopolium men", respectively.))
Mm, on second reading I think you're right. "Vastly higher quality than anything we would be likely to get from the best contrarians we can find" comes across to me as having too many superlatives to be meant seriously. But "not-sarcastic" fits my model of lukeprog better.
(I was also influenced by it being at -1 when I replied. There's probably a lesson in contrarianism to be taken from that...)
I disagree with quite a lot of the LW consensus, but I haven't really expressed my criticisms in the few comments I've made. I differ substantially from the Sequences' line on metaethics, reductionism, materialism, epistemology, and even the concept of truth. My views on these things are similar in many respects to those of Hilary Putnam and even Richard Rorty. Those of you familiar with the work of these gentlemen will know how far off the reservation this places me. For those of you who are not familiar with this stuff, I guess it wouldn't be a stretch to describe me as a postmodernist.
I initially avoided voicing my disagreements because I suspect that my collection of beliefs is not only regarded as false by this community, but also as a fairly reliable indicator of woolly thinking and a lack of technical ability. I didn't want to get branded right off the bat as someone not worth engaging with. The thought was that I should first establish some degree of credibility within the community by restricting myself to topics where the inferential distance between the average LWer and me is small. I think wannabe contrarians entering into any intellectual community should be encouraged to expe...
There's one tactic that's worked well to get LW posts on neglected topics: having a competition for the best post on a subject. A $100 prize resulted in some excellent posts on efficient charity, and the Quantified Health Prize (substantially more money) led to some good analyses of the data on dietary supplementation.
What about having a contest for the best contrarian post on topic X? Personally, I'd chip in a few bucks for a good contrarian post on intelligence explosion, the mathematical universe, the expected value of x-rationality, and other topics.
(I had this idea after reading this comment, and now that I think of it I'm reminded of ciphergoth's survey of anti-cryonics writing as well.)
Stream of consciousness. Judge me that ye may be judged. If you judge it by first-level Less Wrong standards, it should be downvoted (vague unjustified assertions, thoughtlessly rude), but maybe the information is useful. I look first for the heavily downvoted posts and enjoy the responses to them best.
I found the discussion on dietary supplementation interesting, in your link and elsewhere. As I recall, the tendency was for the responses (not entrants, but people's comments around town) to be both crazy and stupid (with many exceptions, e.g., Yvain, Xacharaiah). I recall another thread on the topic where the correct comment ("careful!") was downvoted and its obvious explanation ("evolution works!") offered afterward was upvoted. Since I detected no secondary reasons for this, it was interesting in implying that Less Wrongians did not see the obvious. Low certainties attached, since I know I know nothing about this place. I'm deliberately being vague.
In general, Less Wrongians strike me as a group of people of impaired instrumental rationality who are working to overcome it. Give or take, most of you seem to be smarter than average but also less trustworthy,...
A whole lot of Less Wrong seems to be going for less detail, less knowledge, more use of frameworks of universal applicability and little precision. The sequences seem similar to me: Boring where I can judge meaning, meaningless where I can't. And always too long. I've read about four paragraphs of them in total. The quality of conversation here is high for a blog, of course, but low for a good academic setting. Some of the mild sneering at academics around here sounds ridiculous (an AI researcher believes in God). AI's a weak field. All round, papers don't quite capture any field and are often way way behind what people roughly feel.
This. A thousand times this. As a lawyer, I find that LessWrong pattern-matches to people outside a complicated field who are convinced that those in the field are idiots because, to the observer, "the field is not that complicated."
That said, "Boring where I can judge meaning, meaningless where I can't" is an unfair criticism. Lots of really excellent ideas seem boring once you have already internalized the core ideas.
Reminds me of part of a comment on Moldbug's blog, by Nick Szabo:
...[legal reasoning]
It's a disciplined and competitive (dialectic, in the true original sense of that term) use of analogies, precedents, and emergent rules, far more sophisticated than normal use of analogy and metaphor. I learned it my first year of law school and it's a radically different kind of thinking I had never encountered before in school. The Bayesian bloggers seem to be completely oblivious to it, and to the tremendous value of tradition generally. That makes them, from my POV, culturally illiterate and incompetent to opine on law or politics. Yes, legal training also made me stuck up. :-)
If you can't afford law school, you can learn most of what you need to know from Legal Method and Writing by Charles R. Calleros and a first year law school common law casebook (Torts, Property, or Contracts).
The extremely short description of legal or scholastic reasoning is to think of a proposition or dispute as Schrödinger's cat: both true and false at the same time, or each party both at fault and not at fault at the same time, or whatever the appropriate dichotomy is. Then gather all the moral or legal disputes that are similar to this one.
One relevant dynamic is the following: if an idea is considered "absurd" to the mainstream, there will be very few people who take the idea seriously yet disagree with it. Social pressure forces polarization: if you're going to disagree with it, you might as well agree with all your normal friends that the idea is kooky.
Thus it's especially hard to find good contrarians for a forum that takes several "absurd" positions.
Upvote if you generally no longer post or discuss opinions that disagree with LW consensus.
Feel free to leave a comment on your experiences and reasons for this.
(If you would like to downvote this poll, please downvote the karma balance below instead, so that we can still get an accurate idea of the number of people who have this reaction.)
If we have less contrarianism than is optimal, it seems like the root of the problem is that people often vote for agreement rather than for expected added value. I would start looking there for a solution.
Also, the site would be able to absorb more contrarians if their bad contributions didn't cause as much damage. It would help if we exercised better judgment in deciding when a criticism is worth engaging with and when we should just stop feeding the trolls.
Change the mouseovers on the thumbs-up/thumbs-down icons from "Vote up"/"Vote down" to "More like this"/"Less like this". I've suggested this before and it got upvotes, I suggest now it might be time to implement it.
Stupid alternative: Instead of up/down, have blue/green. Let chaos reign as people arbitrarily assign meaning.
Predicted outcome: within a couple of weeks, blue/green will have understood but undocumented positive/negative associations. Votes will be noisier, though, thanks mostly to confused newcomers and the occasional contrarian pursuing an idiosyncratic interpretation. Complaints about downvotes, and color politics jokes, will both become more common.
p = 0.7, contingent on implementation, for the core claim; 0.5-0.6 range for the corollaries.
0.7 strikes me as low.
Proposed chaotic refinement: Blue/green, but switch them every 18 to 30 hours (randomly sampled, uniform distribution).
(ETA: Upon reflection days or weeks would be better, to increase chaos/noise ratio. Would also work better with prominent "top contributors for last 30 days" lists for both blue and green, and more adulation/condemnation based on those lists.)
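(For concreteness, a minimal sketch of that switching schedule; Python, with the 18-30 hour window from the comment above, and a function name of my own invention:)

    import random

    # Delay until the next blue/green swap: sampled uniformly between
    # 18 and 30 hours, per the proposal above. The ETA's "days or weeks"
    # variant just widens the window.
    def hours_until_next_swap(low: float = 18.0, high: float = 30.0) -> float:
        return random.uniform(low, high)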
I think of it as "Pay more attention to this" / "Pay less attention to this." Communicating primarily to other readers rather than to posters.
Others already noted that we need contrary opinions more than contrarian people per se. Let me make another distinction. Is the goal a community with a diverse set of opinions, or more people who are vocal and articulate about some minority opinion? Maybe the latter goal is worth working on, but I suspect the former has already been reached. Let me go with myself as an example. I don't think anybody ever saw any of my comments as contrarian, and I am sure nobody associates my nick with contrarianism. The thing is: I would bet against Many Worlds. I am not a consequentialist. I am not really interested in cryonics. I think the flavor of decision theory practiced here is just cool math without foreseeable applications. I give very low probability to FOOM. I think FAI as a goal is unfeasible, for more than one reason.
I am not vocal at all about these positions, and you will very rarely see me engage in loud debates. But I state my position when I feel like it, and I was never punished for that. (I don't have a single negatively-voted comment out of a few hundred.) I think we would see a similar pattern when checking the positions of other individual "non-contrarian" commenters.
Me too:
I would bet against Many Worlds. I am not a consequentialist. I am not really interested in cryonics. I think the flavor of decision theory practiced here is just cool math without foreseeable applications. I give very low probability to FOOM. I think FAI as a goal is unfeasible, for more than one reason.
I used to be very active on Less Wrong, posting one or two comments every day, and a large fraction of my comments (especially at first) expressed disagreement with the consensus. I very much enjoyed the training in arguing more effectively (I wanted to learn to be more comfortable with confrontation) and I even more enjoyed assimilating the new ideas and perspectives of Less Wrong that I came to agree with.
But after a long while (about two years), I got really, really bored. I visit from time to time just to confirm that, yes, indeed, there is nothing of interest for me here. Well, I'm sure that's no big deal: people have different interests and they are free to come and go.
This is the first post that has interested me in a while, because it gives me a reason to analyze why I find Less Wrong so boring. I would consider myself the type of "reasonable contrarian"...
I would prefer an increase in 'question' (problem) posts, as opposed to 'statement' (solution) posts, contrarian or no.
Most of the machine intelligence folk don't seem to be on "your" side. I think they see you as potential competitors who don't share their values.
I tend to be more sympathetic to their position than yours. In particular I don't seem to share your values, and don't much like your PR - or your "end of the world" propaganda. I think that developing in secret is a pretty dubious plan - and that the precautionary principle sucks.
Probably the best thing about you is that you have Eliezer on your side - and he's a smart cookie. However, that aspect also appears to have its downsides.
It took me much longer than it should have to mentally move you from the "troll" category to the "contrarian" one. That's my fault, but it makes for an interesting case study:
I quickly got irritated that you made the same criticisms again and again, without acknowledging the points people had argued against you each time. To a reader who disagrees with you, that style looks like the work of a troll or crank; to a reader who agrees with you, it's the best that you can do when arguing against someone more eloquent, with a bigger platform, who's gone wrong at some key step.
It should be noted that I don't instinctively think any more highly of contrarians who constantly change their line of attack; it seems to be a "damned if you do, damned if you don't" tribal response.
The way I changed my mind was that you made an incisive comment about something that wasn't part of your big disagreement with the Less Wrong community, and I was forced to update. For any would-be respected contrarians out there, this might be a good tactic to circumvent our natural impulse towards closing ranks.
I know basically nothing about modern Catholics, actually, which is a big reason why I haven't yet converted. E.g. I have serious doubts about the goodness of the Second Vatican Council. If the Devil has seriously tainted the temporal Church then I want no part in it.
Considering this among other things, I want to see the contrarian awesomeness that would be you writing a series of posts on the Orthosphere explaining your positions and theories regarding the Church and global history.
Regardless of whether this turned out to be an epic troll or the birth of a new cult, it would be extremely entertaining.
Perhaps we have this backwards?
If there is something intrinsically valuable about controversy (and I'm not really sure that there is, but I'm willing to accept the premise for the sake of discussion), and we're not getting the optimal level of controversy on the topics we normally discuss (again, not sure I agree, but stipulated), then perhaps what we should be doing is not looking for "more and better contrarians" who will disagree with us on the stuff we have consensus on, but rather starting to discuss more difficult topics where there is less consensus.
One problem is, of course, that some of us are already worried that LW is too weird-sounding and not sufficiently palatable to the mainstream, for example, and would probably be made uncomfortable if we explore more controversial stuff... it would feel too much like going to school in a clown suit. And moving from areas of strength to areas of weakness is always a little scary, and some of us will resist the transition simply for that reason. And many more.
Still, if you can make a case for the value of controversy, you might find enough of us convinced by that case to make that transition.
Here's a case for the value of controversy.
In other words, even if you believe that each item of LessWrong consensus is almost certain to be correct, you should still be doubtful that every item of LessWrong consensus is likely to be correct. And if there are significant errors, then how else will they be found and publicized other than via a controversial discussion?
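(A toy illustration, with numbers of my own choosing and assuming independence: if each of 20 consensus items is 95% likely to be correct, the probability that all 20 are correct is only 0.95^20 ≈ 0.36, so at least one significant error is more likely than not.)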
Idea: Using Contrary Opinions as a Group Rationality Exercise
Sometimes when I'm discussing issues one-on-one with someone of a different opinion, I will find myself treating arguments as soldiers (I am improving at catching myself in this, I think). I can also have difficulty verbalizing what is wrong with an argument when put on the spot.
Maybe we can use "Devil's Advocating" posts as a group exercise in rationality. Someone can read or summarize a specific opposing viewpoint that they do not necessarily agree with (maybe subjectivism, or Kuhn's scientific revolutions). They could hopefully even get completely new material, in order to provide practice in a field we haven't discussed.
They will present the strongest summary they can in a post, writing as if they fully supported the idea. The tag [Devil's Advocating] can be used to show that this is what they are doing.
One comment thread can be devoted to finding arguments that the viewpoint covers strongly. (i.e. maybe subjectivism handles a specific question a little better than most other philosophies, or maybe Kuhn's revolutions provide a better explanation of the different types of science that scientists engage in...
I would love to be better at contrarianism, but I don't know where to begin.
I got where I am today mostly through trial and error.
The General Contrarian Heuristic:
Assume that the such-and-such people who claim to be right actually are at-least-somewhat-straightforwardly right, and that they have good evidence or arguments that you're just not aware of. (There are many plausible reasons for your ignorance; e.g. for the longest time I thought Christianity and ufology were just obviously stupid, because I'd only read atheist/skeptic/scientismist diatribes. What evidence filtered evidence?) What is the most plausible evidence or argument that can be found while searching in good faith? This often splits in two directions:
"May we not forget interpretations consistent with the evidence, even at the cost of overweighting them."
Upvoted. The easiest way to get the wrong answer is to never have considered the right answer.
I've always thought that imagination belonged on the list of rationalist virtues.
For comparison, the General Chess Heuristic: Think about a move you could make, think about the moves your opponent could make in reply, think about what moves you could make if they replied with any of those candidate moves, &c.; evaluate all possible resultant positions, subject to search heuristics and time constraints.
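(That heuristic is essentially minimax search. A toy illustration in Python on a hand-built game tree, not real chess; the leaf numbers are made-up position evaluations:)

    # Minimax on a toy game tree: a node is either a leaf score (an
    # already-evaluated position) or a list of child nodes (available moves).
    def minimax(node, maximizing):
        if isinstance(node, (int, float)):  # leaf: just read off the evaluation
            return node
        # The step novices skip: the opponent searches too,
        # minimizing my score on their turns.
        scores = [minimax(child, not maximizing) for child in node]
        return max(scores) if maximizing else min(scores)

    # My two candidate moves, each with two possible opponent replies:
    tree = [[3, 5],   # move A: opponent replies to leave me with 3
            [2, 9]]   # move B: opponent replies to leave me with 2
    assert minimax(tree, maximizing=True) == 3  # so move A is the better move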
What's interesting is that novice chess players reliably forget to even consider what moves their opponent could make; their thought process barely includes the opponent's possible thought process as a fundamental subroutine. I think novice rationalists make the same error (where "opponent" is "person or group of people who disagree with me"), and unfortunately, unlike in chess, they don't often get any feedback alerting them to their mistake.
(Interestingly, Roko once almost defeated me in chess despite having significantly less experience than me, because he just thought really hard and reliably calculated a ton of lines. I'd never seen anyone do that successfully, and was very impressed. I would've lost except he made a silly blunder in the endgame. He who has ears to hear, let him hear.)
Any extreme minority position would take a long time to win converts. People are generally wrong because they have bad concepts, not because they have clear concepts but mistakenly think 2+2=5.
It takes a while to penetrate poor concepts, and the people with poor concepts have to be willing to put in the effort to justify their argument, and not just take it as a given that it is up to someone else to refute their nonsense, because you can't refute gibberish. Most people here are intellectually confident. Add to that the consensus of the group, and who is ...
Maybe we could have a "contrarian of the month" award? This could also encourage normally agreeable Less Wrong users to argue against consensus positions in hopes of winning the award.
Maybe we could have a "contrarian of the month" award?
Can we please not do this? I already feel a pre-emptive contrarian outrage against whatever consensus is arrived at when awarding this official "contrarian" award. Then I start thinking of court jesters. This is a way to get people to think in the predetermined 'outside-the-box' box and to change their 'mainstream' uniform for the 'rebel' uniform. That's not the way to get useful contrarians.
This could also encourage normally agreeable Less Wrong users to argue against consensus positions in hopes of winning the award.
You're advocating this as a good thing?
I would change the rules to go something like this: Write a one sentence summary of your conclusion first, in as shocking terms as possible. Get people to vote up or down based on whether they agree with the initial one sentence summary. Then you justify the one sentence summary in subsequent paragraphs, which might cause folks to change their mind. That way we could get novel but possibly true beliefs in addition to irrational beliefs at the top.
Or rethink the game entirely along these lines so it is the "More Plausible Than I Initially Thought Game", so we don't get things like UFOs at the top. Participants upvote those comments that cause the maximum change to their beliefs, especially by making something surprising seem at least vaguely plausible. I dislike the current game rules somewhat because it seems like a signaling fest.
Awarded to a nonconformist in black or a nonconformist in a clown suit? The latter is likely to get the tone argument (where someone's claimed rejection is the tone of the statement rather than its content).
Suggestion: whenever you're tempted to respond with a tone argument ("stop being so rude/dismissive/such a flaming arsehole/etc"), try really hard to respond to the substance as if the tone is lovely. The effort will net you upvotes ;-)
I don't like contrarians, but I think honest and fundamental dissent is vital.
A recent development in applied psychology is that small incentives can have large consequences. I think the upvote/downvote ratio is underestimated in importance. The ratio currently is obviously greater than 1; I don't know how much greater. (Who does?) This creates an asymmetry in which below zero, each downvote has disproportionate stigmatizing power, creating an atmosphere of apprehension among dissenters. The complexion of postings might change if downvoting and upvoting r...
We need a handy way of saying "Yes I understand the standard arguments for P but I still think it's worth your while considering this argument for ¬P rather than just telling me the standard arguments for P."
Unfortunately it may be that the only credible signal of this is to first outline the standard arguments for P.
I think the kind of people you're looking for are rare in general, so it shouldn't be a surprise that they are rare on LW.
That said, there's room for improvement. The karma system only allows for one kind of vote. It could be more like Slashdot and allow for tagging of the vote, or better yet allow for up/down voting in several different categories. If a comment is IMO well worded, clear, logical, and dead wrong, then it's probably worth reading, but not worth believing. Right now all I can do is vote it up or down. I'd like to be able to vote for clar...
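(A minimal sketch of what per-category voting might look like; Python, with category names that are my own illustrative guesses:)

    from collections import Counter

    CATEGORIES = ("clarity", "logic", "accuracy")  # hypothetical vote categories

    class CommentVotes:
        """Per-category vote tallies for a single comment."""
        def __init__(self):
            self.tallies = Counter()

        def vote(self, category: str, direction: int) -> None:
            assert category in CATEGORIES and direction in (-1, +1)
            self.tallies[category] += direction

    # "Well worded, clear, logical, and dead wrong":
    votes = CommentVotes()
    votes.vote("clarity", +1)
    votes.vote("logic", +1)
    votes.vote("accuracy", -1)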
I think we can see now how the situation evolved: SI ignored what 'contrarians' (the mainstream) said, the views they formed after reading SI's arguments, etc.
SI then went to talk to GiveWell, and the presentation resulted in Holden forming the same view - if you strip his statement down to the bare bones, he says that he thinks giving money to SI results in either no change or an increase in risk, as the approach SI advocates is more dangerous than the current direction, and the rationale given has already been available (but has been ignored).
Ultimately, it may b...
Haven't read it yet, but you can start by not calling anyone who disagrees with the established view a contrarian. It implies anyone who disagrees is doing so to play out a role rather than out of actual disagreement.
edit: so it seems that people playing out a role are exactly what you want more of. I assumed you were using "how can we get more contrarians" as codespeak for "how can we get more disagreement". If you just want more actual "contrarians", well, I'm not sure "contrarians" is a real category. In any case it's not ...
I don't see a problem with driving "contrarians" away. That is what we should be doing.
To be a "contrarian" is to have written a bottom line already: disagree with everything everyone else agrees with.
To be a "contrarian" among smart people is to adopt reversed intelligence as a method of intelligence.
To be a "contrarian" among stupid people is, like American football, something that you have to be smart enough to do but stupid enough to think worth doing.
To be a "contrarian" is to limit oneself to writing ...
Yes, being a "contrarian" is irrational for the individual, but may be good for the group. I wouldn't try to turn someone into a "contrarian" for my own benefit, but I don't feel qualms about making better use of people who already are.
This could be rephrased more positively :D
If someone has something they may well be right about, and you don't learn it, that's a problem. Or if they make an argument that you know is wrong from parallel lines of evidence but can't say why it's wrong, that's a slightly smaller problem. And it's a problem with you, not with them. This is a general principle of disagreement. This post is the charge that we are bad at learning from people.
Hmm. Or maybe that's not right. We could be learning from them (on average), but still driving them away because what seemed like constructive argument from one side didn't from the other. In which case, that's fine and you shouldn't listen to this comment :P
It's so difficult to find someone who will communicate on our level and yet disagrees on object-level things.
Probably the best way to get more contrarians, is for folks from Less Wrong to learn from people outside the community, change their own beliefs because of it, and come back to share their wisdom with the masses.
Okay, that sounded better in my head too.
It's so difficult to find someone who will communicate on our level and yet disagrees on object-level things.
Is this because people smart enough to communicate on our level largely agree with a lot of what is generally agreed on here, for the same reason that most people all agree that 2+2=4?
Or is it because LessWrong is, for reasons unconnected with rationality, largely drawn from a certain very narrow demographic range, who grab onto this constellation of ideas like an enzyme to its substrate, and "communicating on our level" just means being that sort of person?
To what degree should the lack of good contrarians be taken as evidence that LW "consensus" (scare quotes because the like-mindedness of this community is overestimated [1]) is true?
People are always talking about how the Less Wrong arguments are good viewed from the inside but not the outside, so this question is important: it is an outside-view consideration that, unlike most others, comes out in favor of the Less Wrong mentality, which is usually justified only from inside the arguments.
Not only that, but often in an uninformative and confrontational manner, which poses the problem of how to respond so as to generate better contrarianism.
Of course what is optimal might be open to debate, but from my perspective, it can't be right that my own criticisms are valued so highly (especially since I've been moving closer to the SingInst "inner circle" and my critical tendencies have been decreasing). In the spirit of making oneself redundant, I'd feel much better if my occasional voice of dissent is just considered one amongst many.
Tangentially related: I was in the HPMOR thread and noticed that there's a strong tendency to reward good answers but only a weak tendency to reward good questions. The questions are actually more important than the answers because they're a prerequisite to the answers, but they don't seem to be being treated as such. They have roughly half as much reputation as the popular answers do, which seems unfair.
I would guess that this extends to the rest of the site as well, as it's a fairly common thing that humans do. Things would probably be better here if we ...
I've noticed there have been a dozen or more threads and suggestions like this one; has anything ever come of them? These suggestions are starting to look like simple opportunities for circle jerking. Who would even decide on and implement these things? Yudkowsky?
Somebody who is right does not need a contrarian that badly; someone who is wrong does. But just about everybody thinks his own contrarian is not a particularly good one.
I'm worried that LW doesn't have enough good contrarians and skeptics, people who disagree with us or like to find fault in every idea they see, but do so in a way that is often right and can change our minds when they are. I fear that when contrarians/skeptics join us but aren't "good enough", we tend to drive them away instead of improving them.
For example, I know a couple of people who occasionally had interesting ideas that were contrary to the local LW consensus, but were (or appeared to be) too confident in their ideas, both good and bad. Both people ended up being repeatedly downvoted and left our community a few months after they arrived. This must have happened more often than I have noticed (partly evidenced by the large number of comments/posts now marked as written by [deleted], sometimes with whole threads written entirely by deleted accounts). I feel that this is a waste that we should try to prevent (or at least think about how we might). So here are some ideas: