So this is Utopia, is it?  Well
I beg your pardon, I thought it was Hell.
        -- Sir Max Beerholm, verse entitled
        In a Copy of More's (or Shaw's or Wells's or Plato's or Anybody's) Utopia

This is a shorter summary of the Fun Theory Sequence with all the background theory left out - just the compressed advice to the would-be author or futurist who wishes to imagine a world where people might actually want to live:

  1. Think of a typical day in the life of someone who's been adapting to Utopia for a while.  Don't anchor on the first moment of "hearing the good news".  Heaven's "You'll never have to work again, and the streets are paved with gold!" sounds like good news to a tired and poverty-stricken peasant, but two months later it might not be so much fun.  (Prolegomena to a Theory of Fun.)
  2. Beware of packing your Utopia with things you think people should do that aren't actually fun.  Again, consider Christian Heaven: singing hymns doesn't sound like loads of endless fun, but you're supposed to enjoy praying, so no one can point this out.  (Prolegomena to a Theory of Fun.)
  3. Making a video game easier doesn't always improve it.  The same holds true of a life.  Think in terms of clearing out low-quality drudgery to make way for high-quality challenge, rather than eliminating work.  (High Challenge.)
  4. Life should contain novelty - experiences you haven't encountered before, preferably teaching you something you didn't already know.  If there isn't a sufficient supply of novelty (relative to the speed at which you generalize), you'll get bored.  (Complex Novelty.)
  5. People should get smarter at a rate sufficient to integrate their old experiences, but not so much smarter so fast that they can't integrate their new intelligence.  Being smarter means you get bored faster, but you can also tackle new challenges you couldn't understand before.  (Complex Novelty.)
  6. People should live in a world that fully engages their senses, their bodies, and their brains.  This means either that the world resembles the ancestral savanna more than, say, a windowless office; or alternatively, that brains and bodies have changed to be fully engaged by different kinds of complicated challenges and environments.  (Fictions intended to entertain a human audience should concentrate primarily on the former option.)  (Sensual Experience.)
  7. Timothy Ferriss:  "What is the opposite of happiness?  Sadness?  No.  Just as love and hate are two sides of the same coin, so are happiness and sadness...  The opposite of love is indifference, and the opposite of happiness is - here's the clincher - boredom...  The question you should be asking isn't 'What do I want?' or 'What are my goals?' but 'What would excite me?'...  Living like a millionaire requires doing interesting things and not just owning enviable things."  (Existential Angst Factory.)
  8. Any particular individual's life should get better and better over time.  (Continuous Improvement.)
  9. You should not know exactly what improvements the future holds, although you should look forward to finding out.  The actual event should come as a pleasant surprise.  (Justified Expectation of Pleasant Surprises.)
  10. Our hunter-gatherer ancestors strung their own bows, wove their own baskets and whittled their own flutes; then they did their own hunting, their own gathering and played their own music.  Futuristic Utopias are often depicted as offering more and more neat buttons that do less and less comprehensible things for you.  Ask not what interesting things Utopia can do for people; ask rather what interesting things the inhabitants could do for themselves - with their own brains, their own bodies, or tools they understand how to build.  (Living By Your Own Strength.)
  11. Living in Eutopia should make people stronger, not weaker, over time.  The inhabitants should appear more formidable than the people of our own world, not less.  (Living By Your Own Strength; see also, Tsuyoku Naritai.)
  12. Life should not be broken up into a series of disconnected episodes with no long-term consequences.  No matter how sensual or complex, playing one really great video game after another does not make a life story.  (Emotional Involvement.)
  13. People should make their own destinies; their lives should not be choreographed to the point that they no longer need to imagine, plan and navigate their own futures.  Citizens should not be the pawns of more powerful gods, still less their sculpted material.  One simple solution would be to have the world work by stable rules that are the same for everyone, where the burden of Eutopia is carried by a good initial choice of rules, rather than by any optimization pressure applied to individual lives.  (Free to Optimize.)
  14. Human minds should not have to play on a level field with vastly superior entities.  Most people don't like being overshadowed.  Gods destroy a human protagonist's "main character" status; this is undesirable in fiction and probably in real life.  (E.g.:  C. S. Lewis's Narnia, Iain Banks's Culture.)  Either change people's emotional makeup so that they don't mind being unnecessary, or keep the gods way off their playing field.  Fictional stories intended for human audiences cannot do the former.  (And in real life, you probably can have powerful AIs that are neither sentient nor meddlesome.  See the main post and its prerequisites.)  (Amputation of Destiny.)
  15. Trying to compete on a single flat playing field with six billion other humans also creates problems.  Our ancestors lived in bands of around 50 people.  Today the media is constantly bombarding us with news of exceptionally rich and pretty people as if they lived next door to us; and very few people get a chance to be the best at any specialty.  (Dunbar's Function.)
  16. Our ancestors also had some degree of genuine control over their band's politics.  Contrast to modern nation-states where almost no one knows the President on a personal level or could argue Congress out of a bad decision.  (Though that doesn't stop people from arguing as loudly as if they still lived in a 50-person band.)  (Dunbar's Function.)
  17. Offering people more options is not always helping them (especially if the option is something they couldn't do for themselves).  Losses are more painful than the corresponding gains, so if choices are different along many dimensions and only one choice can be taken, people tend to focus on the loss of the road not taken.  Offering a road that bypasses a challenge makes the challenge feel less real, even if the cheat is diligently refused.  It is also a sad fact that humans predictably make certain kinds of mistakes.  Don't assume that building more choice into your Utopia is necessarily an improvement because "people can always just say no".  This sounds reassuring to an outside reader - "Don't worry, you'll decide!  You trust yourself, right?" - but might not be much fun to actually live with.  (Harmful Options.)
  18. Extreme example of the above: being constantly offered huge temptations that are incredibly dangerous - a completely realistic virtual world, or very addictive and pleasurable drugs.  You can never allow yourself a single moment of willpower failure over your whole life.  (E.g.:  John C. Wright's Golden Oecumene.)  (Devil's Offers.)
  19. Conversely, when people are grown strong enough to shoot off their feet without external help, stopping them may be too much interference.  Hopefully they'll then be smart enough not to:  By the time they can build the gun, they'll know what happens if they pull the trigger, and won't need a smothering safety blanket.  If that's the theory, then dangerous options need correspondingly difficult locks.  (Devil's Offers.)
  20. Telling people truths they haven't yet figured out for themselves is not always helping them.  (Joy in Discovery.)
  21. Brains are some of the most complicated things in the world.  Thus, other humans (other minds) are some of the most complicated things we deal with.  For us, this interaction has a unique character because of the sympathy we feel for others - the way that our brain tends to align with their brain - rather than our brain just treating other brains as big complicated machines with levers to pull.  Reducing the need for people to interact with other people reduces the complexity of human existence; this is a step in the wrong direction.  For example, resist the temptation to simplify people's lives by offering them artificially perfect sexual/romantic partners.  (Interpersonal Entanglement.)
  22. But admittedly, humanity does have a statistical sex problem: the male distribution of attributes doesn't harmonize with the female distribution of desires, or vice versa.  Not everything in Eutopia should be easy - but it shouldn't be pointlessly, unresolvably frustrating either.  (This is a general principle.)  So imagine nudging the distributions to make the problem solvable - rather than waving a magic wand and solving everything instantly.  (Interpersonal Entanglement.)
  23. In general, tampering with brains, minds, emotions, and personalities is way more fraught on every possible level of ethics and difficulty, than tampering with bodies and environments.  Always ask what you can do by messing with the environment before you imagine messing with minds.  Then prefer small cognitive changes to big ones.  You're not just outrunning your human audience, you're outrunning your own imagination.  (Changing Emotions.)
  24. In this present world, there is an imbalance between pleasure and pain.  An unskilled torturer with simple tools can create worse pain in thirty seconds, than an extremely skilled sexual artist can create pleasure in thirty minutes.  One response would be to remedy the imbalance - to have the world contain more joy than sorrow.  Pain might exist, but not pointless endless unendurable pain.  Mistakes would have more proportionate penalties:  You might touch a hot stove and end up with a painful blister; but not glance away for two seconds and spend the rest of your life in a wheelchair.  The people would be stronger, less exhausted.  This path would eliminate mind-destroying pain, and make pleasure more abundant.  Another path would eliminate pain entirely.  Whatever the relative merits of the real-world proposals, fictional stories cannot take the second path.  (Serious Stories.)
  25. George Orwell once observed that Utopias are chiefly concerned with avoiding fuss.  Don't be afraid to write a loud Eutopia that might wake up the neighbors.  (Eutopia is Scary; George Orwell's Why Socialists Don't Believe in Fun.)
  26. George Orwell observed that "The inhabitants of perfect universes seem to have no spontaneous gaiety and are usually somewhat repulsive into the bargain."  If you write a story and your characters turn out like this, it probably reflects some much deeper flaw that can't be fixed by having the State hire a few clowns.  (George Orwell's Why Socialists Don't Believe in Fun.)
  27. Ben Franklin, yanked into our own era, would be surprised and delighted by some aspects of his Future.  Other aspects would horrify, disgust, and frighten him; and this is not because our world has gone wrong, but because it has improved relative to his time.  Relatively few things would have gone just as Ben Franklin expected.  If you imagine a world which your imagination finds familiar and comforting, it will inspire few others, and the whole exercise will lack integrity.  Try to conceive of a genuinely better world in which you, yourself, would be shocked (at least at first) and out of place (at least at first).  (Eutopia is Scary.)
  28. Utopia and Dystopia are two sides of the same coin; both just confirm the moral sensibilities you started with.  Whether the world is a libertarian utopia of government non-interference, or a hellish dystopia of government intrusion and regulation, you get to say "I was right all along."  Don't just imagine something that conforms to your existing ideals of government, relationships, politics, work, or daily life.  Find the better world that zogs instead of zigging or zagging.  (To safeguard your sensibilities, you can tell yourself it's just an arguably better world but isn't really better than your favorite standard Utopia... but you'll know you're really doing it right if you find your ideals changing.)  (Building Weirdtopia.)
  29. If your Utopia still seems like an endless gloomy drudgery of existential angst no matter how much you try to brighten it, there's at least one major problem that you're entirely failing to focus on.  (Existential Angst Factory.)
  30. 'Tis a sad mind that cares about nothing except itself.  In the modern-day world, if an altruist looks around, their eye is caught by large groups of people in desperate jeopardy.  People in a better world will not see this:  A true Eutopia will run low on victims to be rescued.  This doesn't imply that the inhabitants look around outside themselves and see nothing.  They may care about friends and family, truth and freedom, common projects; outside minds, shared goals, and high ideals.  (Higher Purpose.)
  31. Still, a story that confronts the challenge of Eutopia should not just have the convenient plot of "The Dark Lord Sauron is about to invade and kill everybody".  The would-be author will have to find something slightly less awful for his characters to legitimately care about.  This is part of the challenge of showing that human progress is not the end of human stories, and that people not in imminent danger of death can still lead interesting lives.  Those of you interested in confronting lethal planetary-sized dangers should focus on present-day real life.  (Higher Purpose.)

The simultaneous solution of all these design requirements is left as an exercise to the reader.  At least for now.

The enumeration in this post of certain Laws shall not be construed to deny or disparage others not mentioned.  I didn't happen to write about humor, but it would be a sad world that held no laughter, etcetera.

To anyone seriously interested in trying to write a Eutopian story using these Laws:  You must first know how to write.  There are many, many books on how to write; you should read at least three; and they will all tell you that a great deal of practice is required.  Your practice stories should not be composed anywhere so difficult as Eutopia.  That said, my second most important advice for authors is this:  Life will never become boringly easy for your characters so long as they can make things difficult for each other.

Finally, this dire warning:  Concretely imagining worlds much better than your present-day real life, may suck out your soul like an emotional vacuum cleaner.  (See Seduced by Imagination.)  Fun Theory is dangerous, use it with caution, you have been warned.

36 comments
Jon_R

In the interests of accuracy, I'd like to talk about the Christian Heaven. Though I now consider myself an agnostic, I went to two years of bible college (think the Fundamentalist version of seminary). To the best of my recollection, the only substantial description of Heaven appears in the last two chapters of Revelation, a book that even in Fundamentalist circles is acknowledged to contain a lot of symbolism.

There are two parts to this description. The first (Rev 21:3-7) talks about what God is going to do in Heaven: "He will wipe every tear from their eyes. There will be no more death or mourning or crying or pain, for the old order of things has passed away." and so on.

The second part (Rev 21:10-22:5) discusses the appearance of the city in a manner that nearly all theologians would interpret to be symbolic. The city has walls of jasper (God's appearance is previously described as jasper, earlier in the book), and is built of pure gold (a reference to the purity of the inhabitants; the book earlier describes the trials they've gone through as purification, so that all the dross would have run out and only pure gold remains). The numbers given are all based on 12 -- both the number of the tribes of Israel, and the number of the apostles. Likewise, the listing of gemstones for the foundation is derived from the list of jewels representing the twelve tribes in the high priest's breastplate as described in Exodus. There's more I could say here, but I doubt you care too much; suffice it to say that the whole thing's symbolic.

The "singing hymns" part actually comes earlier in the book, in Revelation 15, where the apocalypse is still occurring. There's no mention of it in the last two chapters, and certainly no mention of that being all you do forever.

There isn't actually a lot of description of Heaven in the Bible, perhaps for good reason. Apart from these two chapters, the only other stuff we have to go on is some sayings of Jesus in the Gospels. Nothing that I recall about "you'll never have to work again", though there was a lot about giving rest to those who are weary and heavy laden (and if that's not a good description of a peasant's life, I don't know what is).

My point for bringing this up isn't to convince you that the Christian Heaven is great -- as I said, I don't believe in it myself anymore. Rather, I find that people typically make better arguments when they actually know what they're talking about, and it may assist you in railing against the Christian Heaven to know what it actually is said to be.

Your post explains how the Bible describes heaven. However, when I hear the phrase “Christian heaven” I tend to take it to mean “heaven as Christians today understand it”. You may well be right that the Bible doesn't directly imply that it includes singing hymns for the rest of eternity, but clearly it is widely imagined that way, otherwise we wouldn't all have heard that idea.

[anonymous]

It's an often described caricature of heaven but I imagine that most believers would say that heaven isn't actually like that, and possibly add something about how the things a soul experiences in heaven are beyond mortal comprehension.

I think you may have been giving them too much credit. Here's an adherent explaining that wireheading is a bad thing, but in heaven, wireheading is good because everything in heaven is good.

I don't think people always put much effort into critically considering their beliefs.

I had an idea for a sort of Christian fanfiction, in which people marked for heaven and people marked for hell both go into the same fiery pit, but the former are wireheaded to be happy about it. It's a far more efficient construction that way. (I suppose you could also do the reverse, with the people marked for hell being reverse-wireheaded to find nice things agonizing, but that doesn't have the same tasty irony.)

That's the standard Eastern Orthodox doctrine: everybody goes to heaven, but only those who love God will enjoy it.

Fascinating!

These theological symbols, heaven and hell, are not crudely understood as spatial dimensions but rather refer to the experience of God's presence according to two different modes.

(I suppose you could also do the reverse, with the people marked for hell being reverse-wireheaded to find nice things agonizing, but that doesn't have the same tasty irony.)

http://en.wikipedia.org/wiki/A_Nice_Place_to_Visit

"Ben Franklin, yanked into our own era, would be surprised and delighted by some aspects of his Future. Other aspects would horrify, disgust, and frighten him; and this is not because our world has gone wrong, but because it has improved relative to his time."

How do we know that it's improved? Isn't it equally plausible that Franklin would be horrified because some things in our world are horrifying, and his own moral thinking was more rational than our own? Does moral thought get more rational all on its own? It seems as though it might be difficult for moderns to know if moral thought were less rational than it used to be.

Hmm... I don't think humanity's terminal values have changed very much since Benjamin Franklin (matter of fact, he was an Enlightenment figure, and the Enlightenment is probably the most recent shift of terminal values in the Western world: political liberty, scientific truth, etc.) The things that I imagine would horrify him are mostly either actually bad (Global warming! Nuclear bombs!) or a result of cultural taboos or moral injunctions that have been lifted since his time (Gay marriage! Barack Obama!). This, it seems, is what we mostly mean by moral progress: The lifting of {cultural taboos/moral injunctions} which inhibit our terminal moral values.

The holders of those taboos/injunctions likely considered them part of their terminal moral values.

I find it interesting that you list laws 11 and 12 next to each other (and both so close to 15) without seeming to connect their contents. You seem to be assuming that nothing can carry over from one virtual experience to another, and that these experiences cannot be discussed, learned from, etc., nor can they be competitive or cooperative activities with multiple users. You also deny that creating such an experience is very rewarding and challenging. Perhaps you should stick to disparaging orgasmium and not interactive games and art.

How do we know that it's improved? Isn't it equally plausible that Franklin would be horrified because some things in our world are horrifying, and his own moral thinking was more rational than our own? Does moral thought get more rational all on its own? It seems as though it might be difficult for moderns to know if moral thought were less rational than it used to be.

The point of the exercise (somewhat more clear in the full post) is not that every moral decision on which we differ with Ben Franklin represents a moral improvement, but that at least some do and there are many. So, there are many things about our world today that are, in fact, better than the world of the 1700s, and at least some of them would nonetheless shock or horrify someone like Ben Franklin, at least at first, even if he could ultimately be convinced wholly that they are an improvement.

So in designing any real utopia, we have to include things that are different enough to horrify us at first glance. We have to widen our scope of acceptable outcomes to include things with an argument to be better that would horrify us. And that will, in fact, potentially include outcomes that hearken back to previous times, and things that Ben Franklin (or any other rational person of the past) might consider more comforting than we would.

I think your number 7 is wrong. Love and hate form a continuum with indifference in the middle - considering people, for example, it is a bell curve with a few you love, a few you hate, and most you are indifferent to. Similarly with happiness, boredom, and sadness, though this is probably harder to visualize for the non-bipolar.

g

Beerholm --> Beerbohm, surely? (On general principles; I am not familiar with the particular bit of verse Eliezer quoted.)

consider Christian Heaven: singing hymns doesn't sound like loads of endless fun

Unless, perhaps, you happen to enjoy music...

(Seriously -- suppose you got to compose your own hymns.)

A general comment: I am tempted to question the wisdom of tying Fun Theory so closely to the aesthetics of storytelling, by discussing the two in such proximity. As we all know, there's not necessarily any correlation between the worlds we would want to live in and the worlds we like to read about. I'm not just talking about Dystopian stories either. I love watching House, but sure as hell would never want to actually be any of the characters on that show. Shakespeare's Romeo and Juliet is a delight to read despite being (paradoxically) a depressing tragedy. Etc.

Now there is a connection, to be sure, in that aesthetics itself (in the context of any art form, including but not limited to storytelling) is effectively a miniature, special case of Fun Theory. But this connection is more abstract, and has little to do with how closely settings and plots match up with eudaimonic scenarios. (Inhabitants of Eutopia themselves may enjoy tragic stories and the like.)

[anonymous]

Unless, perhaps, you happen to enjoy music...

(Seriously -- suppose you got to compose your own hymns.)

Music is actually a good big mine of fun (though naturally it is finite). We seem to need longer to tire of the same piece of music than of, say, a movie or a book.

Michael, you're certainly right that the broader point stands. We should expect a real utopia initially to horrify us in some of its particulars (that said, many famous literary utopias pass this test with ease - horror is a necessary but entirely insufficient criterion, apparently). It nonetheless seems blithe to assert that our world's 'improvement' fully explains Franklin's hypothesized disgust. The improvements that seem least ambiguous, such as vaccines, would be those least likely to disgust him.

Dagon

These don't seem very universal to me. I think a whole lot of people would choose to live in a world that violates some or all, especially if they're also allowed to change themselves to enjoy it more.

Rule 23 seems especially strange. Isn't modification of cognitive ability and structure what the singularity is all about?

For what it's worth, I've always enjoyed stories where people don't get hurt more than stories where people do get hurt. I don't find previously imagined utopias that horrifying either.

Dagon, as I explained in Interpersonal Entanglement, it's okay except when it isn't.

"You can never allow yourself a single moment of willpower failure over your whole life. (E.g.: John C. Wright's Golden Oecumene.)"

In the Golden Oecumene, of course, we are positing a technology that can rewrite and rewire the human consciousness any way whatsoever. You can nip down to the corner store and buy yourself an iron willpower.

As I recall (I haven't read the book recently) there was a legal form called a 'werewolf contract' a person could sign so that someone else with power of attorney could be authorized to override the citizen's self-sovereignty in cases defined in the contract (such as if I am afraid I accidentally might turn myself into a werewolf by toying with my own cerebrum-rewriting program).

Also, every thousand years all minds in the system were interlinked in a Grand Transcendence, an attempt to achieve an ultimate level of intellect beyond human or machine consciousness. It was not explicitly stated, but the books implied that participation was mandatory: one of the characters is lost in one of these 'devil bargains' you mention, and she is against her will pulled out to mingle with the transcendent consciousness, and review her life, so that she must again decide to return to her amnesia illusion. This may have been required by law, or, more likely, it was something signed as a private contract when the character was wired up to be able to form full-immersion brain interfaces. The book doesn't say.

Sorry. Didn't mean to go on about the example: the main point is that if you have the technology and social customs which allow for the devil bargain type temptations mentioned above, does not this same level of technology imply that mechanisms will be discovered by those concerned to counteract the threat?

JCW, first, are you actually John C. Wright or just posting an objection on his behalf? If the latter, I don't think it's quite appropriate to take on his actual name.

This issue was discussed in the comments of Devil's Offers. Yes, you can have a Werewolf contract, and you can have a Jubilee, and you can have various other rules... but some people didn't take on Werewolf contracts out of pride, because e.g. Helion felt it would reduce him to the status of a child. There's also the question of why a decision your earlier self made, should bind all future mind-states forever - could you sign a contract refusing to ever change your mind about libertarianism, say?

Well, if everyone does in fact sign a Werewolf contract and that works - in reality or in fiction - that's fine enough; then you're not living with the everlasting presence of incredible temptations. But the absence of a signed Werewolf contract made these incredible temptations and momentary failures of willpower a key plot point in the Golden Oecumene - a necessary flaw in the Utopia, without which there would have been no plot.

And my own point is that you may just be better off not introducing the poisons to begin with. What people can do to themselves through their own strength is one matter; what they can sell to one another, is another matter; and what superintelligences or agents of a higher order (including a corporation of experienced neurologists selling to a lone eighteen-year-old) can offer in the way of far-outrange temptations is yet another.

While we're on the subject of the Golden Oecumene, I've only read the first book but the major problem I had with it was the egregious violation of your Rule 31 in its twist ending (which frankly seemed tacked-on, as if the publisher had said "Do you think you could turn this book into a trilogy somehow?").

We live in utopia, and the fun is, no one remembers.

ata

The world is not having nearly as much fun as it could be. We need a New World Order of Fun.

Well it's good to see that if you somehow found a way to implement your ideas you would at least do it well.

[This comment is no longer endorsed by its author]

Finally, this dire warning: Concretely imagining worlds much better than your present-day real life, may suck out your soul like an emotional vacuum cleaner. (See Seduced by Imagination.) Fun Theory is dangerous, use it with caution, you have been warned.

An obvious application of Fun Theory is its use in designing virtual worlds that suck out other people's souls for fun and profit! Then donating that to optimal charity to make up for disutility. Enslave the irrational for the greater good!

[anonymous]

How do you manage to make being a good author or game designer interested in philanthropy sound so ... evil?

gwern

"On the surface of the problem, television is responsible for our rate of its consumption only in that it's become so terribly successful at its acknowledged job of ensuring prodigious amounts of watching. Its social accountability seems sort of like that of designers of military weapons: unculpable right up until they get a little too good at their job."

--David Foster Wallace, "E unibus pluram"

Offering a road that bypasses a challenge makes the challenge feel less real, even if the cheat is diligently refused.

I think this is good to keep in mind, because it is often true, but it is not always true. How arbitrary the initial challenge was certainly has an effect. For instance, in the case of video game speedrunning, I think we can say that the fact that tool-assisted speedrunning exists doesn't make the challenge of human-performed speedruns feel less real.

Edit: I should clarify this, since I expect speedrunning as a whole seems kind of arbitrary, which may have made it seem like I was saying "more arbitrary things will still seem real", when that was the opposite of what I meant. The point is that the restriction of "ah, but a human has to do it, with an actual controller, on the original hardware", is a pretty natural restriction to add. (At least, while the distinction of human-vs.-computer remains a natural one.)

To anyone seriously interested in trying to write a Eutopian story using these Laws: You must first know how to write. There are many, many books on how to write; you should read at least three; and they will all tell you that a great deal of practice is required.

Any particular recommendations? Anyone?

I presume the purpose of a utopia is to optimize the feelings of all those in it. This could be done in two main ways: (1) directly, by acting upon the emotional/sensory processing areas of the brain, or (2) indirectly, by altering the environment. This article seems to assume that (2) is the only way to create a utopia. I understand that some degree of effort would need to be spent on optimizing the environment, in order to allow for the practical aspects of (1) to be developed and continually improved, but this article focuses entirely on environmental optimization. I wonder if this is due to a (subconscious?) moralistic belief that pleasure should be "meaningful", and that pleasure which is not related to real life outcomes is "bad". Or perhaps it is simply due to a lack of familiarity with the current and potential ways of directly acting on our brains (I doubt there would be a lack of familiarity with the theory behind this, but perhaps a lack of personal experience could result in the concept remaining abstract and not as well understood).

I think you may be overlooking that this is a guide for fictional utopias. I’m not sure a good story could be written about a world full of humans in a vegetative bliss state. But maybe it can! :)

In The Fun Theory Sequence Eliezer writes about the real-life applications of this, e.g.

Fun Theory is the field of knowledge that deals in questions such as "How much fun is there in the universe?", "Will we ever run out of fun?", "Are we having fun yet?" and "Could we be having more fun?"

and

If no one can imagine a Future where anyone would want to live, that may drain off motivation to work on the project.  The prospect of endless boredom is routinely fielded by conservatives as a knockdown argument against research on lifespan extension, against cryonics, against all transhumanism, and occasionally against the entire Enlightenment ideal of a better future.

So I don't think the article was intended only as a guide to authors.