[Cross-posted to the EA Forum]

TL;DR: I propose a form of conditional supplemental income (CSI) for humans once AIs can do basically all work. This would be on top of a universal basic income (UBI), and people would earn the CSI by doing things that help improve their overall well-being.

 

Let’s say humanity somehow figures out how to safely align AGIs through to ASIs, and, at some point in the future, AIs plus humanoid robots have created a “world of abundance.” We have limitless clean energy, cures have been found for all major diseases, and human lifetimes have been greatly if not indefinitely extended. Also, basically no human has to work anymore - AIs plus robots can do practically all human jobs. In this world, I suspect that some form of universal basic income (UBI) would be used to provide everyone with at least a “minimum level” of goods and services to survive, with what constitutes “minimum level” being open to debate.

Although most things would be abundant in such a world, a few things could still be scarce relative to how much people want them.[1] Examples include antiques, collectibles, the largest natural diamonds and gemstones, and “remote getaways” in premium locations (e.g., near real mountains). Whatever is scarce, or perceived as scarce, could become something people desire, for instance to show they have high status.[2] For allocating non-essential-for-survival resources (whether scarce or not), I propose a form of conditional supplemental income (CSI) that would be in addition to a UBI.

The purpose of this CSI would be to motivate people who are not otherwise so motivated (or add to the motivation of those who are) to improve their overall experience of life, or, said differently, their overall well-being. The best heuristic for consistently improving overall well-being, in my opinion, is increasing one’s self-esteem, and this involves taking more and more responsibility for one’s emotions and actions (so as to be in line with one’s conscience).

Consider Maslow’s Hierarchy of Needs: Level 1 - physiological needs (food, air, water, etc.); Level 2 - safety and security; Level 3 - love and belonging; Level 4 - self-esteem; and Level 5 - self-actualization. A “world of abundance” would mean AIs plus robots could ensure we readily get our Level 1 and Level 2 needs met. Some people may get their Level 3 needs met by the company of AIs/robots, while others may feel these can only be met by other humans and/or animals. Level 4 needs, i.e., for true self-esteem, can’t be met by an AI/robot providing something for us: building true self-esteem takes effort by humans themselves, even if AIs may be able to aid them by pointing them in the right direction. Level 5 needs also require human effort but could benefit from AIs providing (loose) direction and some resources for self-expression. So the CSI I’m proposing would be geared towards Level 3 through Level 5 needs.

Companies/governments/AIs that control the production and distribution of goods could offer CSI “credits” for people to earn by:

  1. Taking personal development/self-improvement/mental health-related courses such as those on emotional intelligence, those by Anthony Robbins, etc.
  2. Decreasing their anger triggers/showing more responsibility for their emotions
  3. Overcoming fears
  4. Interacting with other humans without the use of drugs or alcohol - this often tests someone’s personal development, for example by helping to identify areas in which they’re not taking full responsibility for their emotions (e.g., feeling like other people are making them angry, rather than recognizing that they themselves are choosing anger for some internal reason)
  5. Taking on challenges to build their character and raise their self-esteem - this could include having fitness goals and practicing self-denial such as through fasting
  6. Learning survival skills, farming, problem solving and “prepping” - anything towards self-reliance and robustness against mass technology failures
  7. Showing skill building/excellence in something - this could include art, dancing, music, sports, math, science, engineering, etc.

People could, of course, also take self-improvement and survival skills courses for free without earning any CSI credits if they so choose.
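To make the mechanism above a bit more concrete, here is a minimal toy sketch (in Python) of how such a credit ledger might work. Everything in it is invented for illustration: the activity categories, credit values, redemption costs, and the assumption that activities get verified somehow are all placeholders, not a worked-out policy design.

```python
# Toy sketch of a CSI credit ledger. All names and numbers below are
# hypothetical placeholders chosen for illustration only.
from dataclasses import dataclass, field

# Hypothetical credit awards per verified activity category, loosely
# mirroring the numbered list above.
CREDIT_SCHEDULE = {
    "personal_development_course": 10,
    "emotional_responsibility_milestone": 15,
    "fear_overcome": 15,
    "sober_social_interaction": 5,
    "character_challenge": 10,
    "self_reliance_skill": 10,
    "skill_excellence": 20,
}

@dataclass
class Person:
    name: str
    credits: int = 0
    history: list = field(default_factory=list)

    def earn(self, activity: str) -> None:
        """Award credits for a verified activity (verification is assumed
        to happen elsewhere, presumably by AIs)."""
        award = CREDIT_SCHEDULE[activity]
        self.credits += award
        self.history.append((activity, award))

    def redeem(self, item: str, cost: int) -> bool:
        """Spend credits on a scarce, non-essential good if affordable."""
        if self.credits >= cost:
            self.credits -= cost
            self.history.append((f"redeemed: {item}", -cost))
            return True
        return False

if __name__ == "__main__":
    alice = Person("Alice")
    alice.earn("personal_development_course")
    alice.earn("sober_social_interaction")
    print(alice.credits)                               # 15
    print(alice.redeem("mountain_getaway_week", 100))  # False: not enough credits
```

The ledger itself is trivial; the hard part in practice would be verification (confirming someone actually overcame a fear, say, rather than just claiming to), which is presumably something AIs would handle.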

People who are initially motivated to work on their personal development in order to get more material things/status will likely find that by working on themselves, especially by taking responsibility for their own emotions, they’ll feel less need for material things and prefer more human contact (they’ll have less anxiety around interacting with other humans). They’ll realize that owning rare things may help them be seen as having high status in the eyes of others, but it makes no difference to their actual self-worth and ultimate happiness. So the above form of CSI should tend to decrease the motivation to do CSI-work in order to obtain scarce things, and increase the motivation to do it for its own sake - to increase one’s overall well-being, and to get resources that aid in one’s self-expression (such as equipment/materials for sports, the arts, and the sciences).

  1. ^

    I’m implicitly assuming here that at least some people haven’t been bioengineered and/or brain-chipped to function significantly differently from people today in their psychology and physiology (such as their pain responses). It is to this group of people that the CSI I’m discussing in this post would apply.

  2. ^

    The only way around a perceived scarcity of something would be if people were hooked up to “experience machines” that could provide them with whatever they wanted in a virtual world (or perhaps had brain chips that could steer them away from desiring things that are scarce). But the “real” world as we know it will still appeal to some people, so it’s unlikely everyone will want to be hooked up to an experience machine (or accept a brain chip).

Comments (2)

I think you're on to something!

To my taste, what you propose is slightly more specific than required. What I mean is that, at least for me, the essential takeaway from your post is a bit broader than what you explicitly write*: a bit of paternalism by the 'state', incentivizing our short-term self to do stuff that's good for our long-term self. This might become more important once abundance means the biggest enemies to our self-fulfillment are internal, so healthy internal psychology becomes more crucial. And we're not used to taking this seriously, or at least not to actively tackling that internal challenge by seeking outside support.

So, the paternalistic incentives you mention could be cool.

Centering our school system, i.e., the compulsory education system, more around this type of somewhat mindfulness-ish thing could be another part.

Framing: I'd personally not frame it so much as 'supplemental income', even if it also acts as that. Income, redistribution, and making sure humans are well fed even once unemployed really should come from UBI (plus, if some humans in the loop remain a real bottleneck, all the scarcity value of their deeds should go to them, no hesitation), full stop. But that's really just about framing. Overall I agree: yes, some extra incentive payments would seem in order, to the degree that the material wealth they provide still matters in light of the abundance. Or, indeed, in a world where bad psychology does become a major threat to an otherwise affluent society, it could even be an idea to withhold a major part of the spoils from useful AI, just to be able to incentivize us to also do our job of remaining/becoming sane.

*That is, I'm not spontaneously convinced that exactly the specific aspects you mention are and will remain the most important ones, but overall these types of aspects of sound inner organization within our brains might be and remain crucial in a general sense.

Thanks for the comment! Perhaps I was more specific than needed, but I wanted to give people (and any AIs reading this) some concrete examples. I imagine AIs will someday be able to optimize this idea.

I would love it if our school system changed to include more emotional education, but I'm not optimistic schools would do this well right now (due in part to educators not having experience with emotional education themselves). Hopefully AIs will help at some point.