
Comment author: bogus 20 March 2017 06:26:02PM *  0 points [-]

To me, "democratize AI" makes as much sense as "democratize smallpox", but it would be good to find out that I'm wrong.

Isn't "democratizing smallpox" a fairly widespread practice, dating from the 18th century or so - and one with rather large utility benefits, all things considered? (Or are you laboring under the misapprehension that the kinds of 'AIs' being developed by Google or Facebook are actually dangerous? Because that's quite ridiculous, TBH. It's the sort of thing for which EY and Less Wrong get a bad name in machine-learning circles - popularly known as 'AI' circles.)

Comment author: moridinamael 20 March 2017 09:30:57PM 0 points [-]

Not under any usual definition of "democratize". Making smallpox accessible to everyone is no one's objective. I wouldn't refer to making smallpox available to highly specialized and vetted labs as "democratizing" it.

Google and/or Deepmind explicitly intend to build exactly the type of AI that I would consider dangerous, regardless of whether or not you would consider them to have already done so.

Comment author: Lumifer 20 March 2017 03:57:26PM 0 points [-]

Links to the noises?

Comment author: moridinamael 20 March 2017 04:03:12PM *  0 points [-]

It's mainly an OpenAI noise, but it's been parroted in many places recently. I've definitely seen it in OpenAI materials, and I may have even heard Musk repeat the phrase, but I can't find links. Also:


Our long-term goal is to democratize AI. We want to level the playing field for startups to ensure that innovation doesn’t get locked up in large companies like Google or Facebook. If you’re starting an AI company, we want to help you succeed.

which is pretty close to "we don't want only Google and Facebook to have control over smallpox".

Microsoft, in the context of its partnership with OpenAI:

At Microsoft, we believe everyone deserves to be able to take advantage of these breakthroughs, in both their work and personal lives.

In short, we are committed to democratizing AI and making it accessible to everyone.

This is an even less standard interpretation of "democratize". I suppose by this logic, Henry Ford democratized cars?

Comment author: Lumifer 20 March 2017 03:17:03PM 1 point [-]

Why do you think one exists?

Comment author: moridinamael 20 March 2017 03:55:33PM *  1 point [-]

I try not to assume that I'm smarter than everybody, and when there's a clear cluster of really smart people making these noises, I at least want to investigate whether I'm mistaken in my presuppositions.

To me, "democratize AI" makes as much sense as "democratize smallpox", but it would be good to find out that I'm wrong.

Comment author: moridinamael 20 March 2017 03:09:47PM 2 points [-]

What is the steelmanned, not-nonsensical interpretation of the phrase "democratize AI"?

Comment author: moridinamael 16 March 2017 12:03:54PM 7 points [-]

This perspective suggests "don't yak shave" is a classic deepity. The superficially true meaning is "don't waste time on unimportant subtasks" and the clearly false but immediately actionable meaning is "don't do subtasks". If you've already clearly identified which tasks are on the critical path and which are not, the yak shaving heuristic is useless, and if you haven't, it's harmfully misleading.

Comment author: ArisKatsaris 01 March 2017 10:54:27PM 0 points [-]

Podcasts Thread

Comment author: moridinamael 10 March 2017 05:28:57PM 1 point [-]

My cohost and I are starting a podcast series read-through of the web serial Worm. So far it's our most popular feature ... by a factor of ten or so.

Comment author: ChristianKl 07 March 2017 09:25:00AM 0 points [-]

Did any of you do typing training to increase your typing speed? If so, how much time investment did you need for a significant improvement?

Comment author: moridinamael 07 March 2017 02:46:40PM *  0 points [-]

I changed schools during the period of time when typing classes are generally taught, which led to me taking two years of typing classes. I could type pretty goddamn fast at the end of this period. That was about twenty years ago. I just tested myself at this site and average about 92 words per minute, which is still pretty fast. I actually feel like ~90 WPM is close to the physical limit, the point at which you are making most of your errors because your fingers are hitting keys out of order due to the signals being sent milliseconds apart.
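The arithmetic behind that claim is easy to check. A quick sketch (using the standard convention that a "word" is 5 characters, which the typing-test site presumably also assumes):

```python
# Back-of-envelope check: at a given WPM, how far apart are individual keystrokes?
def keystroke_interval_ms(wpm: float, chars_per_word: float = 5.0) -> float:
    """Average time between keystrokes, in milliseconds."""
    keystrokes_per_second = wpm * chars_per_word / 60.0
    return 1000.0 / keystrokes_per_second

print(round(keystroke_interval_ms(90)))  # 133 ms between keystrokes at 90 WPM
```

So at ~90 WPM each keystroke lands roughly 130 ms after the last, which is plausibly tight enough for out-of-order finger signals to become the dominant error source.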

I doubt that I really needed two years of training to get good at typing, but on the other hand, there's something to be said for over-learning.

Comment author: DataPacRat 06 March 2017 06:36:21AM 2 points [-]

Writing Scifi: Seeking Help with a Founding Charter

I'm trying to figure out which details I need to keep in mind for the founding charter of a particular group in a science-fiction story I'm writing.

The sci-fi bit: The group is made up of copies of a single person. (AIs based on a scan of a human brain, to be precise.)

For example, two copies may have an honest disagreement about how to interpret the terms of an agreement, so having previously arranged for a full-fledged dispute-resolution mechanism would be to the benefit of all the copies. As would guidelines for what to do if a copy refuses to accept the results of the dispute-resolution, preliminary standards to decide what still counts as a copy in case it becomes possible to edit as well as copy, an amendment process to improve the charter as the copies learn more about organizational science, and so on. The charter would likely include a preamble with a statement of purpose resembling, "to maximize my future selves' ability to pursue the fulfillment of their values over the long term".

The original copy wanted to be prepared for a wide variety of situations, including a copy finding itself seemingly all alone in the universe, or with a few other copies, or lots and lots; which may be running at similar, much faster, or much slower speeds; with none, a few, or lots and lots of other people and AIs around; and with or without enough resources to make more copies. So the charter would need to be able to function as the constitution of a nation-state; or of a private non-profit co-op company; or as the practical guidelines for a subculture embedded within a variety of larger governments (à la the Amish and Mennonites, or Orthodox Jews, or Romany). Ideally, I'd like to be able to include the actual charter in an appendix, and have people who understand the topic read it, nod, and say, "Yep, that'd do to start with."

At the moment, I'm reading up on startup companies, focusing on how they transition from a small team where everyone does what seems to need doing into more organized hierarchies with defined channels of communication. But I'm sure there are important details I'm not thinking of, so I'm posting this to ask for suggestions, ideas, and other comments.

So: What can you, yourself, suggest about writing such a charter; and where else can I learn more about authoring such texts?

Thank you for your time.

Comment author: moridinamael 06 March 2017 04:36:07PM 0 points [-]

In lieu of coming up with a creative solution to your problem, I will relate how Hannu Rajaniemi solves this problem in the Quantum Thief books, particularly for the group called the Sobornost. (Spoilers, obviously.) There are billions (trillions?) of copies of certain individuals, and each copy retains self-copying rights. Each copy knows which agent forked it (who its "copyfather" is), and is programmed to feel "religious awe" and devotion to its specific line of descent. So if you found yourself spawned in this world, you would feel strong awe and obedience toward your copyfather, even stronger awe and obedience toward your copygrandfather, and ultimate devotion to the "original" digital version of yourself (the "prime"). This policy keeps everyone in line and assists in conflict resolution, because there's always a hierarchy of authority among the copies. This also allows small groups of copies to go off and pursue a tangential agenda with trust that the agenda will be in line with what the prime individual wanted.

Comment author: Lumifer 02 March 2017 03:37:52PM 0 points [-]

Schwarzenegger was merely a governator. Since he was born outside of the US he is not eligible to run for President.

Comment author: moridinamael 02 March 2017 03:38:43PM 0 points [-]

Yet. Growth mindset.

Comment author: CellBioGuy 27 February 2017 08:53:38AM *  1 point [-]

Why is the end of a PhD maintained as such a stressful and panic-filled process?

Comment author: moridinamael 27 February 2017 04:44:34PM *  2 points [-]

Aligned with what Dagon said, there is no particular incentive for it to be otherwise.

Somewhat more cynically, I have observed that it is very much to the professor's advantage to be able to credibly assert that you are "still N months away from graduating", where N is whatever number they feel like using in that moment to manipulate you to whatever ends they wish. Maybe they want to dissuade you from taking an internship. Maybe they want you to focus more on your research and less on your courses. Regardless, it seems the universal state is that the academic adviser will push back on your stated graduation date objectives, and since the entire system is designed around inhibiting any kind of firm commitment to a graduation plan, they can always win the argument.

For me, a lot of the panic came from the unreasonable number of documents that needed to be filed with certain gaps of time between them: you need to have document A signed and filed several months before being permitted to submit document B, and then document C can only be filed once document B is accepted, but it takes an uncertain amount of time for that to happen, and meanwhile the deadline for graduation is approaching, etc.
