Would you like to be tutored in applied game theory, natural latents, CFAR-style rationality techniques, "general AI x-risk", Agent Foundations, anthropics, or some other topics discussed on LessWrong?

I'm thinking about prototyping some topic-specific LLM tutor bots, and would like to prioritize topics that multiple people are interested in.

Topic-specific LLM tutors would be customized with things like pre-loaded relevant context, helpful system prompts, and more focused testing to ensure they work.
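To make "customized" a bit more concrete, here's a minimal sketch of what a topic-specific tutor could look like, assuming an OpenAI-style chat-completions client; the topic names, the TOPIC_CONTEXT placeholder text, and the make_tutor_messages helper are all illustrative, not a finished design.

```python
# Minimal sketch of a topic-specific tutor: pre-loaded context + a tuned system prompt.
# Assumes an OpenAI-style chat-completions client and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Pre-loaded context: excerpts/summaries of the relevant LessWrong material (placeholders here).
TOPIC_CONTEXT = {
    "natural latents": "(excerpt: key definitions, main theorems, worked examples)",
    "anthropics": "(excerpt: SSA vs. SIA, standard thought experiments)",
}

def make_tutor_messages(topic: str, question: str) -> list[dict]:
    """Build a chat transcript with a topic-tuned system prompt and the user's question."""
    system_prompt = (
        f"You are a patient tutor for the topic '{topic}'. "
        "Use the reference material below, check understanding with short questions, "
        "and break hard ideas into small steps.\n\n"
        f"Reference material:\n{TOPIC_CONTEXT[topic]}"
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]

reply = client.chat.completions.create(
    model="gpt-4o",
    messages=make_tutor_messages("natural latents", "What is a natural latent?"),
)
print(reply.choices[0].message.content)
```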

Note: I'm interested in topics that are written about on LessWrong, e.g. infra-bayesianism, and not magnetohydrodynamics.


I'm going to use the same poll infrastructure that Ben Pace pioneered recently. There is a thread below where you can add and vote on topics/domains/areas in which you might like tutoring.

  1. Karma: upvote/downvote to express enthusiasm about there being tutoring for a topic.
  2. Reacts: click the agree react to indicate you personally would like tutoring on a topic.
  3. New Poll Option: add a new topic for people to express interest in being tutored on.

For the sake of this poll, I'm more interested in whether you'd like tutoring on a topic or not, separate from the question of whether you think a tutoring bot would be any good. I'll worry about that part.

Background

I've been playing around with LLMs a lot in the past couple of months, and so far my favorite use case is tutoring. LLM assistance helps via multiple routes: providing background context with less effort than external search/reading, keeping me engaged via interactivity, generating examples, and breaking complex sections into more digestible pieces.

Ruby:

Poll for LW topics you'd like to be tutored in
(please use the agree react to indicate you'd personally like tutoring on a topic; I might reach out if/when I have a prototype)

Note: Hit cmd-f or ctrl-f (whatever normally opens search) to automatically expand all of the poll options below.