Answer by kaputmi

I'd recommend reading Stephen Wolfram on this question. For instance: https://www.wolframscience.com/nks/p315--the-intrinsic-generation-of-randomness/
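For a concrete sense of what that page argues, here is a minimal sketch (my code, not Wolfram's) of Rule 30, his canonical example of a simple deterministic rule whose center column behaves like a pseudorandom bit stream:

```python
# Rule 30: each cell updates to left XOR (center OR right).
# Deterministic, yet the center column passes many randomness tests --
# Wolfram's standard illustration of "intrinsic" randomness generation.

def rule30_center_bits(n_bits: int, width: int = 512) -> list[int]:
    cells = [0] * width
    cells[width // 2] = 1          # start from a single black cell
    bits = []
    for _ in range(n_bits):
        bits.append(cells[width // 2])
        cells = [
            cells[(i - 1) % width] ^ (cells[i] | cells[(i + 1) % width])
            for i in range(width)
        ]
    return bits

print("".join(map(str, rule30_center_bits(64))))
```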

  • Building a superhuman AI focused on a specific task is more economically valuable than building a much more expensive AI that is bad at a large number of things.

  • It also comes with ~0 risk of paperclipping the world — AlphaZero is godlike at chess without needing to hijack all resources for its purposes.

Yes, I think performance ultimately matters much more than risk preferences. If you really want to take risk preferences into account, you can define utility as a function of wealth and maximize the growth of utility instead. But I think risk aversion has been way overemphasized by academics who weren't thinking about ergodicity: reasoning along St. Petersburg Paradox lines, they assumed any +EV bet must be rational, so when people decline +EV bets they must be irrationally risk-averse.
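To make the ergodicity point concrete, here is a toy simulation (illustrative numbers of my choosing, not from any source): a repeated bet that is +EV on average yet has a negative time-average growth rate, so the median bettor still goes broke.

```python
import random
import statistics

# Toy bet: each flip multiplies wealth by 1.5 (heads) or 0.6 (tails), p = 0.5.
#   Ensemble average per flip:  0.5*1.5 + 0.5*0.6 = 1.05   -> +EV
#   Time-average growth/flip:   (1.5 * 0.6) ** 0.5 ≈ 0.949 -> wealth decays

def play(n_flips: int, wealth: float = 1.0) -> float:
    for _ in range(n_flips):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

random.seed(0)
outcomes = [play(100) for _ in range(100_000)]
print(f"expected value of 100 flips: {1.05 ** 100:.0f}x")              # ~130x
print(f"median outcome:              {statistics.median(outcomes):.5f}x")  # ~0.005x
```

The expected value grows 5% per flip, but almost every individual bankroll shrinks about 5% per flip in the long run; that gap between the ensemble average and the time average is the whole point.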

Answer by kaputmi

What you actually want is to maximize the growth rate of your bankroll; you can go broke making +EV bets. For something like a lottery, the Kelly Criterion is the tool you're looking for: a bet is "rational" iff the Kelly Criterion says you should make it.
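For reference, the standard Kelly formula for a simple binary bet is f* = p − (1 − p)/b, where p is the win probability and b the net odds. A quick sketch with hypothetical numbers:

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly-optimal fraction of bankroll to stake on a bet that wins
    with probability p and pays b-to-1 net odds. Negative => don't bet."""
    return p - (1 - p) / b

# A +EV coin flip: 55% chance to double your stake (b = 1).
# EV per unit staked is +0.10, but Kelly says risk only 10% of bankroll.
print(f"{kelly_fraction(0.55, 1.0):.2f}")        # 0.10

# A lottery-style bet: enormous payout, tiny win probability, negative EV.
# The Kelly fraction is negative, so the growth-optimal stake is zero.
print(f"{kelly_fraction(1e-8, 1_000_000):.2e}")  # ≈ -9.9e-07
```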

Why wouldn’t AGI build a superhuman understanding of ethics, which it would then use to guide its decision-making? 

I think gears-level models are really the key here. Without a gears-level model you are flying blind, and the outside view is very helpful when you're flying blind. But with a solid understanding of a system's causal mechanisms, you don't need to rely on others' opinions to make good predictions and decisions.

Answer by kaputmi

My advice:

  • You're certainly not going to go wrong with Harvard. The value of college is much more in the people you will meet than anything else, and the quality of Harvard's student body is as high as anywhere's.
  • When I applied to college I judged schools by the quality of their economics departments because I was convinced I would become an academic economist. That turned out to be very wrong. I think the chances you end up working on alignment research are low — maybe 20% — so don't over-index on that.
  • Of course you can take advanced courses in whatever you want! Information is free these days. Don't let the school's curriculum dictate what you pursue. It's OK to optimize to some degree for easy classes so long as you are doing something valuable with the free time you are gaining.