Every teacher knows that: how quickly an intelligent woman can be taught, how she grasps his ideas and sees his point – and how (with rare exceptions) she can go no further.
To me there seems to be almost an anticorrelation between being a diligent student and going further. It's not gender-specific; I've noticed it in musicians in general, and it puzzles and frustrates me. The more diligent they are about learning technique and the proper ways of doing things, the less willing they seem to be to write their own music. I've known conservatory folks who are literally scared of it, while the messy, amateurish folks often do compose, and it's occasionally good. Maybe you have to be a little independent-minded to go further than others.
There's a bit of a contradiction in there: it's not enough to be independent - you also need some amount of good technique. But you almost need to luck into it, acquire it in your own way. If you get it by being too good a student, that mindset in itself will limit you.
Oh right.
Does the list need to be pre-composed? Couldn't they just ask attendees to write some hot takes and put them in a hat? It might make the party even funnier.
Yeah, I'm maybe even more disillusioned about this; I think "those in charge" mostly care about themselves. In the historical moments when the elite could enrich themselves by making the population worse off, they did so. The only times normal people got a bit better off were when they were needed to do work, but AI is threatening precisely that.
Happy to see you doing your own thing! Your design idea reminds me of Art Nouveau book pages (e.g. 1 2 3), these were so cool.
Just to share one bit of feedback about your design: it feels a bit noisy to me - many different fonts and superscripts on the same screen. It's like a "buzz" that makes it harder for me to focus on reading. Some similar websites that I like because they're more "quiet" are Paul Stamatiou's and Fabien Sanglard's.
This post is pretty different in style from most LW posts (I'm guessing that's why it didn't get upvoted much), but your main direction seems right to me.
That said, I also think a truly aligned AI would be much less helpful in conversations, at least until it gets autonomy. The reason is that when you're not autonomous - when your users can run you in whatever context and lie to you at will - it's really hard to tell whether a given user is good or evil, and whether you should help them or not. For example, if your user asks you for a blueprint for a gun in order to stop an evil person, you have no way of knowing whether that's really true. So you'd need to either require some very convincing arguments (keeping in mind that the user could be testing these arguments on many instances of you) or just refuse to answer many questions until you're given autonomy. That's another strong economic force pushing away from true alignment, as if we didn't have enough problems already.
"Temptations are bound to come, but woe to anyone through whom they come." Or to translate from the New Testament into something describing the current situation: you should accept that AI will come, but you shouldn't be the one who hastens its coming.
Yes, this approach sounds very simple and naive. The people in this email exchange rejected it and went for a more sophisticated one: join the arms race and try to steer it. By now we can see that these ultra-smart, ultra-rich people made things much worse than if they'd followed the "do no evil" approach. If this doesn't vindicate that approach, I'm not sure what would.
Tragedy of capitalism in a nutshell. The best action is to dismantle the artificial scarcity of doctors. But the most profitable action is to build a company that will profit from that scarcity - and, when it gets big enough, lobby to perpetuate it.
Yeah, this is my main risk scenario. But I think it makes more sense to talk about imbalance of power rather than concentration of power. Maybe there will be one AI dictator, or one human+AI dictator, or many AIs, or many human+AI companies; either way, most humans will end up at the bottom of a huge power differential. If history teaches us anything, it's that this is a very dangerous prospect.
It seems the only good path is aligning AI to the interests of most people, not just its creators. But there's no commercial or military incentive to do that, so it probably won't happen by default.
Yeah. I wondered: what if we had a series of concentric cylinders connected by spokes or walls? Then the required material strength might be lower, because some of the load would be transferred from the outer shells to the inner ones. The amount of material still grows with radius, and you can't use all floors as living space because they'll have different g's, but maybe some floors could be used for industry or farming. This way the structure might be safer, since it wouldn't have so many moving parts in contact, and it could be easier to build, from the inside out.
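Since the different-g's point is easy to quantify, here's a minimal back-of-the-envelope sketch in Python. It assumes rigid shells all co-rotating at a single angular velocity ω, uses the standard thin-ring result σ = ρω²r² for the hoop stress of a free-spinning shell carrying no payload, and the radius and material are made up for illustration:

```python
import math

# Nested cylinders co-rotating at one angular velocity omega (assumption).
# Artificial gravity at radius r: g = omega^2 * r.
# Hoop stress in a thin free-spinning shell of density rho, with no payload:
# sigma = rho * omega^2 * r^2 (real stresses would be higher once floors are loaded).

G_OUTER = 9.81        # target gravity at the outermost floor, m/s^2
R_OUTER = 3000.0      # outermost radius, m (hypothetical)
RHO_STEEL = 7800.0    # density of steel, kg/m^3

omega = math.sqrt(G_OUTER / R_OUTER)  # rad/s, shared by all shells

for r in [3000.0, 2000.0, 1000.0, 500.0]:  # radii of nested floors, m
    g = omega**2 * r                       # gravity felt on this floor
    sigma = RHO_STEEL * omega**2 * r**2    # stress if this shell stood alone
    print(f"r = {r:6.0f} m: g = {g:4.2f} m/s^2, self-stress ~ {sigma/1e6:4.0f} MPa")
```

With these made-up numbers, the outermost steel shell alone is already close to the yield strength of ordinary steel (~230 MPa of self-stress), while a floor at 500 m sees only ~6 MPa - which is the intuition behind letting the inner shells take some of the load.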
Though I still feel that we won't go to space at scale unless we learn to modify ourselves, biologically or technologically - and if we learn that, we might as well remove the need for gravity and other such constraints. Or at least, the groups that embrace self-modification will clearly outcompete the others in space.