I agree that it's important to optimize our vibes. They aren't just noise to be ignored. However, I don't think they exist on a simple spectrum from nice/considerate/coddling to mean/callous/stringent. Different vibes are appropriate to different contexts. They don't only affect people's energy but also signal what we value. Ideally, they would sap energy from people who oppose our values while giving more energy to those who share them.
Case in point: I was annoyed by how long and rambly your comment was, and by how much extra effort it took to distill a clear thesis from it. I'm glad you actually did have one, but writing like that probably differentially energizes people who don't care.
You could handle both old and new scrapes by moving the content to a different URL, turning the original page into a link to the new URL, and protecting only the new URL from scraping.
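Concretely, the scheme might look something like this (a minimal sketch with hypothetical paths, assuming scrapers honor robots.txt as their protection mechanism):

```
# robots.txt — block crawlers from the new location only;
# the old URL stays crawlable so existing scrapes aren't broken links
User-agent: *
Disallow: /protected/

# The old page at /my-post is replaced with a stub like:
#   <a href="/protected/my-post">This post has moved.</a>
```

Well-behaved crawlers see only the stub; humans click through to the real content at the protected URL.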
As I see it, a fatal problem with CEV is that even one persistent disagreement between humans leaves the AI unable to proceed, and I think such disagreements are overwhelmingly likely to occur. Adding other sentient beings to the mix only makes the problem worse.
EDIT: I should clarify that I'm thinking of cases where no compromise is possible, e.g. a vegan vs. a sadist who derives their only joy from torturing sentient animals. You might say sadists don't count, but there's no clear place to draw the line for how selfish someone has to be before their values are disregarded.
EDIT 2: Nevermind, just read this comment instead.
Do you think anything is ever bad enough that it deserves to be rudely dismissed or sneered at? Or is that unacceptable to you in any possible context?
It seems like this accusation of bad faith could go both ways. I haven't seen you demonstrate curiosity or openness to being convinced that your religion pushes anti-epistemology; I've only seen flat denial followed by the casting of aspersions.
I agree that the money pump argument is confusing. I think the real problem with intransitive preferences is that they're self-contradictory. If I have an intransitive preference and you simply ask me whether I want A, B, or C, I am unable to answer. As soon as I open my mouth to say something, I already want to change my mind, and I'm stuck. However, there's some conceptual slippage between this and merely wanting to move in a cycle between states over time, such as wanting to travel endlessly from Chicago to New York to San Francisco and back to Chicago again. This may be considered silly and wasteful, but there's nothing inherently illogical about it.
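The "unable to answer" point can be made concrete with a tiny sketch (the options A, B, C are purely illustrative stand-ins for any cyclic preference):

```python
# Intransitive (cyclic) preferences: A > B, B > C, C > A.
prefers = {("A", "B"), ("B", "C"), ("C", "A")}
options = {"A", "B", "C"}

# An option is a coherent answer to "which do you want?" only if
# nothing else is strictly preferred to it.
unbeaten = [x for x in options
            if not any((y, x) in prefers for y in options)]

print(unbeaten)  # → [] — every option loses to something, so there is no stable choice
```

A merely cyclic *plan* (Chicago → New York → San Francisco → Chicago) doesn't have this problem, because at each moment there's a well-defined next step rather than a contradiction.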
But why would the people who are currently in charge of AI labs want to do that, when they could stay in charge and become god-kings instead?
Okay, but you're not comparing like with like. Terminator 2 is an action movie, and I agree that action movies have gotten better since the 1960s. But in terms of sci-fi concepts introduced per second, I would suspect 2001 has more. Some movies from the 1990s that are more straight sci-fi would be Gattaca or Contact, but I don't think many people would consider these categorically better than 2001.
Women are using AI models to create "better" versions of their face and then asking plastic surgeons to make them look like that. So even if the surgery comes out exactly as intended, the effect is to make people look more like AI slop in real life. But apparently AI slop is like that because it's what the modal person tends to upvote, so a lot of people won't see any problem.