While far-UVC LEDs are not around the corner, I think Kr-Cl excimer lamps might already be good enough.
When we wrote the original post on this, it was not clear how readily covid was spreading through the air, but I think it is now clear that covid can hang around in the air for a long time (on the order of minutes or hours rather than seconds) and still infect people.
It seems that a power density of 0.25 W/m^2 would probably be enough to sterilize air in 1-2 minutes, meaning that a 5 m x 8 m room would need a 10 W far-UVC source. Assuming 2% wall-plug efficiency, that 10 W source needs 500 W electrical, which is certainly feasible; in the days of incandescent lighting you would have had a few 100 W bulbs in a room that size anyway.
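As a sanity check, here is that arithmetic as a minimal Python sketch. It assumes, as above, that the 0.25 W/m^2 target is applied over the floor area and that wall-plug efficiency is 2% (both assumptions, not measured values):

```python
# Back-of-the-envelope check of the lamp-power estimate above.
# Assumptions (mine, for illustration): the 0.25 W/m^2 target is
# applied over the floor area, and wall-plug efficiency is 2%.

target_irradiance = 0.25   # W/m^2, power density for 1-2 minute sterilization
room_area = 5 * 8          # m^2, floor area of a 5 m x 8 m room
efficiency = 0.02          # fraction, assumed Kr-Cl excimer wall-plug efficiency

uvc_output = target_irradiance * room_area   # W of far-UVC needed
electrical_input = uvc_output / efficiency   # W of electrical input needed

print(f"Far-UVC output needed: {uvc_output:.0f} W")        # -> 10 W
print(f"Electrical input:      {electrical_input:.0f} W")  # -> 500 W
```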
EDIT: Having looked into this a bit more, it seems that right now the low efficiency of excimer lamps is not a binding constraint because the legally allowed far-UVC exposure is so low.
"TLV exposure limit for 222 nm (23 mJ cm^−2)"
23 mJ per cm^2 per day works out to only about 0.0027 W/m^2 averaged over a day, so you really don't need much power before you hit the legal limit.
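A quick unit-conversion sketch, assuming the allowed daily dose is spread evenly over 24 hours (my assumption; an 8-hour occupancy assumption would give a proportionally higher figure):

```python
# Convert the 222 nm TLV daily dose into an equivalent average irradiance.
# Assumption (mine): the allowed dose is spread evenly over a 24-hour day.

tlv_dose = 23e-3                 # J/cm^2 per day (23 mJ/cm^2)
dose_per_m2 = tlv_dose / 1e-4    # J/m^2 per day (1 cm^2 = 1e-4 m^2) -> 230
seconds_per_day = 24 * 60 * 60   # 86400 s

avg_irradiance = dose_per_m2 / seconds_per_day
print(f"Average irradiance at the TLV: {avg_irradiance:.4f} W/m^2")  # ~0.0027
```

That is roughly a hundredth of the 0.25 W/m^2 sterilization figure above, which is why the legal exposure limit, rather than lamp efficiency, is currently the binding constraint.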
Yes, certain places like preschools might benefit even from an isolated installation.
But that is kind of exceptional.
The world isn't an efficient market, especially because people are kind of set in their ways and like to stick to the defaults unless there is strong social pressure to change.
Far-UVC probably would have a large effect if a particular city or country installed it.
But if only a few buildings install it, then it has essentially no effect, because people just catch the bugs elsewhere.
Imagine the effect of treating the sewage from just one house while the sewage from a million other houses still flows untreated into the river: there would be essentially no effect.
OK, so from the looks of that, it basically just went along with a fantasy he already had. But this is an interesting case and an example of the kind of thing I am looking for.
OK, but this is sort of circular reasoning, because the only reason people freaked out is that they were worried about AI risk.
I am asking for a concrete bad outcome in the real world caused by a lack of RLHF-based ethics alignment, one that isn't just people getting worried about AI risk.
"alignment has always been about doing what the user/operator wants"
Well, it has often been about not doing what the user wants, actually.
"giving each individual influence over the adoption (by any clever AI) of those preferences that refer to her."
Influence over the preferences of a single entity is much more conflict-prone.
Trying to give everyone overlapping control over everything they care about in such spaces introduces contradictions.
The point of ELYSIUM is that people get control over non-overlapping places. There are some difficulties where people have preferences over the whole universe, but the real world shows us that those are a smaller concern than the direct, local preference to have your own volcano lair all to yourself.
"catgirls are consensually participating in a universe that is not optimal for them because they are stuck in the harem of a loser nerd with no other males and no other purpose in life other than being a concubine to Reedspacer"
And the problem with saying "OK, let's just ban the creation of catgirls" is that then maybe Reedspacer builds a volcano lair just for himself and plays video games in it, and the catgirls whose existence you prevented would scream bloody murder (if they could), because you took away a very good existence that they would have enjoyed, and you also made Reedspacer sad.
Since a few people have mentioned the Miller/Rootclaim debate:
My hourly rate is $200. I will accept a donation of $5000 to sit down and watch the entire Miller/Rootclaim debate (17 hours of video content plus various supporting materials) and write a 2000-word piece describing how I updated on it and why.
Feel free to message me if you want to go ahead and fund this.