- The Wind Rises
- The Dog of Flanders (a classic anime children's movie; aside from the unusual Belgian setting, there is not much to recommend it for adults - cardboard characters and almost excruciatingly slow pacing, with a few missteps like failing to establish the chronology, so the main character's eviction 'by Christmas' comes as a total surprise because the viewer still thinks it's autumn)
You didn't link to your MAL review for Wind Rises!
Short Online Texts Thread
Potential Risks from Advanced Artificial Intelligence: The Philanthropic Opportunity by Holden Karnofsky. Somehow missed this when it was posted in May.
Compare, for example, Thoughts on the Singularity Institute (SI), one of the most highly upvoted posts ever on LessWrong.
Edit: See also Some Key Ways in Which I've Changed My Mind Over the Last Several Years
What's the worst case scenario involving climate change given that for some reason no large scale wars occur due to its contributing instability?
Climate change is very mainstream, with plenty of people and dollars already working on the issue. LW and LW-adjacent groups discuss many causes that are thought to be higher impact and to have more room for additional attention.
But I realised recently that my understanding of climate-change-related risks could probably be better, and I'm not easily able to compare the scale of those risks to other causes. In particular, I'm interested in estimates of metrics such as lives lost, economic cost, and the like.
If anyone can give me a rundown or point me in the right direction that would be appreciated.
She eventually gives him the carrot pen so he can delete the recording, no?
Sure, but that doesn't change all the tax he evaded.
I saw it and I liked it, Breaking Bad reference included... I was actually surprised that there wasn't a Gazelle app like the one shown in the movie; after all, similar apps are quite common.
I had a lot of fun watching it, but the story fell apart the more I thought about it. Who was behind the conspiracy? Why did they think it made any sense at all?
Not exactly a logical failure, but what's going to happen with our idealistic cop and her friendship with an organized crime boss? I'm hoping this will be the central issue driving a sequel.
Not to mention all that tax evasion never actually got resolved.
CGP Grey has read Bostrom's Superintelligence.
Transcript of the relevant section:
Q: What do you consider the biggest threat to humanity?
A: Last Q&A video I mentioned opinions and how to change them. The hardest changes are the ones where you're invested in the idea, and I've been a techno-optimist 100% all of my life, but [Superintelligence: Paths, Dangers, Strategies] put a real asterisk on that in a way I didn't want. And now Artificial Intelligence is on my near term threat list in a deeply unwelcome way. But it would be self-delusional to ignore a convincing argument because I don't want it to be true.
I like how this response describes motivated cognition, the difficulty of changing one's mind, and the Litany of Gendlin.
He also apparently discusses this topic on his podcast, and links to the Amazon page for the book in the description of the video.
Grey's video about technological unemployment was pretty big when it came out. It seemed to me at the time that he wasn't far from realising that increasing AI capability had other plausible implications as well, so it's cool to see that it happened.
Is there a fanfic in which Cassandra did not tell people the future but simply told them 'what not to do', lied and schemed her way to the top, and saved Troy...
[Survey Taken Thread]
By ancient tradition, if you take the survey you may comment saying you have done so here, and people will upvote you and you will get karma.
Let's make these comments a reply to this post. That way we continue the tradition, but keep the discussion a bit cleaner.
Took it!
It ended somewhat more quickly this time.
Being around here has made me think that I already know everything interesting about the world, and has suppressed the excitement and joy I used to get from many minor things. I also feel like my sense of wonder has diminished. As I write this, I am a little unhappy and in a period of depression, but I had similar feelings, if less intense, even before this period.
I was wondering whether you have any advice on how to restore this; or even better, how to "forget" as much rationality and transhumanism as possible (if not actually forgetting, then at least "to think and feel as I did before I read the Sequences")?
LessWrong has, if anything, made me more able to derive excitement and joy from minor things, so if I were you I would check whether LW is really to blame, or otherwise find out if there are other factors causing this problem.