Comments

I took the survey for the 2nd year in a row. Can't wait to see the results.

There are a few thoughts here. I mostly came here to read and educate myself on the rationality movement and this sort of thing. LW is a tremendous resource of information, and that information should be collected and brought to a place like Medium where it can be read and experienced more easily. As it stands, it is very intimidating. I think everything can be organized into three broad categories: Philosophy, Technical, and Theoretical.

In short, it's time for some media organization and distribution in such a way that people can experience it in a holistic way.

I think it's also time to do some marketing and outreach. What is our message? How can we articulate that message? How can we organize around CFAR and MIRI to do that? It's not just AI but rationality in general. Groups are helpful, but there are plenty of online resources to let people interact in their own way and make small contributions through conversation. It's not just about improving this platform but about making the amazing work happening here as accessible as we can. When someone writes something technical, an effort can and should be made to turn it into something much more approachable for people to take in and incorporate. The Sequences are intimidating; how can we break them down into something digestible?

As for community, local groups are a good idea to foster and support, but creating online accountability groups and the like can also be helpful. A social network aspect of this may be helpful as well. How can we provide support for a rationality community? How can we foster greater contributions?

As for helping people, I think digestible articles, videos, and blog posts have to be created to make this material truly accessible in a fun and exciting way that people can actually use. Could we create some teams for this?

Just putting out ideas here. I think there can be life in LW if we create something new and novel.

I know I quit commenting because there just wasn't much going on.

I took more leisure time away from the big business of The Cameron Cowan Show.

And I guess I'm saying that the sooner we think about these sorts of things, the better off we'll be. Going with a simple pleasure-good/suffering-bad model reduces the mindset of an AI to that of a two-year-old. Cultural context gives us a sense of maturity, valence or no.

You should read The Big Sort by Bill Bishop; he describes how Americans are literally, physically moving to areas that favor their political and social ideas. This makes local control easy and national control impossible.

I can't apply for the News Editor job as I am too busy with my own work but I would like to contribute and perhaps help with promotion across The Cameron Cowan Show network. Let's chat: cameron@cameroncowan.net

Where is the cultural context in all of this? How does it play in? Pain and pleasure in the West are different than in the East, just as value systems are different. When it comes to creating AGI, I think a central set of agreed-upon tenets is important. What is valuable? How can we quantify that in a way that makes sense for creating AGI? If we want to reward it for doing good things, we have to consider cultural validation. We don't steal, murder, or assault people because we have significant cultural incentives not to do so, especially in a stable country. If we can somehow give the AGI group approval (favorable opinions, verbal validation, and other things it intrinsically values, as we do), that could help. We could use our own culture to reinforce norms within its architecture.

We are the people who knew too much…

What is your measure? Does it stem from their lack of satisfaction in their work? Their lack of analysis? I feel like word count is not the right measure. Zizek is also very accessible because he works in Lacanian psychoanalysis… I need more data!

I would like to talk to you more about this for my blog. Please msg me.
