Also, it looks like the last time slot is 2200 UTC. I can participate from 1900 onward.
I will promote this in the AI Safety reading group tomorrow evening.
Can I get an email invite to the hangout?
Also, I've nailed down the time, in case people see this in the comments.
We have a number of charities that are working on different aspects of AGI risk:
- The theory of the alignment problem (MIRI/FHI/more)
- How to think about problems well (CFAR)
However, we don't have a body dedicated to making and testing a coherent communication strategy to help postpone the development of dangerous AIs.
Next Saturday I'm organising an online discussion about what we should do about this issue.
To find out when people are available, I've created a Doodle poll here. I'm trusting that Doodle handles timezones correctly; the time slots should be between 1200 and 2300 UTC. Let me know if they are not.
We'll be using the optimal brainstorming methodology.
Send me a message if you want an invite once the time has been decided.
I will take notes and post them back here.