All of javva209's Comments + Replies

I'm rated ~1600 on Lichess and would participate in whichever role that rating fits best.

I have some questions, such as which time controls will be used, but I'm most interested in how you plan to have the "C" group give advice to the "A" group. Would they just give notation (for instance, Re1), or would they type or speak a little context alongside it, such as "centralize your pieces" or "complete your development by activating your rook"?

I live in the Bay Area and am generally available on weekdays any time after 5 PST.

My only concern i...

Zane
Unsure about the time controls at the moment; see my response to aphyer. The advisors would be able to give the A player justification for the move they've recommended. The concern that A might not be able to understand the reasoning that the advisors give them is a valid one, and that's the whole point of the experiment! If A can't follow the reasoning well enough to determine whether it's good advice, then (says the analogy) people who are asking AIs how to solve alignment can't follow their reasoning well enough to determine whether it's good advice.

Is AGI inconsistent with the belief that there is other sentient life in the universe? If AGI is as dangerous as Eliezer states, that danger is by no means restricted to Earth, much less our own solar system. Wouldn't alien intelligences (both artificial and biological) have a strong incentive to either warn us about AGI or eliminate us before we create it, for their own self-preservation?
So either we aren't even close to AGI and the intergalactic AGI police aren't concerned, or AGI isn't a threat, or we are truly alone in the universe, or the universe is so vast an...

jrincayc
I agree with your comment. Also, if any expansionist, deadly AGI had existed in our galaxy, say, 100,000 years ago, it would already have reached Earth and wiped us out. So we can more or less rule out nearby expansionist, deadly AGIs (and similar biological aliens). What that actually tells us about the deadliness of AGIs is an interesting question. It is possible that destruction by AGI (or some other destructive technological event) is usually fairly localized and so only destroys the civilization that produced it. Alternatively, we just happen to be in one of the few quantum branches that has not yet been wiped out by an ED-AGI, and we are only here discussing it because of survival bias.