I am not sure what exactly you mean by "safe" questions. Safe in what respect? Safe in the sense that humans can't do something stupid with the answer, or in the sense that the Oracle isn't going to consume the whole universe to answer the question? Well... I guess asking it to solve 1+1 could hardly lead to dangerous knowledge, and it would be incredibly stupid to build something that takes over the universe to make sure that its answer is correct.
What if asking for the sum of 1+1 causes the Oracle to devote as many resources as possible to searching for an inconsistency arising from the Peano axioms?
The Future of Humanity Institute wants to pick the brains of the Less Wrongers :-)
Do you have suggestions for safe questions to ask an Oracle? Interpret the question as narrowly or broadly as you like; new or unusual ideas are especially welcome.