Here are two self-contained algorithmic questions that have come up in our research. We're offering a $5k bounty for a solution to either of them: either an algorithm, or a lower bound under any hardness assumption that has appeared in the literature.
Question 1 (existence of PSD completions): given m entries of an n×n matrix, including the diagonal, can we tell in Õ(n + m) time whether it has any (real, symmetric) positive semidefinite completion? Proving that this task is at least as hard as dense matrix multiplication or PSD testing would count as a resolution.
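As a point of comparison (not a solution): every fully specified principal submatrix of a PSD matrix must itself be PSD, so the specified 2×2 minors already give a cheap necessary condition that runs in the target O(n + m) time. A minimal Python sketch, where the interface (a `diag` array and an `entries` dict) is our own convention:

```python
def necessary_2x2_check(n, diag, entries):
    """Necessary condition for a PSD completion to exist: the diagonal is
    nonnegative and every specified off-diagonal entry M_ij satisfies
    M_ij^2 <= M_ii * M_jj (i.e. each specified 2x2 principal submatrix
    is PSD).  `entries` maps pairs (i, j) with i < j to specified values.
    Runs in O(n + m).  Passing this check does NOT guarantee a completion
    exists in general: larger specified principal submatrices can fail.
    """
    if any(d < 0 for d in diag):
        return False
    return all(v * v <= diag[i] * diag[j] for (i, j), v in entries.items())

# [[1, 2], [2, 1]] is fully specified with eigenvalues 3 and -1: no completion.
print(necessary_2x2_check(2, [1.0, 1.0], {(0, 1): 2.0}))  # False
# With the off-diagonal entry unspecified, x = 0 completes it to the identity.
print(necessary_2x2_check(2, [1.0, 1.0], {}))             # True
```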
Question 2 (fast “approximate squaring”): given an n×n matrix A and a set of m entries of A², can I find some PSD matrix that agrees with A² in those m entries in Õ(n + m) time?
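For scale, there is a trivial but slow baseline, assuming A is symmetric so that A² is automatically PSD: square A outright. It agrees with A² on every entry, but costs a full dense matrix multiplication rather than the Õ(n + m) target. A sketch (the function name is our own):

```python
import numpy as np

def naive_agreeing_psd(A, wanted):
    """Slow baseline for 'approximate squaring', assuming A is symmetric so
    that A^2 = A^T A is itself PSD: just return A^2.  It agrees with A^2 on
    every requested entry in `wanted` (which is therefore unused here), but
    takes a full O(n^3) / O(n^omega) multiplication instead of O~(n + m).
    A fast solution would only need to be correct on the listed entries."""
    return A @ A

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # symmetric input
M = naive_agreeing_psd(A, [(0, 1), (2, 3)])
print(np.all(np.linalg.eigvalsh(M) >= -1e-9))  # True: A^2 is PSD
```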
We'll pay $5k for a solution to either problem. The offer is open for each problem for 3 months or until the problem gets solved (whichever happens first). Winners are welcome to publish solutions independently. Otherwise, if the result ends up being a significant part of a paper, we’ll invite them to be a coauthor.
We’ll also consider smaller prizes for partial progress, or anything that we find helpful for either solving the problem or realizing we should give up on it.
To understand the motivation for these questions, you can read our paper “Formalizing the presumption of independence,” in particular Appendix D.7.2. ARC is trying to find efficient heuristic estimators as a formalization of defeasible reasoning about quantities like the variance of a neural network's output. These two questions are very closely related to one of the simplest cases where we haven't yet found any reasonable linear-time heuristic estimator.
We don’t expect to receive many incorrect proposals, but if we receive more than 5 we may start applying a higher standard in order to save time. If we can’t understand a solution quickly, we may ask you to provide more details, and if we still can’t understand it, we may reject it. We expect a correct solution to be about as clear and easy to verify as a paper published at STOC.
For both problems, it’s OK if we incorrectly treat a matrix as PSD as long as all of its eigenvalues are at least −ϵ for a small constant ϵ > 0. Õ(·) hides polylogarithmic factors in n, m, and the magnitude of the largest matrix entry. Feel free to ask for other clarifications on our question on Math Overflow, on Facebook, or by email.
To submit a solution, send an email to prize@alignment.org.
That's a good question. From what I've seen, PSD testing can be done by attempting a Cholesky decomposition (writing the matrix as LL∗ with L lower-triangular) and seeing whether it fails. The Cholesky decomposition is an LU decomposition in which the lower-triangular L and upper-triangular U are simply taken to have the same diagonal entries, so PSD testing should have the same complexity as LU decomposition. Wikipedia cites Bunch and Hopcroft (1974), who show that LU decomposition can be done in O(n^(log₂ 7)) ≈ O(n^2.81) time via Strassen's algorithm, and presumably the more modern matrix multiplication algorithms also give a corresponding improvement for LU.
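The Cholesky-based test can be sketched in a few lines. One caveat: numpy's `cholesky` demands strict positive definiteness, so the sketch below adds a tiny diagonal shift, in the spirit of the ϵ tolerance the post allows:

```python
import numpy as np

def is_psd_via_cholesky(M, eps=1e-9):
    """Attempt a Cholesky factorization of M + eps*I and report whether it
    succeeds.  numpy's cholesky requires strict positive definiteness, so
    the small eps shift lets exactly-singular PSD matrices pass, matching
    the tolerance the post allows (eigenvalues >= -eps count as PSD)."""
    try:
        np.linalg.cholesky(M + eps * np.eye(M.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

print(is_psd_via_cholesky(np.array([[2.0, 1.0], [1.0, 2.0]])))  # True
print(is_psd_via_cholesky(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False (eigenvalue -1)
```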
I also doubt that PSD testing is hard for matrix multiplication, even though you can get farther than you'd think. Given a positive-definite matrix A whose inverse we are interested in, consider the 2n×2n block matrix (A, I; I, C), with identity off-diagonal blocks. It is positive-definite if and only if all leading principal minors are positive. The minors that are minors of A are positive by assumption, and the bigger minors are equal to det(A) times the corresponding minors of C − A⁻¹, so altogether the big matrix is positive-definite iff C − A⁻¹ is. Continuing in this direction, we can recover any specific entry of A⁻¹ to accuracy ϵ in O(PSD) time (times log(1/ϵ)). This is not enough at all to get the full inverse.
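The binary-search reduction can be illustrated concretely for a diagonal entry: by the Schur complement, the bordered matrix [[A, e_i], [e_iᵀ, c]] is PSD iff c ≥ e_iᵀ A⁻¹ e_i, so log(1/ϵ) PSD tests pin down (A⁻¹)_ii. A sketch with names of our own choosing, where each PSD test is done via an eigenvalue computation just to demonstrate the reduction:

```python
import numpy as np

def inv_diag_entry_via_psd_tests(A, i, hi=1e6, iters=60):
    """Recover (A^{-1})_{ii} for positive-definite A using only PSD tests:
    the bordered (n+1)x(n+1) matrix [[A, e_i], [e_i^T, c]] is PSD iff its
    Schur complement c - e_i^T A^{-1} e_i is >= 0.  Binary search on c
    takes log(hi/eps) tests.  `hi` is an assumed upper bound on the answer."""
    n = A.shape[0]
    e = np.zeros(n); e[i] = 1.0

    def psd(c):
        # Stand-in PSD test; a fast PSD tester would be dropped in here.
        M = np.block([[A, e[:, None]], [e[None, :], np.array([[c]])]])
        return np.min(np.linalg.eigvalsh(M)) >= -1e-9

    lo_c, hi_c = 0.0, hi
    for _ in range(iters):
        mid = (lo_c + hi_c) / 2
        if psd(mid):
            hi_c = mid
        else:
            lo_c = mid
    return hi_c

A = np.array([[4.0, 1.0], [1.0, 3.0]])
est = inv_diag_entry_via_psd_tests(A, 0)
print(abs(est - np.linalg.inv(A)[0, 0]) < 1e-6)  # True: recovers 3/11
```

Off-diagonal entries follow by polarization (querying e_i + e_j and e_i − e_j), but as noted, repeating this n² times gives nothing better than n² PSD tests for the full inverse.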