The situation reminds me of the relationship between math and physics. When the necessary mathematical tools are missing, physicists invent crude new ones to "just get it done" rather than wait for the math to catch up. Then the mathematicians swoop in to refine, polish, and beautify them, all while scoffing at the ugly earlier implements.
Some notable examples that come to mind are calculus, conservation laws, Dirac's delta and bra-ket notation in quantum mechanics, path integrals, renormalization.
Lest some overzealous reader misunderstand my point, I do not intend to badmouth math and mathematicians. The elegant new tools often lead to a better understanding of the underlying physical phenomena and, consequently, to new discoveries in physics. The same can rarely be said of philosophy.
Calculus is not a fair example, because the disciplines of mathematics and physics were not separated at the time; the work of Newton and Leibniz was the best "mathematics" (as well as "physics") of the era. Noether's theorem is not an example at all but a counterexample: a mathematical theorem, proved by a mathematician, that provided new insights into physics.
Dirac's delta and Feynman path integrals are fair examples of your point.
In Less Wrong Rationality and Mainstream Philosophy, Conceptual Analysis and Moral Theory, and Pluralistic Moral Reductionism, I suggested that traditional philosophical conceptual analysis often fails to be valuable. Neuroscientist V.S. Ramachandran has recently made some of the same points in a polite sparring match with philosopher Colin McGinn over Ramachandran's new book The Tell-Tale Brain: