5th-generation military aircraft are heavily optimised to reduce their radar cross-section. It is this ability above all others that makes the F-35 and the F-22 so capable - modern anti-aircraft weapons are very good, so the only safe way to fly over a well-defended area is not to be seen.
But wouldn't it be fairly trivial to detect a stealth aircraft optically?
This is what an F-35 looks like from underneath at about 10 by 10 pixels:
You and I can easily tell what that is (take a step back, or squint). So can GPT-4:
The image shows a silhouette of a fighter jet in the sky, likely flying at high speed. The clear blue sky provides a sharp contrast, making the aircraft's dark outline prominent. The jet's design appears sleek and streamlined, with distinct wings and a tail fin, typical of modern military aircraft.
A specialised image classifier should also be able to do this with no problem at all, and much faster.
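For a sense of scale, here's a minimal sketch of the sort of tiny model that could handle 10-by-10 silhouettes - the architecture, names and data are illustrative assumptions, not a tested design:

```python
import torch
import torch.nn as nn

# Hypothetical example: a tiny CNN classifying a 10x10 grayscale sky patch
# as "aircraft silhouette" vs "background". Untrained - it only shows the
# scale of model the task would need.
class SilhouetteClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),  # 1x10x10 -> 8x10x10
            nn.ReLU(),
            nn.MaxPool2d(2),                            # -> 8x5x5
            nn.Flatten(),
            nn.Linear(8 * 5 * 5, 2),                    # logits: background / aircraft
        )

    def forward(self, x):
        return self.net(x)

patch = torch.rand(1, 1, 10, 10)           # one 10x10 patch of sky
logits = SilhouetteClassifier()(patch)
print(logits.softmax(dim=-1))              # class probabilities (random weights)
```

A model this small is cheap to run, which matters given how many patches per second the array described below would produce.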
The Patriot's radar has a range of about 100 km. Let's say we wanted to match that.
An F-35's wingspan is 11 m, and the 10-by-10-pixel image above is enough to recognise it, so call it one pixel per metre. To detect an F-35 100 km out in any direction, we then need one pixel per metre over the surface of a hemisphere with radius 100 km.
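The back-of-envelope arithmetic for that, using the numbers above:

```python
import math

RADIUS_M = 100_000      # 100 km detection range
PIXELS_PER_M = 1        # ~10 pixels across an 11 m wingspan

# Surface area of a hemisphere of radius r is 2 * pi * r^2.
hemisphere_area_m2 = 2 * math.pi * RADIUS_M ** 2
total_pixels = hemisphere_area_m2 * PIXELS_PER_M ** 2
print(f"{total_pixels:.2e} pixels")   # ~6.3e10 - a bit under a hundred billion
```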
That equates to a bit less than one hundred billion pixels. You can buy phones with 100-megapixel cameras for under $1,000, and the majority of the cost is not the camera. This suggests you could cover the entire sky with an array of such sensors for under a million dollars - an order of magnitude less than the cost of a radar. The rest of the infrastructure, including some beefy GPUs, would add to the bill, but not enough to raise the total by more than an order of magnitude.
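Roughly, using the figures in the paragraph above:

```python
TOTAL_PIXELS = 6.3e10         # from the hemisphere calculation above
PIXELS_PER_CAMERA = 100e6     # 100-megapixel sensor
COST_PER_CAMERA_USD = 1000    # generous - the whole phone costs less than this

cameras_needed = TOTAL_PIXELS / PIXELS_PER_CAMERA
sensor_cost_usd = cameras_needed * COST_PER_CAMERA_USD
print(f"{cameras_needed:.0f} cameras, ~${sensor_cost_usd:,.0f}")  # ~630 cameras, ~$630,000
```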
You'd need two of these arrays to triangulate - to calculate the distance and size of the object, and to exclude false positives (a small bird close to the camera). You'd probably want three for greater reliability. But that should be relatively simple to do.
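The triangulation itself is straightforward geometry. A minimal sketch, assuming each array reports a unit bearing vector toward the detection (all positions and bearings here are made-up numbers):

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    # Estimate target position from two array positions (p1, p2) and unit
    # bearing vectors (d1, d2): the midpoint of the shortest segment
    # between the two sight lines.
    A = np.stack([d1, -d2], axis=1)                  # 3x2 system in (t1, t2)
    t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return (p1 + t[0] * d1 + p2 + t[1] * d2) / 2

# Two arrays 10 km apart, target ~50 km away (all distances in metres).
target = np.array([30_000.0, 40_000.0, 8_000.0])
cam1 = np.array([0.0, 0.0, 0.0])
cam2 = np.array([10_000.0, 0.0, 0.0])
b1 = (target - cam1) / np.linalg.norm(target - cam1)
b2 = (target - cam2) / np.linalg.norm(target - cam2)

est = triangulate(cam1, b1, cam2, b2)
print(est, np.linalg.norm(est - cam1))   # recovered position and range from array 1
```

With the range known, the angular size of the blob gives you a physical size, which is what lets you throw out the nearby bird.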
This only works in good weather, but denying the enemy the use of their aircraft in good weather seems pretty useful, especially in the Middle East, where it's sunny continuously for half the year.
Also, unlike a radar, this doesn't advertise its location whenever it's used.
So why isn't this done?
My best guesses are that it would be fairly easy to camouflage the F-35, or that the performance of this solution wouldn't be good enough to guide missiles, only to detect that the aircraft is there.
You can't trivially scale up the angular resolution just by bolting more sensors together (or similar methods). It gets harder and harder to engineer lenses and sensors that meet such demanding specs.
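One way to see the optics problem is a Rayleigh-criterion back-of-envelope. The wavelength and the phone-lens aperture below are assumed round numbers, not measured specs:

```python
WAVELENGTH_M = 550e-9    # visible (green) light
RANGE_M = 100_000        # 100 km
FEATURE_M = 1.0          # want ~1 m resolution at that range

needed_angle_rad = FEATURE_M / RANGE_M                      # 1e-5 rad
# Rayleigh criterion: theta ~ 1.22 * lambda / D  =>  D ~ 1.22 * lambda / theta
needed_aperture_m = 1.22 * WAVELENGTH_M / needed_angle_rad
print(f"~{needed_aperture_m * 100:.1f} cm aperture needed")        # ~6.7 cm

# A ~5 mm phone-camera aperture, by the same formula:
phone_angle_rad = 1.22 * WAVELENGTH_M / 0.005
print(f"phone lens resolves ~{phone_angle_rad * RANGE_M:.0f} m at 100 km")  # ~13 m
```

So the sensor pixels are the cheap part; the diffraction limit means each camera needs optics far larger than anything in a phone to actually deliver one metre at 100 km.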
And aside from that, the problem behaves nonlinearly with the amount of atmosphere between you and the plane. Each bit of distortion in the air along the way compounds, potentially limiting quite harshly how far out you can get a useful image. It might be possible to work around this with AI that reconstructs from heavily distorted images, but it's far from trivial on the face of it.