Cameras work in the pixel and angular domain; absolute spatial distances don't really matter. In fact, target discrimination in IR against a mostly cold background is much easier than optical discrimination at low altitude against a cluttered background. The differences, of course, lie in the speeds, distances, and altitudes of the respective platforms and targets that each system is oriented towards.
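To illustrate the angular-domain point, here's a back-of-the-envelope sketch (the IFOV and target figures are hypothetical examples, not from the original): what matters for detection is how many pixels a target subtends, which is set by angular size, not by range or target size alone.

```python
def pixels_on_target(target_size_m: float, slant_range_m: float, ifov_rad: float) -> float:
    """Approximate pixels a target spans across one axis, using the
    small-angle approximation: angular size ~= size / range."""
    angular_size_rad = target_size_m / slant_range_m
    return angular_size_rad / ifov_rad

# Hypothetical sensor with 0.1 mrad IFOV:
# a 15 m fighter at 10 km and a 0.3 m quadcopter at 200 m
# subtend the same angle, so they span the same number of pixels.
print(pixels_on_target(15.0, 10_000.0, 1e-4))  # 15.0
print(pixels_on_target(0.3, 200.0, 1e-4))      # 15.0
```

The two cases landing on the same pixel count is the point: from the camera's perspective, a small close target and a large distant one are the same tracking problem.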
The problem I have with your suggestion is that it implies the same technologies found on a consumer quadcopter drone (as advanced as they are) could offer a major capability uplift if transplanted onto a military fighter jet.
Obviously military equipment has its own unique set of requirements, but physics works the same whether you're civilian or military, and the distinction between military and civilian technology is often not nearly as big as most people think.
Case in point: a fighter's EO-DAS sensor fusion obviously has the added requirements of defeating dazzlers and tracking targets that actively try to prevent being tracked, and it might include filters to discriminate missiles from aircraft, but at the end of the day you're still tracking aerial targets and then taking action based on the tracking data; the technology isn't fundamentally different.