The sensor footprint component gathers real-time inputs from the unmanned vehicle’s position (latitude, longitude, altitude) and the pan, tilt, and zoom parameters of an EO/IR payload to determine the sensor’s actual projection onto the ground. Because the algorithm accounts for Digital Terrain Elevation Data (DTED), the footprint can be overlaid in both 2D and 3D environments, offering clear situational awareness of where the sensor has pointed, where it is pointing, and where it is capable of pointing.
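The projection described above can be sketched in a few lines: cast the four corner rays of the sensor's field of view from the vehicle's position and intersect each with the terrain surface. Everything here is an illustrative assumption rather than the component's actual algorithm: the local ENU frame, the ray-marching terrain intersection, the small-angle corner offsets, and the `elevation` callback (which a real implementation would back with sampled DTED tiles).

```python
import math

def ray_direction(pan_deg, tilt_deg):
    """Unit vector in a local ENU frame (x=east, y=north, z=up).

    pan is azimuth clockwise from north; negative tilt points below
    the horizon. This frame and convention are assumptions for the sketch.
    """
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    ct = math.cos(tilt)
    return (ct * math.sin(pan), ct * math.cos(pan), math.sin(tilt))

def ground_intersect(origin, direction, elevation, step=5.0, max_range=20000.0):
    """March along the ray until it drops to or below the terrain surface.

    `elevation(x, y)` stands in for a DTED lookup; here it is any callable
    returning terrain height at a local-frame position.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    while t < max_range:
        x, y, z = ox + dx * t, oy + dy * t, oz + dz * t
        if z <= elevation(x, y):
            return (x, y, z)
        t += step
    return None  # ray never reached the ground (e.g. pointed at the sky)

def footprint(origin, pan_deg, tilt_deg, hfov_deg, vfov_deg, elevation):
    """Ground-intersection points of the four field-of-view corner rays.

    Adding the half-FOV offsets directly to pan and tilt is a small-angle
    approximation; a production system would rotate true corner vectors.
    """
    corners = []
    for dp, dt in [(-hfov_deg / 2, -vfov_deg / 2), (hfov_deg / 2, -vfov_deg / 2),
                   (hfov_deg / 2, vfov_deg / 2), (-hfov_deg / 2, vfov_deg / 2)]:
        direction = ray_direction(pan_deg + dp, tilt_deg + dt)
        corners.append(ground_intersect(origin, direction, elevation))
    return corners
```

For example, a sensor 1000 m above flat terrain, looking due north at a 45-degree depression, yields four footprint corners roughly 1 to 1.5 km north of the vehicle; substituting a DTED-backed `elevation` function is what lets the footprint drape correctly over relief.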