One of the mandates of NASA's Armstrong Flight Research Center (AFRC) is participating in the flight testing of experimental aircraft, which includes monitoring these tests with long-range, ground-based cameras. Because these cameras track and capture flight tests occurring multiple kilometers away, the imagery they collect is often degraded by atmospheric turbulence between the camera and the subject. In the summer of 2015, EM Photonics delivered the ATCOM TM-1, a rack-mountable system capable of taking live HD-SDI video from a NASA long-range tracking camera, enhancing that video in real time, and outputting the result in the same format; however, the current approach still requires manual user configuration to achieve the best results. Our work in this project will focus on two areas: automating system configuration so that the system adapts on its own to changing system and scene parameters, and improving the human factors of an operator's use of an inline video processing solution. The former requires research on methods for estimating turbulence and determining motion in complex videos with significant distortion and warping. In the course of this project, we will develop technology in four primary areas, each useful in its own right but with the ultimate goal of including them as features in the ATCOM TM-1 system currently used by NASA AFRC.