Augmented Reality for Precision Navigation: Enhancing Performance in High-Stress Operations

June 1, 2012
Augmented reality delivers two important military capabilities to the warfighter: situational awareness and precision piloting, both key to survival on the battlefield. Look-ahead, drive-to-position guidance based on accurate GPS positions extends the value of GPS to high-speed operations and very close maneuvering, situations in which humans cannot cycle through a chart or map display and then place themselves in the real world to make maneuvering decisions.

By Thomas Zysk, Jeffory Luce, and James Cunningham

Augmented reality (AR) is now part of the everyday technology vernacular. In one popular form, AR enhances football broadcasts with overlaid information such as the first-down line. A much more robust capability for high-performance navigation systems uses accurate GPS and heading sensors to register a virtual world geographically over a real-world, real-time view. In a military context, AR can add critical context to situational awareness.

AR for military use was originally developed as a maritime equivalent to the aviator’s heads-up display. Evaluations using a task-load index function showed a 342 percent improvement in side-task operator performance when using AR. Operators do not have to make the mental conversion from 2D (map or chart view) to 3D real-world view. This translation is where errors can be made in high-stress scenarios and forms the root cause of many accidents. AR provides a game-changing capability to enhance warfighter performance when it matters and is invaluable during high-stress, dynamic operations.

Amphibious assault vehicle (AAV), U.S. Marine Corps.

In this navigation context, AR was developed for use in low-visibility situations, such as navigating in dense fog or at night during lights-out missions. The technology can provide a visual depiction of critical points of interest regardless of real-world visibility. AR provides the means to integrate sensors, geographic information systems, and related systems into a cohesive visual display that overcomes environmental limitations and constraints such as closed-hatch operations on military vehicles.

AR delivers two important military capabilities to the warfighter: situational awareness and precision piloting capabilities, both key to survival on the battlefield.

Situational Awareness. Any information with a geographical registration component can be overlaid on the real-world view in a single composite display format. This can include track data, threat locations, friendly-force locations, obstacles, and safe havens; the list grows each day. This information adds immensely to the operator’s understanding of the environment. This fused information, over a real-world, real-time view, is functionally an enhanced Common Operational Picture (COP). Operators can be more cognizant of the tactical situation day or night, in any visibility condition.

Precision Piloting. The faster one drives in an automobile, the farther down the road one must focus to stay on the highway. AR provides this look-ahead drive-to-position based on accurate GPS positions. This extends the importance of GPS to high-speed operations or very close maneuvering situations where humans cannot cycle through a chart or map display, then place themselves in the real world to make maneuvering decisions.

AR enables a rich suite of functions supporting access to and maintenance of a COP, with demonstrated maneuver accuracy. For the Augmented Reality Visualization for the Common Operational Picture (ARVCOP) system, any available situational awareness information can be overlaid on the real-world view in a clear and organized way. Operators do not have to go through the process of translating what they see on a map to what they see in front of them, a translation process that often incurs error. AR then delivers this to warfighters through a human-cognition-friendly, integrated display of sensor data and geographically registered overlays, as Figure 1 illustrates. The AR view is shown along with a two-dimensional view on the right side of the display.

Figure 1. ARVCOP display example.

Developed by the Office of Naval Research with industry partner Technology Systems Inc., ARVCOP provides a human-machine interface that can magnify the effectiveness of precision positioning. In this article, we discuss how AR is utilized in this context and the results of testing AR precision-navigation systems aboard Marine Corps amphibious assault vehicles (AAVs, see photo) on the beaches of Marine Corps Base Camp Pendleton, California.

Precision piloting, or driving accuracy, is achieved by providing the operator a drive-to point positioned relative to the vehicle’s current location. Testing showed that looking ahead and driving to a point forced the operator to self-correct for the effects of wind, waves, and current.
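To make the mechanics concrete, the drive-to point can be sketched as the location a fixed look-ahead distance farther along the planned route. The following minimal Python example uses hypothetical names and a simplified flat-earth route representation; it illustrates the general technique, not the ARVCOP implementation.

import math

def drive_to_point(route, position, lookahead_m):
    """Sketch: return the point lookahead_m ahead of the vehicle
    along a piecewise-linear planned route.

    route: list of (x, y) waypoints in local meters; position: (x, y).
    """
    # Start from the route vertex nearest the vehicle, then walk
    # forward along the route until the look-ahead distance is spent.
    nearest = min(range(len(route)),
                  key=lambda i: math.dist(route[i], position))
    remaining = lookahead_m
    for i in range(nearest, len(route) - 1):
        (x0, y0), (x1, y1) = route[i], route[i + 1]
        seg = math.dist((x0, y0), (x1, y1))
        if remaining <= seg:
            t = remaining / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        remaining -= seg
    return route[-1]  # past the route end: steer for the final waypoint

Because any drift from wind, waves, or current changes where this point appears relative to the vehicle’s heading, steering toward it produces the self-correcting behavior observed in testing.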

AR is exemplified by a software application that combines real-time video imagery with virtual images to provide a new dimension in navigation piloting accuracy. Figure 2 is an AR display on a ferry boat showing the navigational route marked by rails.

Figure 2. Real world with augmented reality.

AR can overlay critical chart information such as buoys and channel markers, as well as radar or automatic identification system (AIS) contacts. In fact, any information that has a geo-registration component (a geographic location attached) can be precisely overlaid on a real-time video or infrared camera view. Operators have reported they are able to maneuver in unfamiliar waters at high speed with confidence, especially at night or in inclement weather (Figure 3).
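The geo-registration step itself reduces to a camera projection. The sketch below, in Python with hypothetical names, uses a pinhole camera model and a flat-earth approximation (elevation, camera tilt, and lens distortion omitted); it illustrates the general idea rather than the ARVCOP algorithm.

import math

def project_to_pixel(obj_en, cam_en, heading_deg, focal_px, cx, cy):
    """Sketch: place a geo-located object (local east/north meters)
    into a camera image, given vehicle position and heading.

    heading_deg: heading in degrees clockwise from north.
    focal_px: focal length in pixels; (cx, cy): principal point.
    Returns (u, v) pixel coordinates, or None if behind the camera.
    """
    de = obj_en[0] - cam_en[0]
    dn = obj_en[1] - cam_en[1]
    h = math.radians(heading_deg)
    # Rotate world offsets into the camera frame: x right, z forward.
    x = de * math.cos(h) - dn * math.sin(h)
    z = de * math.sin(h) + dn * math.cos(h)
    if z <= 0:
        return None  # object is behind the image plane
    u = cx + focal_px * (x / z)
    v = cy  # flat-earth sketch: vertical placement omitted
    return (u, v)

Accurate GPS position and heading inputs matter here because both feed directly into this transformation; any error shifts every augmented object on screen.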

Figure 3. Obscured visibility with augmented reality.

An operator using AR does not have to look down at a chart, radar, or AIS display, and then up at the real world to put the information into context. Charts, radar, and AIS output 2D information that must be made relevant to a 3D world. Analysis shows that converting 2D to 3D is a strenuous and error-prone task for the brain. Accidents can be caused by an initial mistake, which is then compounded by other decisions made with incorrect information. Figure 4 shows how AR automates the conversion process, allowing the human to focus on other relevant tasks.

Figure 4. Augmented Reality Visualization of the Common Operational Picture (ARVCOP) block diagram.

R&D Hardware

AR applications on AAVs have demonstrated the technology’s utility on land, in water, and through the hazardous surf zone, delivering precise routing through cleared transit lanes. The system is intuitive to operate: operators with little or no training in AR systems executed precise maneuvers through lanes planned with bends and turns. The AR system used a military GPS and heading device, while electronic chart and tactical data brought positional context to the display. A virtual world was created, and software algorithms draped it over a real-world camera view, creating an AR display (Figure 5) for the AAV test.

Figure 5. AAV with commercially available R&D ARVCOP hardware.

Camp Pendleton Tests. In 2009, rigorous testing was completed for the ARVCOP system using AAVs in the surf at Marine Corps Base Camp Pendleton. Safe maneuver lanes were marked with mine-like objects and other hazards. Complex routes that included turns and zigzag patterns were planned toward the beach. Routes were delivered to vehicles using a radio circuit, and adjustments to the planned route were made on the fly to adapt to changing tactical situations.

The AAV is a 26-ton vehicle that is challenging to operate in a surface environment with wind, waves, and currents. Hardware employed ranged from legacy devices, including a magnetic heading device, to modern devices. With the research and development (R&D) hardware, the results were dramatic compared with traditional means of navigating assault lanes. The technology enabled new mission concepts, such as irregular routes ashore and avoidance of hazards sighted by other forces while the mission was in progress. The evaluation criteria for these tests were cross-track errors (CTEs), measured relative to a planned route. A separate, high-accuracy GPS was used for truth data to measure the accuracy of the route driven. Figure 6 shows the video camera and GPS antenna locations on the AAVs.

Figure 6. Video camera is located directly beneath the GPS antenna.

Figure 7 gives an example of the resultant AR video imagery for the R&D commercially available hardware on the AAVs. Figure 8 shows the planned routes for the R&D test evaluations. The distance offshore was 946 meters, and the planned total route length was 1,990 meters.

Figure 7. ARVCOP video using R&D hardware.

Figure 8. Planned route for the R&D testing.

Video Augmentation Accuracy

To determine the position accuracy of the augmented figures drawn on the video images, time-encoded images were captured. The augmented images were captured by ARVCOP using both the Civilian-Miniature Integrated GPS/INS Tactical System (C-MIGITS III) and the Tactical Navigation Digital Compass System (TACNAV) as input devices. Typically, multiple images are used to determine reference-frame biases between the camera and the inertial measurement unit, but in this case multiple-image solutions lacked convergence. For this analysis, single-image solutions were generated. Figure 9, which shows locations of virtual and real objects, is an example of an image used in this analysis. The reference location of the virtual object is the bottom of the green post. The real-object coordinates input to ARVCOP were generated using a GPS survey and have centimeter-level accuracy. Figure 9 illustrates the inaccuracies in the system. During this calibration test, the augmentation showed errors of about 100 mrad (6 degrees) in the display of the virtual objects. (Authors’ note: This paragraph accurately reflects system performance on that day three years ago. Shortly after the test, system modifications were made that eliminated much of that error.)
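To give a feel for what a 100-mrad error means on screen, the angular error can be estimated from the pixel offset between a virtual marker and its surveyed real counterpart. A minimal sketch, assuming a pinhole camera; the field-of-view and image-width values below are illustrative, not those of the test camera.

import math

def pixel_offset_to_mrad(offset_px, image_width_px, hfov_deg):
    """Sketch: convert the horizontal pixel offset between a virtual
    object and its surveyed real counterpart into an angular
    registration error in milliradians.
    """
    # Pinhole model: derive focal length in pixels from the
    # horizontal field of view and the image width.
    focal_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return 1000 * math.atan(offset_px / focal_px)

# With assumed values (640-pixel-wide image, 40-degree horizontal FOV),
# a 100-pixel offset corresponds to roughly 113 mrad, about 6.5 degrees.
print(round(pixel_offset_to_mrad(100, 640, 40), 1))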

Figure 9. ARVCOP image captured showing virtual and real objects.

Test Results

Evaluation of the AAV operation using ARVCOP as a driver’s aid was done by comparing the planned route with the actual route driven. The comparisons were made by finding the normal (perpendicular) distance between the route input to ARVCOP and the vehicle’s estimated positions, which were generated using a GPS relative-positioning technique; no vehicle heading information was used, and only horizontal components were compared. These differences between planned and executed routes are the CTEs. As mentioned earlier, both the C-MIGITS III and the TACNAV were used as input to ARVCOP for these tests. Figure 10 shows an example of the raw data, with the ARVCOP planned route (blue) overlaid with the GPS-estimated positions (red). In this example, ARVCOP used C-MIGITS III heading input updated at a 10-Hz rate.
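In code, this evaluation amounts to finding the perpendicular distance from each estimated position to the nearest segment of the planned route. A minimal Python sketch, with hypothetical names and positions assumed to be in a local horizontal frame in meters (not the evaluation software itself):

import math

def cross_track_error(route, pos):
    """Sketch: CTE as the minimum perpendicular distance from a
    horizontal position to a piecewise-linear planned route.

    route: list of (x, y) waypoints in meters; pos: (x, y).
    """
    best = float("inf")
    for (x0, y0), (x1, y1) in zip(route, route[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg_len2 = dx * dx + dy * dy
        # Project pos onto the segment, clamped to its endpoints.
        t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0,
            ((pos[0] - x0) * dx + (pos[1] - y0) * dy) / seg_len2))
        best = min(best, math.dist(pos, (x0 + t * dx, y0 + t * dy)))
    return best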

Figure 10 illustrates how the AAV stayed on the planned course, showing only small deviations.

Figure 10. AAV planned and actual route, Run 2.

When TACNAV was employed to supply heading information, similar results were seen. Figure 11 shows the first run made with TACNAV heading estimates. The AAV stayed on the planned route except for some minor deviations.

Figure 11. AAV planned and actual route, Run 5.

Figure 12 shows the second run using TACNAV heading information. In this instance, larger and more frequent excursions from the planned route are shown. The differences between Figures 11 and 12 are the result of the driver’s interpretation of the ARVCOP display. When the TACNAV was used as input to ARVCOP, the driver’s display showed greater instability than when the C-MIGITS III was used. The instability was a 1-Hz, few-degree shift in the augmentation on the video, corresponding to the TACNAV input rate. Figure 12 shows the result of the driver trying to follow all the augmentation shifts. When the driver ignored the sudden shifts in augmentation and drove a perceived average route, the resulting track was smoother, as Figure 11 shows. The 1-Hz input rate and the inherent TACNAV variations both contributed to the augmentation’s jumpy appearance.
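One common remedy for this kind of jumpiness, offered purely as an illustration and not as the modification later made to the system, is to slew the displayed heading toward each new 1-Hz estimate over several video frames instead of snapping to it, wrapping correctly at north:

def smooth_heading(displayed_deg, new_deg, max_step_deg):
    """Sketch: slew a displayed heading toward a new heading estimate,
    limiting each frame's change to max_step_deg and wrapping
    correctly at 0/360 degrees.
    """
    # Signed shortest-path difference, in (-180, 180].
    diff = (new_deg - displayed_deg + 180.0) % 360.0 - 180.0
    step = max(-max_step_deg, min(max_step_deg, diff))
    return (displayed_deg + step) % 360.0

# Example: starting at 358 degrees with a new 2-degree estimate, the
# display crosses north smoothly in 1-degree steps over four frames.
h = 358.0
for _ in range(4):
    h = smooth_heading(h, 2.0, 1.0)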

Figure 12. AAV planned and actual route, Run 6.

Figure 13 shows the tracks of all the runs from the February 2009 tests that used the C-MIGITS III, except for runs 7 and 8. Run 7 was excluded because high surf caused its early termination when the vehicle was ordered to shore by the safety officer. The driver’s display was lost during Run 8 because of a loose cable and the test was aborted.

Figure 13. AAV planned and actual route, C-MIGITS-III heading data.

Table 1 shows the CTE statistics for the C-MIGITS III runs, and Table 2 shows the CTE statistics for the TACNAV runs. Average speed over the course varied from 4 to 5 knots; it took about 15 minutes to drive the entire route.

Discussion

Comparison of the C-MIGITS III and TACNAV heading estimates showed variations of about 3 to 5 degrees after removal of a bias. Investigation of the relationship between heading-angle error and heading angle showed that, even after TACNAV calibration, significant heading-error correlations remained in its estimates. Using the TACNAV as a source of heading information showed that the slower 1-Hz update rate and inherent variations of the sensor degraded the augmentation software’s performance. For example, when using the TACNAV, the augmented lane boundaries occasionally jumped a few degrees upon receipt of heading-estimate updates. This was particularly evident after vehicle turns. The C-MIGITS III’s 10-Hz update rate and higher accuracy estimates enabled ARVCOP augmentation without distracting artifacts and provided the driver with more accurate navigation information. The ARVCOP-augmented objects were drawn on the video with a heading accuracy of about 6 degrees.

During the February 2009 R&D tests, the AAV made eight surf runs using ARVCOP with C-MIGITS III input and two runs using TACNAV input. CTE statistics for the ARVCOP C-MIGITS III testing showed rms differences of about 2.9 meters; the ARVCOP TACNAV testing showed larger rms differences of about 4.9 meters. These statistics represent the rms error between the AAV’s planned and executed routes.
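These figures are internally consistent with the mean and standard deviation reported in the Summary below: for any set of samples, rms² = mean² + std² (up to the sample-size correction in the standard deviation). A quick check in Python, assuming the Summary’s statistics describe the same C-MIGITS III runs:

import math

mean_cte, std_cte = 2.1, 2.0  # meters, from the Summary below
# hypot computes sqrt(mean^2 + std^2), the implied rms.
print(round(math.hypot(mean_cte, std_cte), 1))  # 2.9 meters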

Summary

AR technology provides a human-machine interface for a navigation system, enabling precise maneuvering. ARVCOP presents navigation data so intuitively that operators can multitask as mission performance requires while still maneuvering precisely. ARVCOP proved the concept of AR-based precise navigation in rigorous operational scenarios with the U.S. Marine Corps (USMC).

Test results for the R&D commercially available civilian GPS/INS hardware provided a CTE with a mean of 2.1 meters and a standard deviation of 2.0 meters. Operational hardware was evaluated in July 2009 over four days of testing comprising 47 runs with many drivers, in sea states ranging from 1 to 2.5. In 2010, at the Naval Surface Warfare Center, Dahlgren Division (NSWCDD) and the Naval Surface Warfare Center, Panama City Division (NSWCPC), land demonstrations were performed with similar hardware navigating cleared paths through simulated mine fields at night. Vehicles were able to transit cleared routes with no external markings. The Naval Sea Systems Command Mine Warfare Program Office (PMS 495) is now installing ARVCOP on USMC AAVs.

Acknowledgments

This work was sponsored by Brian Almquist, program officer, Ocean Battlespace Sensing Science and Technology Department, Office of Naval Research. LtCol Brian Seiffert, USMC, acting director of the Amphibious Vehicle Test Branch (AVTB), Camp Pendleton, supported the demonstration. GySgt Chapa and SSgt Schaefer, USMC, coordinated the AVTB effort. Kennard Watson, NSWCPC, coordinated the Camp Pendleton test plan. William Chambers, Maritime Technology Consulting LLC, Udayan Bhapkar, Andrew Sutter, and Alan Evans, NSWCDD, supported the tests and evaluations. Ronald Paradis, KVH Industries, Inc., supported heading sensor calibration.

Manufacturers

The C-MIGITS III is made by Systron Donner Inertial Division (www.systron.com) and TACNAV by KVH Industries (www.kvh.com).


Tom Zysk (captain, U.S. Navy, retired) has more than 35 years of experience in the Department of Defense and industry. He held positions with Raytheon and General Dynamics before joining Technology Systems Inc.

Jeffory Luce is a senior program manager at Technology Systems, Inc. (TSI). As lead for the ARVCOP program, he successfully transitioned TSI’s first project to a Program of Record.

James Cunningham has worked in GPS research and development at the Naval Surface Warfare Center, Dahlgren Division, for more than 25 years.
