Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2005

Using Augmented Reality to Enhance Fire Support Team Training

Dennis G. Brown (1), Yohan Baillot (2), Dr. Michael P. Bailey (3), K.C. Pfluger (4), Paul Maassel (4), Justin Thomas (4), Dr. Simon J. Julier (2)
[email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]

(1) Advanced Information Technology, Naval Research Laboratory, Washington, DC 20375
(2) ITT Advanced Engineering and Sciences, Alexandria, VA 22303
(3) USMC Combat Development Center, Quantico, VA 22134
(4) ReallaeR, LLC, Port Republic, MD 20676

ABSTRACT

Live fire training keeps warfighting capabilities at peak effectiveness. However, providing realistic targets for live fire exercises is prohibitively expensive. The United States Marine Corps uses a variety of target proxies in live fire exercises, such as derelict vehicles or piles of waste, which are non-reactive and stay in fixed locations. Augmented Reality (AR) can provide realistic, animated, and reactive virtual targets, as well as special effects such as explosions, for real-world training exercises with no significant changes to the current training procedure. As part of USMC Fire Support Team (FiST) training, trainees learn how to call for fire as forward observers (FO). The FO determines the location of a target and calls for fire. After the round is fired, an instructor determines the effect on the target, and the FO adjusts. Initial FiST training takes place on a scale-model firing range using pneumatic mortars, which is where we inserted an AR system. Our system provides a head-mounted display for the forward observer and a touch screen for the instructor, each showing virtual targets on the real range. The observer can see a simulated magnified view and reticle to determine target identity and location. The instructor controls the targets through a simple interface. The FO calls for fire and a real round is fired. The instructor sees where the round lands in the augmented touch-screen view and designates the effect on the target. The forward observer sees that effect and adjusts. The system was demonstrated at Marine Corps Base Quantico in October 2004, where it was well received by mortar trainees and instructors. The system can also show virtual terrain and control measures. Future plans include testing at a full-scale live fire range such as Twentynine Palms and completing a Semi-Automated Forces (SAF) interface for more intelligent targets.

ABOUT THE AUTHORS

DENNIS G. BROWN is a Computer Scientist at the Naval Research Laboratory. He received his B.A. in Computer Science from Rice University and his M.S. in Computer Science from the University of North Carolina at Chapel Hill. His primary research interests are virtual and augmented reality for operations and training.

YOHAN BAILLOT is a computer and electrical engineer at ITT Industries. He worked on the Battlefield Augmented Reality System at the Naval Research Laboratory for over five years. He received an M.S. in electrical engineering in 1996 from ISIM, France, and an M.S. in computer science in 1999 from the University of Central Florida. His research interests are in computer graphics, 3D displays, tracking, vision, mobile augmented reality, and wearable computers. Baillot is a member of the IEEE Computer Society.
DR. MICHAEL P. BAILEY graduated from the University of North Carolina at Chapel Hill with a Ph.D. in Operations Research in 1988, and became an Assistant Professor of Operations Research at the Naval Postgraduate School in Monterey, California. He was promoted to Associate Professor in 1993 and tenured in 1994. In 1995, he took a sabbatical at the Office of the Chief of Naval Operations, Assessments Division (OPNAV N81) as a visiting scholar. There he served as an operations analyst in support of the Quadrennial Defense Review until 1997, whereupon he joined the Marine Corps as Principal Analyst, Modeling and Simulation. In December 1999, he joined the Marine Corps' Training and Education Command as Technical Director. In December 2000, the Marine Corps formed the TECOM Technology Division, with Dr. Bailey as its head. In 2005, Dr. Bailey rejoined MCCDC as the Deputy Director for Studies and Analysis, returning to his roots as an operations analyst.

PAUL MAASSEL has provided systems engineering support for modeling, simulation, and virtual world construction for the past fifteen years. Mr. Maassel was a civil servant at the Naval Aviation Maintenance Office, Naval Air Test Center, and Naval Air Warfare Center Aircraft Division, where he took delivery of the first beta release of ModSAF for the U.S. government. He served as Systems Engineer on a number of major M&S programs, including Synthetic Theater of War (STOW) and Joint Countermine Operational Simulation (JCOS).
Mr. Maassel currently manages Reallaer, LLC, a small business working to develop practical augmented reality systems for training and operations.

K.C. PFLUGER (Major, USMCR) is a subject matter and technical expert on tactics, techniques, procedures, and training for infantry units. He is a graduate of the Naval Academy and has 10 years of experience as an Infantry Officer. He has spent the last five years as the Marine Liaison to NRL, supporting developing technologies for the Fleet Marine Force. He is a Technology Transfer Specialist for Reallaer, LLC.

JUSTIN THOMAS has worked as a systems engineer on complex simulation systems for 10 years. He supported the NAVAIR Manned Flight Simulator at Patuxent River, MD for most of that time, where he led several significant projects, including the Deployable Virtual Training Environment (DVTE) and the F/A-18 Part Task Trainer. Mr. Thomas is currently the Chief Engineer for Reallaer, LLC, a small business working to develop practical augmented reality systems for training and operations.

SIMON J. JULIER is a Research Scientist for ITT Industries at the Naval Research Laboratory. He received a D.Phil. from the Robotics Research Group, Oxford University, UK. He was a technical lead on the Battlefield Augmented Reality System (BARS) project. His research interests include mobile augmented reality and large-scale distributed data fusion.
INTRODUCTION

Live fire training keeps warfighting capabilities at peak effectiveness. However, the cost of procuring real targets, only for them to be destroyed, is prohibitive. The United States Marine Corps uses a variety of target proxies, such as derelict vehicles, piles of waste, and even "pop up targets," all of which are non-reactive, stay in fixed locations from year to year, and often do not resemble the real targets. Trainees simply do not get the opportunity to fire live rounds at realistic-looking, moving targets. However, Augmented Reality (AR) can help by merging virtual entities with the real world for training exercises. In this article, we describe an AR system that provides virtual targets for the training of USMC Fire Support Teams.

Augmented Reality

In an AR system, the user wears a tracked see-through head-mounted display with stereo headphones, connected to a computer containing a database of spatial information related to the venue of the training exercise. By measuring the user's position and view direction in the real world, three-dimensional computer graphics and spatially located sounds are displayed so that they appear to exist in the real world. A miniaturized and ruggedized computer, batteries, and wireless networking make the AR system man-portable (Julier et al. 2000). Figure 1 shows a mobile AR prototype system. In the case of AR for training, the virtual information overlay consists of realistic three-dimensional renderings of entities: individual combatants, tanks, planes, ships, and so on.

Figure 1. A Wearable Augmented Reality System
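To keep virtual entities registered to the real world, each frame the tracked head pose is used to transform entity positions into the display's view. The paper does not give BARS's rendering math, so the following Python sketch is only an illustration of that registration step; the focal length, principal point, and pose conventions are assumptions for the example, not values from the fielded system.

```python
import numpy as np

def rotation_from_ypr(yaw, pitch, roll):
    """Camera-to-world rotation: yaw about +y (up), pitch about +x, roll about +z."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    ryaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rpitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rroll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return ryaw @ rpitch @ rroll

def project_to_display(world_pt, head_pos, head_rot, f=800.0, cx=320.0, cy=240.0):
    """Transform a world point into camera space using the tracked head pose,
    then apply a simple pinhole projection (assumed intrinsics). Returns pixel
    coordinates, or None when the point is behind the viewer."""
    cam = head_rot.T @ (np.asarray(world_pt, float) - np.asarray(head_pos, float))
    if cam[2] <= 0.0:
        return None
    return (cx + f * cam[0] / cam[2], cy + f * cam[1] / cam[2])

# A virtual tank fixed in the world shifts across the display as the head
# turns, which is what makes it appear stationary in the real scene.
tank = (10.0, 0.0, 30.0)
for yaw in (0.0, 0.1, 0.2):
    print(project_to_display(tank, (0.0, 1.7, 0.0), rotation_from_ypr(yaw, 0.0, 0.0)))
```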
Entities in Training Simulations

Entities in training exercises fall into one of three categories: live, virtual, and constructive (USDoD 1995). Live entities are real people and vehicles participating in a training exercise; virtual entities are human-controlled players in virtual worlds; and constructive entities are driven by algorithms in computer simulations. AR provides a natural way for all three types to mix together. Live entities observe virtual and constructive entities through the AR system. Interactions such as the user's movements and weapon usage are conveyed from the AR system back to the constructive and virtual simulation systems. Fire Support Team training is a prime venue for inserting virtual and constructive entities to combine with live fires.

Fire Support Team (FiST) Training

The USMC's Fire Support Team training begins with small-scale (1:40) pneumatic mortars on a 50 m x 75 m field at Quantico, simulating a 2 km x 3 km area of operation. The purpose of this training is to hone the communication skills between the forward observer and the Fire Direction Center (FDC). In the current training plan, a forward observer visually locates targets, identifies them and determines their grid coordinates using binoculars and a map, and recommends a call for fire to the FDC. Once the shots are fired, the training instructor (not a part of the operational fire support team) determines the accuracy of the shots and the effect on the target: catastrophic hit, mobility hit, or no effect. The calls for fire are adjusted until the team has the desired effect on the target. Before the AR system was introduced, the team fired upon static and unrealistic proxy targets made of discarded boxes, tubes, and toy tanks.
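Because the range is a 1:40 scale model, every position the system handles must map between real field meters and simulated battlespace meters; the virtual grid described later under Software Description draws simulated 1 km lines every 25 m of real field for the same reason. A small worked sketch of that conversion, using the dimensions stated above (the function names are ours, not from the system):

```python
SCALE = 40  # the Quantico field is a 1:40 scale model of the battlespace

def field_to_simulated(x_m, y_m):
    """Map a position on the 50 m x 75 m field (meters) to the simulated
    2 km x 3 km area of operation (meters)."""
    return x_m * SCALE, y_m * SCALE

# Simulated 1 km grid lines land every 25 m on the real field: 1000 / 40 = 25.
GRID_SPACING_FIELD_M = 1000 / SCALE

# A round landing 12.5 m downrange on the field represents 500 m in simulation,
# and the full field maps to the stated 2 km x 3 km area of operation:
print(field_to_simulated(12.5, 0.0))   # (500.0, 0.0)
print(field_to_simulated(50.0, 75.0))  # (2000.0, 3000.0)
print(GRID_SPACING_FIELD_M)            # 25.0
```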
RELATED WORK

Our application of AR to LVC training is not the first, and others who have developed AR training systems should be acknowledged. One early effort (Barrilleaux 1999), sponsored by US Army STRICOM in 1993, combined live tanks with manned simulators and computer-generated forces. The system was demonstrated at Fort Knox, KY. The tanks were equipped with limited AR displays to show the virtual and constructive forces in the world, and with instrumentation to send telemetric data back to allow representation in the virtual and constructive simulators.

More recently, US Army STRICOM created a program called Embedded Training for Dismounted Soldiers (ETDS) (Dumanoir et al. 2002). One of the focus areas of this program was to use wearable computers to provide AR- and VR-based training in the field, yielding the MARCETE system (Kirkley et al. 2002), which integrates an AR system with SCORM datasets, and VICTER (Barham et al. 2002), which was built to fit within the limitations of the current Land Warrior system (Natick Soldier Center 2001), replacing pieces of that system as necessary.

Our own previous work, partially funded through the ETDS program, includes a system for Military Operations in Urban Terrain (MOUT) training, allowing a dismounted trainee to navigate a building and see and engage virtual and constructive enemy forces in the real world (Brown 2004). That goal was very ambitious and yielded a proof-of-concept system that was several years from being fielded. The primary roadblocks are tracking accuracy for mobile applications and the field readiness of wearable computers powerful enough to drive AR. With the FiST training application, we considered a problem for which a fieldable system can be built with existing components, that would benefit from the real-time combination of live, virtual, and constructive forces, and that no one else has yet addressed.

APPLICATION OF AUGMENTED REALITY FOR FIRE SUPPORT TEAM TRAINING

One of the goals of this undertaking was that the AR system should support the current training paradigm. The purpose of the first stage is to hone communication skills, not to train for absolute accuracy in the call for fire. Therefore, the instructor has the final authority over the success or failure of any particular mortar firing. For example, the instructor may have a trainee repeat a fire, even if it was a direct hit, to reiterate the communications skills learned.

Integration of AR into the Training Plan

The AR system, based on the Battlefield Augmented Reality System (Livingston et al. 2002), consists of two stations networked together: a head-mounted display for the forward observer and a touch-screen display for the instructor. Each shows the same set of virtual targets superimposed on the real range. The observer station simulates a view through a pair of binoculars and can provide a magnified view (including a reticle) to determine target identity and grid coordinates. The instructor station uses a camera in a fixed location to provide an overall view of the range. The instructor can start and stop the movement of targets and determine the effect of a fire through a simple menu system and by directly selecting objects on the display.

The order of events is illustrated in Figure 2. First, the forward observer, wearing the AR HMD, observes and identifies a target and determines its grid coordinates. Figure 3 shows a typical view of the virtual targets overlaid on the real world. Next, the observer calls for fire, reporting the target to the Fire Direction Center (FDC); in the training, the instructor also plays the role of the FDC. The FDC sends an order to the mortar operator, who fires a real (pneumatic) round at the training area. The instructor looks at where the round landed in the real field and on the augmented display, as shown in Figure 4. The rounds are hard to see after landing on the field, so an assistant marks each round with a pole. The instructor makes a judgment call about the effect on the target and can designate that on the touch-screen display; Figure 5 shows a direct hit on the target. Finally, the observer immediately sees the designated effect on the target in the HMD, and can even zoom in on the virtual targets for a closer look, as shown in Figure 6 (the video background is replaced by a solid green background due to technical limitations with the hardware in use at the time; future versions of this system will scale the video as well).

Figure 2. Order of events using the AR system in training
Figure 3. An augmented view of the training area.
Figure 4. An assistant marks where the round landed.
Figure 5. The real round was determined to have destroyed the virtual target.
Figure 6. A zooming feature allows one to identify the target.

System Component Description

The observer wears a helmet-mounted HMD, as seen in Figure 7, to provide a tracked, augmented view of the training area. This HMD is connected to a laptop computer that drives the visuals. The HMD contains cameras just ahead of the user's eyes that collect video to be augmented. On the rear of the HMD (not visible in Figure 7) sits a rear-facing camera used for high-precision video-based tracking; this camera captures images of a set of graphical markers placed behind the user, from which the system calculates the position and orientation of the user's head. With this high-precision tracking, the user can look all around the training area and the virtual targets appear to remain fixed in the real world. The user interface consists of just three operations: controlling the zoom level, turning the reticle on and off, and turning a virtual grid on and off. The observer's portion of the system is simple to operate and allows the trainee to concentrate on the task and not the equipment.

Figure 7. The trainee wears a head-mounted display to see the virtual targets.
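The paper does not name the marker-tracking algorithm behind the rear-facing camera, so as a rough modern stand-in the sketch below shows how a single camera frame can yield a head pose from printed fiducials using OpenCV's ArUco module (4.7+ API). The intrinsics and marker size are placeholder assumptions, and a production tracker would fuse all visible markers rather than use only the first.

```python
import cv2
import numpy as np

# Placeholder intrinsics; a real system would use calibrated values.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
DIST = np.zeros(5)
MARKER_SIZE_M = 0.30  # assumed edge length of the printed markers

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def head_pose_from_frame(frame):
    """Estimate camera position and orientation relative to the first
    detected marker behind the user; returns (position, rotation) or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    half = MARKER_SIZE_M / 2.0
    # Marker corners in its own frame, matching ArUco's TL, TR, BR, BL order.
    obj = np.array([[-half, half, 0.0], [half, half, 0.0],
                    [half, -half, 0.0], [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(4, 2), K, DIST)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return (-R.T @ tvec).ravel(), R.T  # camera pose in the marker's frame
```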
The instructor uses a station with a large, bright touch screen attached to a laptop computer. The instructor can start and stop the virtual targets, designate effects on the targets, and reset the simulation through a few options on the touch-screen display. Again, the focus was on simplicity: when the instructor wants to designate an effect on a target, he selects the effect from a menu and touches the target directly. This station also has a fixed camera with a wide field of view that collects video from the training area and sends it to the computer to perform the augmented overlay. Figure 8 shows the instructor's station as used in the demonstration (as well as the laptop used to drive the observer's display and some extra equipment used for post-demonstration testing and evaluation). Figure 9 shows the instructor designating an effect on a virtual target using the augmented touch-screen display.

Figure 8. The instructor's station.
Figure 9. The instructor designates an effect on a virtual target.

For this demonstration, all equipment was loaded onto a handcart and powered by large batteries. We chose this path to keep the demonstration running all day and to accommodate the many attendees wanting to try the system; it is a lot easier to put on a helmet than an entire wearable backpack. The observer's training system can easily run on the wearable backpack shown earlier in Figure 1, while the instructor station can be "compacted" onto a single tablet PC with an attached camera.

Software Description

As mentioned previously, the software was based on the Battlefield Augmented Reality System developed in our lab. The BARS libraries provide the augmented reality core components, including tracker drivers, display calibration, and video overlay. Through a dynamic shared database in BARS (Brown et al. 2004), the virtual targets, controlled by the instructor's computer, are updated in real time on the observer's computer. It is through the same mechanism that the observer sees the effect on the target determined by the instructor.

For this particular application, we added a few new features to enhance the training experience:
• Virtual grid: The observer can turn a virtual grid on and off. This grid is drawn on the training ground plane and is spaced at simulated 1 km intervals (25 m actual).
• Terrain: The observer can turn virtual towns, roads, and other artifacts on and off.
• Zoom: As mentioned previously, the observer can zoom in on the virtual targets.
• Reticle: Also as mentioned previously, the observer can turn a virtual reticle on and off to more accurately determine the location of a target. If the reticle is turned on while zooming, it is automatically scaled to fit the screen, allowing the observer to accurately calculate angles.

One of the primary features of training in the real world using augmented reality is the ability to model the real-world training area and properly occlude virtual entities as they move through the environment. In the case of this particular training area, a flat field, there were no significant terrain features to model. To demonstrate occlusion capabilities, we placed a shipping crate in the field and added it to the AR occlusion model. Figure 10 shows a virtual tank (drawn at the same 1:40 scale explained previously) properly occluded by the real-world box. One can easily imagine that this box is a real building, and the concept can be extended to full-scale hilly or mountainous training areas by creating an occlusion model from DTED or similar data.

Figure 10. A real-world object (blue box) occludes a virtual tank and truck.
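In a depth-buffered renderer, this kind of occlusion is typically achieved by drawing the occlusion model into the depth buffer only, so that modeled real objects hide the virtual entities behind them. The paper does not describe BARS's exact approach, so the sketch below illustrates the same idea purely geometrically: a virtual target is hidden when the line of sight from the eye to the target passes through the modeled crate (all coordinates are invented for the example).

```python
def segment_hits_box(p0, p1, box_min, box_max):
    """Slab test: does the segment p0 -> p1 pass through the axis-aligned box?
    If it does, a modeled real occluder sits between the eye and the target."""
    tmin, tmax = 0.0, 1.0
    for a in range(3):
        d = p1[a] - p0[a]
        if abs(d) < 1e-12:
            if not (box_min[a] <= p0[a] <= box_max[a]):
                return False
        else:
            t1, t2 = (box_min[a] - p0[a]) / d, (box_max[a] - p0[a]) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return False
    return True

eye = (0.0, 1.7, 0.0)
crate_min, crate_max = (1.5, 0.0, 9.0), (3.5, 1.5, 11.0)  # modeled shipping crate
tank_behind = (5.0, 0.5, 20.0)    # line of sight crosses the crate: hidden
tank_beside = (15.0, 0.5, 20.0)   # line of sight clears the crate: drawn
print(segment_hits_box(eye, tank_behind, crate_min, crate_max))  # True
print(segment_hits_box(eye, tank_beside, crate_min, crate_max))  # False
```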
CONCLUSION

Augmented Reality was inserted into the training plan with no significant changes to the duties and actions of the participants, except that they can now fire on moving targets. The virtual targets were well received by the mortar trainees and instructors at Quantico. One USMC captain said:

"The Marine Corps will always rely on live training as the cornerstone for preparing ourselves, but simulation and this type of augmented reality will help make training more effective and more realistic to live combat. As we look to develop requirement documents for range instrumentation and improved MOUT facilities, AR will be a technology that we incorporate into the appropriate aspects of the training facilities."

However, rigorous studies and measurements of effectiveness are yet to be done. The system can also insert virtual terrain and control measures into the display, and both capabilities were preliminarily tested at Quantico. Future plans include refining the system, using multiple and/or pan-tilt-zoom cameras, implementing the system at a full-scale live fire range such as Twentynine Palms, and completing a Semi-Automated Forces (SAF) interface for more intelligent targets.

REFERENCES

Barham, P., B. Plamondon, P. Dumanoir, & P. Garrity (2002). "VICTER: An Embedded Virtual Simulation System for Land Warrior." Proceedings of the 23rd Army Science Conference, Orlando, FL, USA.

Barrilleaux, J. (1999). Experiences and Observations in Applying Augmented Reality to Live Training. Retrieved June 10, 2005 from http://www.jmbaai.com/vwsim99/vwsim99.html

Brown, D., Y. Baillot, S.J. Julier, P. Maassel, D. Armoza, & L.J. Rosenblum (2004). Building a Mobile Augmented Reality System for Embedded Training: Lessons Learned. Proceedings of the 2004 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC 2004), Orlando, FL, December 2004.

Brown, D., S.J. Julier, Y. Baillot, M.A. Livingston, & L.J. Rosenblum (2004). Event-Based Data Distribution for Mobile Augmented Reality and Virtual Environments. Presence: Teleoperators and Virtual Environments, 13(2).

Dumanoir, P., P. Garrity, V. Lowe, & B. Witmer (2002). "Embedded Training for Dismounted Soldiers (ETDS)." Proceedings of the 2002 Interservice/Industry Training, Simulation, and Education Conference, Orlando, FL, USA.

Julier, S.J., Y. Baillot, M. Lanzagorta, D. Brown, & L.J. Rosenblum (2000). BARS: Battlefield Augmented Reality System. Proceedings of the NATO Information Systems Technology Panel Symposium on New Information Processing Techniques for Military Systems, Istanbul, Turkey, October 2000.

Kirkley, S., J. Kirkley, S.C. Borland, T. Waite, P. Dumanoir, P. Garrity, & B. Witmer (2002). "Embedded Training with Mobile AR." Proceedings of the 23rd Army Science Conference, Orlando, FL, USA.

Livingston, M.A., L.J. Rosenblum, S.J. Julier, D. Brown, Y. Baillot, J.E. Swan, J.L. Gabbard, & D. Hix (2002). An Augmented Reality System for Military Operations in Urban Terrain. Proceedings of the 2002 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC 2002), Orlando, FL, December 2002.

Natick Soldier Center (2001). Operational Requirements Document for Land Warrior. Retrieved June 6, 2003, from http://www.natick.army.mil/soldier/WSIT/LW_ORD.PDF

US Department of Defense (1995). Directive 5000.59-P, Modeling and Simulation (M&S) Master Plan. Retrieved June 9, 2005 from http://www.dtic.mil/whs/directives/corres/pdf/500059p_1095/p500059p.pdf
