Military Goes First

Charlie Fink
13 min read · Mar 14, 2019

An excerpt from Convergence: How the World Will Be Painted With Data.

By Sam Steinberger

The military’s first practical forays into AR-like technology date back to just before World War II. In less than a century, AR technology has moved from infrared systems to “X-ray” vision.

Night Vision

One of the first forms of AR to be used by the military was night vision. Modern night-vision systems, which allow soldiers to see in light levels approaching complete darkness, really had their start in 1929 when Hungarian physicist Kálmán Tihanyi invented the first infrared-sensitive electronic camera. This classified technology was used by the British to detect enemy aircraft.

Soon after, scientists working in Germany began to explore the use of infrared technology. By the end of WWII, they had developed an infrared weapons system that used a beam of infrared light to illuminate objects. This beam effectively “painted with light” to create visibility through special optics systems sensitive to the infrared spectrum.

These “active infrared” systems were developed for tank crews, as well as for individual soldiers, so they could fight in darkness. The active approach had a significant disadvantage, however: just as shining a visible-light floodlight would give away one's position, anyone with an infrared viewing system could spot, and potentially disable, the infrared light source.

Meanwhile, in the 1940s and 1950s, the U.S. Army ran secret programs of its own to develop active infrared systems for use by individual soldiers.

By the 1960s, the U.S. Army had developed the “starlight scope,” an evolution of the earlier active systems that saw use in the Vietnam War. The scope was a passive system: rather than emitting infrared light, it amplified the available light, such as moonlight or starlight, to make a scene more visible. Although passive systems were lighter and less bulky than active ones, the rifle-mounted scopes that used passive technology were underpowered compared to today's devices. The fundamentals behind these passive systems, however, inform today's night-vision technology.

The next major development in augmented night vision came in the form of thermal imaging, which captures the infrared (thermal) energy that objects emit. In 1978, microbolometers, sensors that measure radiant energy, were invented, giving the U.S. military a drastically improved version of thermal imaging. Although the foundations of thermal imaging predate Tihanyi's work, microbolometer technology made it portable and practical for individual use.

Today's night-vision technology can amplify light by 50,000 times or more, and adaptive technology now protects soldiers' eyes against problems like temporary blindness from sudden light exposure. Thermal imaging is used in applications ranging from satellites to rifle scopes. Smartphones can even become thermal AR devices: Caterpillar's CAT S61, for example, has a FLIR Lepton thermal camera embedded in the phone. Given the military's focus on interoperability, individual AR systems used by soldiers would have to incorporate, or at the very least be compatible with, modern thermal and night-vision technology.

HUDs

Modern heads-up displays (HUDs) can be traced to WWII. Pilots struggling to find their targets over enemy territory had to rely on verbal instructions from their crews. Eager to convey the same information by mechanical means, the military developed prototypes to provide pilots with flight information. These early displays, however, proved static and bright, a particular challenge for pilots flying at night.

The solution was to project information onto an angled piece of glass or onto the cockpit window itself: the HUD could give the pilot radar or targeting details without requiring the pilot to look away from the view up and out of the aircraft.

The first HUD in operational service was built by Cintel, whose HUD business was eventually absorbed into what is today known as BAE Systems. It was used on the Blackburn Buccaneer, a maritime strike aircraft that entered service trials with the British Royal Navy in 1961. Designed for high-speed, low-level operation over land and sea to carry out split-second attacks, the Buccaneer needed to present aircraft attitude and weapon-aiming information in a display that wouldn't divert the pilot's gaze.

The modern HUD was born. Later in the 1960s and into the 1970s, it was incorporated into American military aircraft. Iterations of the technology would see further use in commercial aircraft, spacecraft, cars, and even a $400,000 helmet, designed to let Lockheed Martin F-35 Lightning II pilots see through their own aircraft with “X-ray” vision.

HoloLens

For the past three decades, the U.S. military has been on a mission to develop a personalized augmentation system to assist its soldiers on the battlefield. Back in 1989, the Army unveiled its Soldier Integrated Protective Ensemble (SIPE), a technology demonstration that brought wearable sensors and enhanced communications systems to individual soldiers.

Although it proved the technology was feasible, the program wasn't particularly soldier-friendly. Grunts had to haul around a gargantuan thermal sensor and a clunky head-mounted display (HMD), not to mention a backpack battery to power it all. Nevertheless, the Army decided the concept was sound, and SIPE gave way to the Land Warrior program. Land Warrior's combination of small arms with high-tech equipment would lead the military into the 21st century.

The program mounted electronic systems such as cameras, thermal sights, and laser rangefinders onto small arms like the M4. The helmet system provided mounts for optics that let a soldier view information from equipment such as the thermal sight. Communications devices and other electronics were integrated into soldiers' backpacks, along with what was essentially a mousepad worn on the chest. By the time the program was canceled in 2007, the system's weight had dropped from 86 pounds in 1999 to 40 pounds, on top of the roughly 80 pounds of full combat gear a soldier already carried. Each system cost more than $85,000.

Despite the advances in wearable sensors and miniaturization over that decade, Land Warrior at its best was still not the Augmented Reality of science fiction, nor the Augmented Reality the Army felt was adequate. Soldiers' systems incorporated AR elements, like thermal- and night-vision technology, but the military wanted a system that could identify friend or foe on the battlefield, transition seamlessly between electro-optical sensors, and map the soldier to the environment he or she was in. It wanted an Urban Warfare Augmented Reality (UWAR) system, especially as top Army staff increasingly believe that future wars will be fought in dense, complex urban environments.

The Battlefield Augmented Reality System (BARS), initially funded by the U.S. Office of Naval Research, was driven by the goal of creating an infantry analogue to a pilot's “Super Cockpit,” according to the U.S. Naval Research Laboratory. Features of BARS included a database that could be updated by all users and commercially available hardware components. Significant research went into understanding how AR systems handled depth perception, visual occlusion and the visibility of AR displays, information filtering, object selection, collaboration across a shared database, and the requirements of embedded training.

In 2014, the U.S. Army announced the introduction of the ARC4, by Applied Research Associates, Inc. The system attached to a soldier's helmet and allowed users to map themselves to a battlefield environment; commanders could push maps and navigation information directly into a soldier's field of vision. Today, the U.S. Army Communications-Electronics Research, Development, and Engineering Center (CERDEC) is building on the advances made by BARS to develop the Tactical Augmented Reality (TAR) system, an even more futuristic system that will let soldiers map themselves to an environment, quickly identify friendlies and targets, and access real-time battlespace information.

While TAR is still in development, other countries have expressed interest in AR on the battlefield. In 2016, the Israeli army reportedly purchased HoloLens devices, exploring their potential to improve battlefield strategy and training. That same year, there were also reports of a Ukrainian company, Limpid Armor, working with its country's military to implement the Circular Review System for tanks: a series of cameras attached to a tank would give a commander wearing the HoloLens a 360-degree view around the vehicle. The concept is already in use in the helmets of pilots flying the Lockheed Martin F-35 Lightning II.

Following the launch of Magic Leap's AR system, the U.S. Army opened a competition for companies to supply the military with AR systems. And it wasn't just a device here and an app there: Magic Leap and Microsoft were in the running for a program that could see the military purchase 100,000 AR devices. By the end of 2018, the U.S. Army announced that Microsoft would be awarded a $480 million contract and an opportunity to put the company's new HoloLens to the test.

Both parties will likely walk away with lessons learned. Microsoft will need to make its AR system ready for battle, which will include hardware and software upgrades. Some pundits suggest the HoloLens will be broken into its components and rethought for this rugged application. Soldiers reliant on the technology can’t be slowed down to restart their AR systems or make repairs if the system is exposed to the elements. If all goes well, the contract stipulates that Microsoft will provide the U.S. Army with as many as 100,000 AR devices that will “increase lethality by enhancing the ability to detect, decide and engage before the enemy,” according to the government’s program description.

The U.S. military will face not only computing and hardware challenges, but cybersecurity problems as well. The military’s communications systems will need to operate despite jamming attacks or worse. Enemies that could hack a networked system replicating soldiers’ eyes and ears could decimate an attacking or defending force. As it has for the last few decades, the military will continue to go first in Augmented Reality, pushing the boundaries of what’s possible and opening up new ways of using today’s technology.

Challenges to Overcome

There are plenty of boundaries to push; chief among them are stability and accuracy. “[C]urrent development is far behind the need of urban warfare,” noted researchers in a recently published study comparing the capabilities of AR with the demands of UWAR. “The correctness of combat action and the speed of execution, will not only impact the success of a military confrontation but also result in a significant difference in combatants’ survival,” continued the peer-reviewed article, “Survey on Urban Warfare Augmented Reality.”

The military's version of the HoloLens, or whatever AR system it fields, will be put to the test. Rapid head movements, poor operating conditions, and the vibrations inherent in urban combat make registration and stability very challenging. The gyroscopes, GPS antennas, magnetometers, and inclinometers, delicate instruments in some cases, will have to withstand extreme temperatures, moisture, and impacts. Urban environments add further variables, including the distribution of targets and signal-blocking buildings, that will have to be overcome through powerful computing and robust networking. The system cannot fail in registering all users' locations, and it must keep that information continuously and simultaneously up to date. And, of course, the system will have to be portable and carry enough battery life to keep functioning for as long as possible.

The HoloLens has good computing horsepower and does a good job of mapping itself to its environment in a non-combat situation, said Dr. Ronald Punako, Jr., who has studied AR and VR technology. The software tracks movement to keep itself registered to its surroundings, although he noted a combat situation could present problems for a commercial-grade AR device like an out-of-the-box HoloLens.

The advantages of a well-built, functioning AR system are just as real as the obstacles, however. AR systems outperform more traditional navigation systems, according to researchers. The intuitive nature of AR navigation reduced mistakes made by combat-stressed soldiers, overlaid information reinforced fighters' situational awareness, and commanders improved their operational task planning with the assistance of AR.

Keep Up The Good Work

Then there’s training. VR is further along in its implementation for training, but a robust AR system could also prepare soldiers and address poor combat practices. And if an AR system is developed for combat, it makes sense to use it for training as well.

For now, VR is better suited for training because the hardware has already been established, said Tyler Gates, managing principal at Brightline Interactive, a team of creative technologists whose clients include future-looking government agencies like DARPA.

VR has had a role in training U.S. security forces for over five years. An untethered, free-roaming training system developed by Raytheon Company and Motion Reality Inc. was demonstrated at joint exercises in 2012. The VIRTSIM training system supported three squads of soldiers armed with functional replica weapons. VIRTSIM is now used by soldiers in Malaysia and the United Arab Emirates, in addition to the U.S.

The Automated Serious Game Scenario Generator for Mixed Reality Training (AUGGMED) program in Europe is another example. In early 2018, VR security training in the multi-country program, led by business group BMT, incorporated physical objects and locations into its exercises. AUGGMED was used by port security forces in Greece, who were training for potential terrorism-related threats. The system created a hybrid experience by integrating on-site trainees with other trainees working remotely via VR.

AUGGMED offers three degrees of training: a limited VR experience with no mobility or tactile feedback, an immersive VR experience with limited mobility and tactile feedback, and an on-site, fully immersive and mobile experience with locally networked colleagues, allowing for a range of training scenarios.

The system transforms a VR headset into a type of combined AR/VR device. HTC Vive headsets go one step beyond VR with a pass-through, outward-facing USB 3 camera attached to the front of the headset, providing a forward-looking perspective. The workaround is needed because devices like the HoloLens have a narrow field of view, and the image quality isn't good enough for trainees, according to Dr. Robert Guest, part of the AUGGMED team. He has been a leading simulation developer at the University of Birmingham for the past decade.
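To make the workaround concrete, here is a minimal sketch of a camera pass-through loop. It is not AUGGMED's software; it assumes only a generic USB camera at index 0 and the OpenCV library, and it simply grabs frames from an outward-facing camera and shows them on screen, the kind of feed a mixed-reality compositor would undistort and draw inside the headset before overlaying virtual content.

```python
# Minimal pass-through sketch (illustrative only; not AUGGMED's code).
# Assumes a generic USB camera at index 0 and OpenCV (pip install opencv-python).
import cv2

def run_passthrough(camera_index: int = 0) -> None:
    """Grab frames from an outward-facing camera and display them,
    standing in for the feed a VR compositor would render inside the headset."""
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("Could not open the pass-through camera")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # camera unplugged or stream ended
            # A real system would correct lens distortion and latency here,
            # then composite virtual objects before sending the frame to the HMD.
            cv2.imshow("pass-through preview", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    run_passthrough()
```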

AR, Meet Fashion

One of the more interesting tangential integrations with AR may be fabric. Nanotechnology allows fabrics to be impregnated in such a way that the material reflects certain electromagnetic signatures. This radar- and microwave-absorbing fabric could be integrated with an AR system, allowing soldiers to identify others just by the radar signature of the uniform they’re wearing, helping soldiers avoid friendly-fire incidents and more effectively target enemy forces.

The system could also serve emergency personnel: medics might identify and triage injuries using a combination of functional fabric, smart sensors, and an AR system, points out Eduardo Siman, a former technology consultant and early investor in Virtualitics. First responders at the scene of an earthquake, for example, might be able to better pinpoint the location of victims. Meanwhile, a project at MIT conceived of using smart belts to passively measure radiation levels, a difference-maker for nuclear technicians.

Sea and Air

Land-based soldiers and commanders aren't the only military personnel expected to benefit from AR. The U.S. Navy has already tested the Unified Gunnery System Augmented Reality, or GUNNAR, with plans to develop an AR helmet to facilitate better communications between crews aboard America's fleet. The software helps by issuing “fire” and “cease fire” commands, and it provides virtual training. The training alone brings major cost savings and a readiness boost, because live rounds aren't fired yet shipmates are still able to prepare for battle. The software runs on a helmet made by AR hardware developer Daqri.

A future combat system dreamed up by BAE Systems and the Royal Navy aims to give a lightweight HoloLens system to bridge watch officers on warships. Bridge watch officers are responsible for monitoring a number of instruments and readouts to safely guide the passage of the ship. The new system would put that information onto a digital surface visible to the officer, and artificial intelligence would assist in interpreting the various readouts, alerting personnel to potential hazards and creating a more efficient workflow for sailors on the ship.

The BAE Systems technology is advancing rapidly. The rollout, part of a $27 million advanced combat systems upgrade, is slated for testing in 2019 and could be fully integrated into the Royal Navy’s fleet by the end of the year.

The system for the Royal Navy is akin to a lightweight version of one of BAE Systems' most advanced AR helmet-mounted HUD systems: the Striker II. Intended for pilots, the full-color system has integrated night vision and picture-in-picture display capabilities, according to its developer. Like some VR headsets, the Striker II also has spatial audio technology, or “3D audio.” Spatial audio lets the pilot pinpoint threats by ear, delivering warning signals to precise positions in the pilot's earpiece. Despite all the computing required to deliver that sensory input, the system has “near zero” latency, according to BAE Systems.
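As an illustration of the principle behind such directional cues, the sketch below is not BAE Systems' implementation; it assumes nothing more than NumPy and uses simple constant-power stereo panning, weighting a warning tone toward the ear nearest the threat's bearing. Production systems use full head-related transfer function (HRTF) filtering, but the underlying idea of steering a cue toward a direction is the same.

```python
# Illustrative spatial-audio cue via constant-power panning (not BAE Systems' method).
import numpy as np

def threat_tone(azimuth_deg: float, duration_s: float = 0.5,
                freq_hz: float = 880.0, sample_rate: int = 44100) -> np.ndarray:
    """Return a stereo (N, 2) warning tone panned toward azimuth_deg.
    azimuth_deg: -90 = hard left, 0 = straight ahead, +90 = hard right."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    tone = 0.3 * np.sin(2.0 * np.pi * freq_hz * t)
    # Map the bearing to a pan angle in [0, pi/2]; cos/sin gains keep total power constant.
    pan = (np.clip(azimuth_deg, -90.0, 90.0) + 90.0) / 180.0 * (np.pi / 2.0)
    left_gain, right_gain = np.cos(pan), np.sin(pan)
    return np.column_stack((tone * left_gain, tone * right_gain))

# Example: a threat 60 degrees to the right sounds mostly in the right ear.
stereo = threat_tone(azimuth_deg=60.0)
print(stereo.shape)  # (22050, 2)
```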

The Royal Australian Air Force expects AR to be an important pillar of its military technology platform. As early as 2016, Saab Australia and the country's military were looking at ways the HoloLens could be used for strategy, threat management, and training. The country's army has also looked into the ARC4 system.

Then there's the Gen III Helmet Mounted Display System, an AR helmet-mounted HUD developed for the Lockheed Martin F-35 Lightning II. Built jointly by Rockwell Collins, Elbit Systems, and Lockheed Martin, the helmets, at $400,000 a pop, are custom-fitted to each pilot's head and include real-time visuals from cameras mounted on the aircraft, as well as night vision and thermal imagery. A magnetic field generated by a transmitter in the pilot's seat lets the system track the pilot's head movement.

The Military Leads

The technological innovation improving AR systems in the military shows no signs of slowing down. Complex, functional AR systems might quickly find their way into land-based combat. They’re already seeing limited use in the form of night-vision and thermal-imaging technology on the ground. In the sky, this technology is even more important for pilots. On the sea, they’re slated to be in use on the bridges of the most advanced navies in the world. Fabrics and other peripheral materials, embedded with sensors or nanotechnology, could be integrated with an AR system, creating a world that’s more deeply permeated with data and new insights. For now, the military leads the way.
