Autonomous warriors may dominate the battlefield of tomorrow, but even those that still require human flesh will take on a robotic sheen. That shift could start with the end of windows.
This, at least, is what Raytheon is proposing for its contribution to Darpa’s new Ground X Vehicle Technologies program, an effort to improve future tanks, fighting vehicles, and transports. Darpa hopes smart new tech will obviate the need for increasingly heavy armor by making vehicles harder to spot, catch, and kill.
Ditching windows is a natural move: you eliminate a key vulnerability in both structural strength and crew protection. Problem is, you have to figure out how the folks inside the vehicle will know what’s going on around them.
While a simple external camera feeding an internal LCD “window” could do the trick—like in one supersonic plane concept—Raytheon thinks it can deliver a whole lot more.
One of eight university and corporate R&D centers awarded contracts for this Darpa program, Raytheon wants to build a complete digital replica of the outside world for the soldiers inside. This will involve external cameras, including ultra-HD and 360-degree systems, but the key hardware will be a lidar laser scanner to create detailed, maneuverable models of surrounding buildings, terrain, vehicles, and people, with the video imagery used mostly to provide the model with the real-world visuals.
“Because we’re not locked into a specific view, as you are with a conventional window, crew members can see something of interest, lock the camera on it, and track it,” says David Diller, senior scientist and group lead with Raytheon’s Immersive Training Technologies group. “They can measure distances, change perspective—the camera position, that is—and analyze the environment with other sensors, such as infrared.”
The view can even be tailored to each crew member. The driver sees around traditional blind spots, while the vehicle commander might have a broader ability to analyze and manipulate the view, folding in additional external perspectives from drones or other vehicles. And of course, it would all be recorded for post-game analysis.
The challenge here mostly comes down to software: developing a computer system that can render these environments fast enough for a moving vehicle in a high-speed, hostile environment. “They need to be able to actually drive a vehicle off these digital models,” Diller says. “We’re pulling in 700,000 lidar points per second and generating 3.5 gigabytes of video data per second. We’re optimizing the data processing and the graphics processing to make it feasible.”
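To get a feel for why that firehose of points needs "optimizing," consider one standard trick for thinning a lidar stream before rendering: voxel downsampling, which collapses every point in a small cube of space into a single representative point. This is a generic, minimal sketch of the technique, not Raytheon's pipeline; the function name and 0.25-meter voxel size are illustrative assumptions.

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.25):
    """Thin a lidar point cloud by keeping one averaged point per voxel.

    points: iterable of (x, y, z) tuples; voxel_size is in meters (an
    assumed unit). Collapsing hundreds of thousands of raw points per
    second into a sparser set is one common way to keep a real-time
    3-D model of the surroundings renderable.
    """
    buckets = defaultdict(list)
    for x, y, z in points:
        # Assign each point to the cube of space it falls in.
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    # Represent each occupied voxel by the centroid of its points.
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in buckets.values()
    ]
```

With a coarse enough voxel size, millions of raw returns per second reduce to a model the graphics hardware can redraw every frame, which is the kind of trade-off Diller's team is tuning.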
If they can pull it off, the human experience would be much like a video game. “When we’re no longer constricted by a window, we can look at a scene in a whole new light,” says Brian Krisler, a Raytheon scientist. Many members of his R&D team already have graphics backgrounds. One visual they’re developing includes an inset mini-map with an overhead view and a cone showing the field of view displayed in the main image—something any halfway-serious gamer would recognize.
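The geometry behind that mini-map cone is simple enough to sketch: decide, for each object on the overhead map, whether it falls inside the camera's horizontal field of view. The helper below is a hypothetical illustration, not Raytheon code; the parameter names and the 2-D simplification are assumptions.

```python
import math

def in_view_cone(cam_xy, heading_deg, fov_deg, point_xy, max_range):
    """Return True if point_xy lies inside the camera's view cone as drawn
    on an overhead mini-map.

    cam_xy: camera position; heading_deg: camera bearing in degrees;
    fov_deg: horizontal field of view; max_range: cone length in the
    same units as the coordinates. All names are illustrative.
    """
    dx = point_xy[0] - cam_xy[0]
    dy = point_xy[1] - cam_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_range:
        return dist == 0  # the camera's own position trivially counts
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angle between the point's bearing and the heading.
    diff = (bearing - heading_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2
```

A renderer would run this test over the mapped objects each frame and highlight the ones inside the cone, exactly the cue a gamer reads at a glance.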
At this point, Raytheon isn’t developing the actual display apparatus for the system—it’s focusing on the root capability—but Krisler says it could be an LCD screen or be goggle-based. Nor is the team working on how the technology can be hardened and preserved if things go wrong, like if the camera or lidar gets shot off or covered in mud. Those more practical problems will wait until the core concept proves itself viable and the military picks up the tech. If that day ever comes, Raytheon could finally shut the door on the old-fashioned glass window.