WASHINGTON — Northrop Grumman is developing software it says can simplify the high-stakes process of discovering, classifying and monitoring missile launches across the globe by leaning on pattern-recognition capabilities.

The defense company is in the process of refining what it calls False Track Reduction Using Machine Learning for the U.S. Space Force, with delivery expected in early 2025. It is anticipated for use in the Space-Based Infrared System program, or SBIRS, and has potential application in other overhead persistent infrared missions.

Space Force personnel track thousands of potential missile incidents each month and must contend with false alarms. Increasingly sensitive sensor technologies, proliferating satellites, ever-evolving weapons and military flare-ups overseas can aggravate the already-complicated process.

Northrop’s offering is designed to ease the information avalanche analysts face by filtering out what may not be an actual launch or outbound projectile while, at the same time, ensuring no “real event or real missile” is improperly dismissed, according to John Stengel, the director of the company’s mission exploitation enterprise.

“As sensors get better — as sensors in space improve — they get more sensitive. As sensors get more sensitive, the more false tracks we get,” Stengel said in an interview with C4ISRNET. “Having the ability to leverage machine learning to help the human in the loop, so to speak, do his or her job is going to become absolutely critical.”

False Track Reduction Using Machine Learning is trained on real-world data and can be amended as foreign militaries advance their respective arsenals. The system uses what Stengel called profiles, or proven characteristics such as speed, shape and altitude, to detect and earmark objects for further inspection by users.

“What the system is going to do is say: ‘Hey, this doesn’t seem like a real missile, but I’m going to present it to the operator, the human in the loop, to make sure and make that decision,’” Stengel said.

“As different countries in the world modify or adjust or come up with new weapon systems, we then have to take those and add them to the training scenarios, so that the system knows about it, has the latest and greatest,” he added. “I’ve never heard of replacing the human in these scenarios. This is all about assisting.”
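The profile-based triage Stengel describes can be sketched in simplified form. Everything below, including the `Track` fields, the threshold values and the function names, is a hypothetical illustration of attribute-range matching with a human in the loop, not Northrop Grumman's actual system or data.

```python
# Illustrative sketch only: the Track fields, profile thresholds and function
# names are hypothetical, not Northrop Grumman's implementation.
from dataclasses import dataclass

@dataclass
class Track:
    speed_kms: float    # observed speed, km/s
    altitude_km: float  # observed altitude, km
    intensity: float    # normalized infrared brightness, 0..1

# A hypothetical "profile": ranges of characteristics (speed, altitude,
# signature brightness) consistent with a real missile track.
MISSILE_PROFILE = {
    "speed_kms": (1.0, 8.0),
    "altitude_km": (0.0, 1200.0),
    "intensity": (0.3, 1.0),
}

def matches_profile(track: Track, profile: dict) -> bool:
    """True if every track attribute falls within the profile's range."""
    return all(lo <= getattr(track, name) <= hi
               for name, (lo, hi) in profile.items())

def triage(tracks: list[Track]) -> tuple[list[Track], list[Track]]:
    """Split tracks into likely-real and suspected-false.

    Suspected-false tracks are still presented to the operator for a final
    decision: the human stays in the loop, and nothing is silently dropped.
    """
    likely_real, suspected_false = [], []
    for t in tracks:
        (likely_real if matches_profile(t, MISSILE_PROFILE)
         else suspected_false).append(t)
    return likely_real, suspected_false

real, review = triage([
    Track(speed_kms=5.2, altitude_km=300.0, intensity=0.8),  # plausible launch
    Track(speed_kms=0.1, altitude_km=2.0, intensity=0.05),   # sensor noise?
])
```

Updating the system for a new foreign weapon, in this sketch, would amount to adding or adjusting a profile, which mirrors Stengel's point about folding new weapon systems into the training scenarios.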

The Department of Defense has for years considered artificial intelligence and machine learning critical to the speedy sorting of battlefield information. Its implementation is gaining speed and spread; the department is juggling more than 685 AI-related projects, including several tied to major weapons systems, according to the Government Accountability Office.

C4ISRNET reporter Courtney Albon contributed to this article.

Colin Demarest was a reporter at C4ISRNET, where he covered military networks, cyber and IT. Colin had previously covered the Department of Energy and its National Nuclear Security Administration — namely Cold War cleanup and nuclear weapons development — for a daily newspaper in South Carolina. Colin is also an award-winning photographer.
