HANSCOM AIR FORCE BASE, Mass. – System engineers at Hanscom are trying to increase network capacity to the point where no Airman will ever curse the connection or blame the blue circle for work stoppage.
Most Air Force installations, each hosting thousands of users, connect to the outside world through a single choke point called a boundary, which functions much like a single-lane vehicle entry control point on a military base. Traffic piles up, and data is lost during high-traffic periods. As a result, the Command, Control, Communications, Intelligence and Networks directorate took on requirements from Air Combat Command that allowed its personnel to put two bases under a microscope and find out why.
What C3I&N network experts found at Scott Air Force Base, Illinois, and Wright-Patterson AFB, Ohio, convinced them that these choke points needed to add lanes, and that network managers needed better visibility into the data traffic ceaselessly streaming between bases. Their government-led effort installed massively more powerful boundary connections linking base networks, and also gave them insight into how data flows over the fence to the internet.
“We put a tap on those systems,” said Jim Pinder, lead engineer for the Air Force Intranet Control weapons system, managed by C3I&N. “Once we were able to see traffic, and capture data, we could analyze it and identify the root problem. It turns out, there is no single root problem, but by creating a more robust and resilient network, we can move forward and fix interfaces up and down the network.”
C3I&N, tasked with sustaining the existing network, is also working to acquire networks as-a-service, from contractors who specialize in setting up secure environments for large companies, banks and hospital systems. While that effort is underway, Pinder and his colleagues are trying to improve the baseline performance of every single Air Force base.
They succeeded in replacing legacy, single-lane connections measured in hundreds of megabits per second with multiple gigabit-per-second connections resembling a highway. In addition to providing a tenfold increase in traffic capacity, the new connections are redundant, achieving “dual path resiliency” for those bases.
Should one connection go down, another can step up to carry the traffic. As a government-led effort, C3I&N doesn’t have the manpower to install these freeway connections at every base, but it has acquired the knowledge needed to solicit bids from major network providers.
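The failover behavior described above can be sketched in a few lines. This is an illustrative model only, not the Air Force’s implementation; the uplink names, loss figures, and health threshold are all hypothetical.

```python
"""Minimal sketch of dual-path resiliency: traffic prefers the primary
uplink, and the secondary steps up when the primary degrades.
All names and numbers here are illustrative assumptions."""

def pick_uplink(uplinks, is_healthy):
    """Return the first healthy uplink in priority order, or None if all are down."""
    for link in uplinks:
        if is_healthy(link):
            return link
    return None

# Hypothetical recent packet-loss rate per uplink (fraction of packets lost).
status = {"primary": 0.001, "secondary": 0.002}

def low_loss(link, threshold=0.05):
    """A link counts as healthy if its loss rate is under the threshold."""
    return status[link] < threshold

print(pick_uplink(["primary", "secondary"], low_loss))  # primary carries traffic

status["primary"] = 0.30  # primary degrades badly...
print(pick_uplink(["primary", "secondary"], low_loss))  # ...so secondary steps up
```

Real boundary equipment makes this decision in hardware or routing protocols rather than application code, but the priority-with-fallback logic is the same.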
“We didn’t really have data on network performance until now,” said Pinder. “What we’re seeing is that, yes, the network capacity is a problem. But there are also many other connections between you, the user, and the application or piece of information you are seeking. If we can trace that path, and identify where the connection slows and data is lost or corrupted, we can methodically handle each of those problems. But we need to see it first.”
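Pinder’s point about tracing the path and finding where it slows can be illustrated with traceroute-style data: given round-trip times at each hop, flag the hop where latency jumps most. The hop names and timings below are hypothetical, not measurements from any Air Force network.

```python
"""Sketch of per-hop analysis of the kind boundary taps enable:
find the hop contributing the largest latency increase.
Hop names and round-trip times (ms) are hypothetical."""

def worst_hop(path):
    """Return (hop_name, added_latency_ms) for the largest jump over the previous hop."""
    jumps = [(path[i][0], path[i][1] - path[i - 1][1]) for i in range(1, len(path))]
    return max(jumps, key=lambda hop: hop[1])

path = [
    ("base-router", 1.2),
    ("boundary", 3.5),
    ("isp-edge", 48.0),
    ("app-server", 50.1),
]

print(worst_hop(path))  # ('isp-edge', 44.5) — most of the delay appears at the ISP edge
```

In practice this analysis would run over captured traffic rather than a hand-written list, but it shows how visibility turns “the network is slow” into a specific interface to fix.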
Network experts can already predict the problems another C3I&N initiative, exporting data to the cloud, will cause if network managers do not improve the boundary. If the lion’s share of a base’s data lives in outside cloud storage and the boundary connection is too slow, users will struggle to reliably access data they are used to having at their fingertips.
One of the C3I&N PEO’s missions is predicting and solving problems like these, ensuring the architecture of the Air Force network can support today’s improvements and tomorrow’s technology.