DARPA Joins Those Digging Deep Into Software Supply Chain
Supply chain integrity has always been the “madwoman in the attic” of IT security programs – a problem so complicated, devilish and nasty that nobody wanted to deal with it. Best to stuff it away and try to forget, right?
But when the U.S. Congress starts holding hearings about supply chain security and calling the executives of global technology firms to testify, just pretending that it isn’t a problem you need to worry about won’t fly with your boss – or your board. And, of course, that’s exactly what happened this year, as executives from Chinese network equipment makers Huawei and ZTE appeared before the House Intelligence Committee to answer questions about the security of the routers and switches they sell to U.S. companies and the companies’ ties to the Chinese military.
The executives’ answers (or lack thereof) didn’t inspire confidence, with the Committee issuing a report that declared both companies evasive and, therefore, a risk to the security of their customers’ networks, data and intellectual property. But where does that report leave organizations that are trying to figure out how to verify the integrity of the software and hardware they purchase? Really no better off than before. The truth is that the technology supply chain is global – whether the U.S. House of Representatives likes it or not. Simply saying “don’t buy Huawei” or – even more bedeviling – “don’t buy Chinese” doesn’t really answer the question of where threats lie in the supply chain and how to identify them.
Microsoft said as much in a document it produced to inform policymakers (PDF) about how to understand and address supply chain risk. I quote: “Government should avoid using simplistic factors such as a product’s country of origin to address risk.” Doing so, the company warned, “could lead to weakening the principles of open trade and relinquishing the benefits of global innovation.”
Fortunately, smart folks in both the public and private sector are starting to look at supply chain security with an eye both to the complexity of the issue and to practical strategies for assessing the security of hardware and software acquired from resellers, business partners and third parties. Microsoft, as we mentioned, has taken a lead among private sector firms. In the government space, NIST, the National Institute of Standards and Technology, has conducted workshops and issued a number of reports and guidance (PDF) for organizations that wish to start assessing the security of their supply chain. NIST’s counterpart in Europe, ENISA, has done the same (PDF).
However, there’s still a noticeable gap between recommendations and reality. For example, NIST advises organizations to “Create and Maintain the Provenance of Elements, Processes, Tools and Data.” Technology acquirers, integrators, and suppliers should “maintain (the) provenance of elements under their control to understand where the elements have been and who might have had an opportunity to change them.” And, by “provenance,” NIST means determining the “source of goods such as computer hardware to assess if they are genuine or counterfeit.”
– Oh! Well, that’s easy enough, right? Wrong. It all sounds strikingly similar to the principles set forth by SOUP (Software of Unknown Pedigree) and as any engineer will tell you, attesting to the provenance of a hardware or software component is an incredibly difficult task. Routers, switches and CPUs consist of hundreds or thousands of individual components, produced all over the world. The provenance of software and firmware might be easier to verify initially, but it’s harder to pin down in dynamic environments where software and firmware might be periodically updated, patched or altered.
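To see why even the “easier” software side gets slippery, consider the most basic provenance check available today: comparing a firmware image’s cryptographic hash against a vendor-published value. It can attest that a specific snapshot is unaltered – but the moment the image is patched or updated, the baseline is stale. A minimal sketch in Python (the sample image bytes and function names here are hypothetical, purely for illustration):

```python
import hashlib

def matches_vendor_digest(image: bytes, vendor_digest: str) -> bool:
    """Return True if the image's SHA-256 digest matches the
    digest the vendor published for this exact release."""
    return hashlib.sha256(image).hexdigest() == vendor_digest

# Hypothetical "known-good" firmware image and its published digest.
good_image = b"\x7fELF...router-firmware-v1.2"
published = hashlib.sha256(good_image).hexdigest()

print(matches_vendor_digest(good_image, published))            # True
# A single altered byte (a tampered or merely updated image) fails:
print(matches_vendor_digest(good_image + b"\x90", published))  # False
```

Note what this does and doesn’t prove: a match says the bits are the bits the vendor shipped, not that those bits are free of backdoors – which is exactly the harder question VET is chasing.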
That’s probably why DARPA, the Department of Defense’s advanced research arm, is throwing money and resources at the issue. On November 30, DARPA announced a new project dubbed VET, for Vetting Commodity IT Software and Firmware. The new program is designed to “address the threat of malicious code” and “verify the security and functionality of commodity IT devices” to make sure they’re “free of hidden backdoors and malicious functionality.” This is the kind of stuff that the House Committee was suspicious that Huawei and ZTE might be guilty of – but had no way to prove.
“DoD relies on millions of devices to bring network access and functionality to its users,” said DARPA program manager Tim Fraser in a statement. However, the agency lacks the know-how to rigorously vet the software and firmware in each of those devices. “The perception that this problem is simply unapproachable is widespread,” he said. Accordingly, VET’s main objective is to develop “techniques, tools and demonstrations that will forever change this perception.”
VET is focused on three challenges:
- Defining malice: Given a sample device, how can DoD analysts produce a checklist of software and firmware components to examine for hidden malicious functionality?
- Confirming the absence of malice (aka: proving the negative): Even if you have a checklist of components that you can prove exist, how can you prove that malicious features and functions don’t exist?
- Scale: Given that you can do the first two on a discrete device in the lab, how can the procedure scale agency-wide, encompassing the work of non-specialist technicians who must vet every individual new device used by DoD prior to deployment?
As with many DARPA projects, VET is open ended – companies interested in proposing solutions can submit their ideas and, on December 12th, DARPA’s hosting a Proposer’s Day in Arlington to brief those interested in participating on the program. And – as with many DARPA programs – this is proof of concept stuff. Whatever comes out of VET – however groundbreaking – will be no closer to your average corporate network than the Agency’s Mach 20 jet is to your average civilian airport. Still – the longest of journeys starts with a single step, as the saying goes. And DARPA’s work will be critical when it comes time to connect the dots between official guidance, best practice and your IT department. This is an area to watch in 2013.