The cyber supply chain attack that infiltrated a software company used by top federal and corporate institutions is just a preview of larger risks that lie ahead. A certain level of system vulnerability is unavoidable in a world where the government must simply trust that private-sector certifications meet the security standards needed to protect some of our most valued information.
Experts say the problem, at its heart, is a matter of time and trust.
“How do you know what’s going on in your computer?” asks Herbert Lin, a leading cybersecurity expert at Stanford University. “The answer is you don’t, you just trust that the thing works.”
To some extent, that’s what the federal government does.
Lin, who has held various cyber-related academic positions and served on former President Barack Obama’s Commission on Enhancing National Cybersecurity, now holds a dual role at Stanford University as a senior research scholar at the Center for International Security and Cooperation and a fellow in cyber policy and security at the Hoover Institution.
He has decades of experience in defending against unwanted online intrusions. In light of the possibly unprecedented hack that hit SolarWinds, however, Lin is particularly concerned about vulnerabilities as they relate to software supply chains, because of the unique risk they pose to nearly every level of society — including the U.S. government.
Lin told Newsweek that it boils down to a simple fact of modern life: “Nobody builds a computer from scratch anymore.”
Software passes through countless hands on its way from libraries of raw source code to the finished product operating on some of the most important desks across the globe. A quiet infection that goes undetected anywhere along that path can mean a widespread compromise of catastrophic proportions.
That is exactly what happened in the case of SolarWinds.
While officials and experts are scrambling to investigate the degree to which malware may have allowed illicit access to public and private sector secrets, this much is known: for months, potentially since March, up to tens of thousands of networks were infiltrated in a kind of cyber operation that much of the public does not even know exists, let alone how dangerous it is.
Josephine Wolff, a professor of cybersecurity policy at The Fletcher School at Tufts University, explained why even well-equipped institutions fail to guard against supply chain attacks.
“Supply chain attacks are dangerous because they exploit organizations’ trust in their vendors,” Wolff told Newsweek, “and parts of their organization that they do not have direct exposure to monitoring and protecting because they are being carried out by third parties who are part of the supply chain, rather than in-house.”
That makes supply chain attacks an exceedingly challenging threat, especially to a government that relies on such commercial software for security.
“[They] can be very difficult to detect, and require tremendous scrutiny and diligence on the part of the supply chain entities, otherwise the targets may not be aware of them until much later,” Wolff said. “In this case, the breach is particularly damaging because a single compromise granted such broad and high-level access to so many high-value public and private sector organizations through their shared supply chain vendor, SolarWinds.”
But SolarWinds is just the beginning.
The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) revealed Thursday that it “has evidence of additional initial access vectors, other than the SolarWinds Orion platform,” though these remain under investigation.
Those familiar with how the hack appears to have played out described SolarWinds, specifically an update to the Orion IT product, as an initial access point, a door, for the adversaries to gain entry into the networks of clients that place trust in their certified tools. While the exact point of access has not yet been determined, news that the update server’s password was “solarwinds123” did not inspire confidence among experts and laymen alike.
Access to this server would allow a particularly tech-savvy actor to quietly upload a spoofed update that walks and talks just like the real thing but carries hidden malware. Such updates are vetted for functionality; they are not screened for malicious code riding inside.
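A baseline defense against a spoofed update is to verify the downloaded file against a digest the vendor published through an independent channel before installing it. The sketch below is illustrative: the function names and the out-of-band digest are assumptions, not SolarWinds' actual process.

```python
import hashlib
import hmac

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_update(update_path: str, published_digest: str) -> bool:
    """Install only if the file matches the digest the vendor published
    out-of-band; constant-time comparison avoids timing side channels."""
    return hmac.compare_digest(sha256_of(update_path), published_digest)
```

The catch, and the reason a build-time compromise is so effective: a digest fetched from the same compromised update server will verify the malicious file just as happily, so the check only helps when the attacker cannot also control what the vendor publishes.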
Once inside the target system, the possibilities for further malicious activity are nearly endless. Experts can only speculate as to what the intruders could do, ranging from broad espionage to remote code execution granting command-and-control capabilities over infected servers.
“Current security efforts evaluate by looking from the outside in,” a cybersecurity expert who requested anonymity because they were not authorized to speak with the media told Newsweek. “This changes the game because you have to go from the inside looking out, because nobody threat models what you already trust.”
But the answer isn’t to forgo patches that provide vital updates addressing existing vulnerabilities.
The extreme alternative would be something another cybersecurity expert with two decades of experience termed “zero trust,” in which every patch and every update is minutely reviewed, an approach that would prove challenging if not impossible in a modern-day setting.
“That is the mechanism to guard against this,” the expert said, “very difficult and costly and very hard to do, because it would require you to check everything.”
Another expert, a cybersecurity executive, said that conducting increased scrutiny of every patch would create a host of new problems, not the least of which is increased system vulnerability.
“There are a couple of solutions. First you can hire an army of network engineers — this doesn’t scale — or you can allow time to do a full review of each security patch in the supply chain,” the executive said. “Now the patch that is being introduced has to be viewed as a potential threat and reviewed, you have to accept the lag time, but both come with a premium.”
Such a reform is difficult, perhaps even impossible, in the contemporary competitive atmosphere in which U.S. public and private entities operate.
“I don’t have the time to vet patches in my workflow,” the executive said. “We don’t question a patch from a company that has met every security requirement along the way. Ninety-nine percent of federal agencies and corporate America don’t have the ability to even change that model.”
The malware that got through to SolarWinds was so sophisticated that it knew not to activate when it was in a sandbox, or testing, mode, and only awoke after 14 days of dormancy once in production mode, Newsweek learned from officials.
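The evasion behavior described by officials can be sketched schematically. This is an illustration of the logic, not actual SUNBURST code; the hostname check is a stand-in for the real malware's far more elaborate environment fingerprinting, and the 14-day figure is the dormancy window reported above.

```python
import datetime

DORMANCY_DAYS = 14  # reported dormancy window; illustrative only

def should_activate(install_time: datetime.datetime,
                    hostname: str,
                    now: datetime.datetime) -> bool:
    """Schematic gate: stay silent in analysis environments and for a
    dormancy window, so short-lived sandbox runs never see the payload."""
    # Stand-in for real environment fingerprinting of a sandbox/test box.
    if "sandbox" in hostname.lower() or "test" in hostname.lower():
        return False
    # Only wake after the dormancy window has elapsed in production.
    return now - install_time >= datetime.timedelta(days=DORMANCY_DAYS)
```

The point of a gate like this is economic: automated vetting typically runs an update for minutes or hours in an instrumented environment, so malware that checks where it is and waits weeks before acting simply never misbehaves while anyone is watching.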
This is why the supposedly stringent security measures that SolarWinds incorporated into its operation in 2017 in order to earn an Evaluation Assurance Level (EAL) 2 rating (on a scale of one to seven, with seven being the most rigorous) failed to break the chain of infection that brought digitally signed malware to customers.
“In this case, the attacker was able to subvert the software build system, software creation system,” Art Manion, a senior member of the Vulnerability Analysis team in the CERT Program at Carnegie Mellon University’s Software Engineering Institute, told Newsweek, “and the malicious software was still signed, so a normal defense mechanism of digital signature was not effective in this case.”
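Manion's point can be made concrete with a toy signing flow. The HMAC key below is a stand-in for a vendor's code-signing certificate (an assumption for illustration): a signature proves who signed the bytes, not that the bytes are benign, so malware injected into the build before signing sails through verification.

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"  # stand-in for a code-signing key

def sign(artifact: bytes) -> bytes:
    """The vendor's build system signs whatever bytes it is handed."""
    return hmac.new(VENDOR_KEY, artifact, hashlib.sha256).digest()

def verify(artifact: bytes, signature: bytes) -> bool:
    """Customers check the signature's validity, not the artifact's intent."""
    return hmac.compare_digest(sign(artifact), signature)

clean = b"orion-update"
tampered = clean + b"+implant"  # malware injected before signing
signature = sign(tampered)     # the build system signs the tampered build
```

Because the compromise happened upstream of the signing step, the malicious artifact carries a perfectly valid vendor signature, which is exactly why the normal defense mechanism failed here.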
Other defense mechanisms involve costly micromanagement measures akin to the “zero trust” model, which aren’t always well suited to the rapidly transforming environment in which these systems operate. And ultimately, the value of a target like SolarWinds will always be a liability in security terms.
“The very hard problem is the defender has to defend every possible entry point,” Manion told Newsweek. “A highly resourced and skilled adversary just needs to find one way.”
And that’s exactly the kind of adversary the U.S. faces.
“There is a class of adversary that is either nation-state, part of nation-state government, or it is funded by nation-states, is well-funded, well-resourced,” Manion said, “so [it has] skills, time, the number of people, if necessary, perhaps good planning and patience.”
Manion has worked on a number of projects designed to help close the gaps left by software supply chains. One such endeavor would seek to do so through the establishment of a “Software Bill of Materials” (SBOM), which would provide a transparent inventory of exactly what goes into software — a list of ingredients, essentially, that can be reviewed by all stakeholders involved.
“If we know what software is in our software, many other things are now enabled or unlocked,” Manion said. “So we can play better defense, we can better manage licenses, all sorts of things happen.”
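A minimal sketch of what that unlocks, assuming a simplified component list rather than a full SBOM document in a standard such as SPDX or CycloneDX: once an organization knows exactly what is in its software, matching that inventory against security advisories becomes a simple lookup.

```python
def flag_vulnerable(sbom_components, advisories):
    """Return SBOM components whose (name, version) pair appears in an
    advisory feed. Field names here are illustrative, not a standard."""
    bad = {(a["name"], a["version"]) for a in advisories}
    return [c for c in sbom_components
            if (c["name"], c["version"]) in bad]
```

Without the ingredient list, the same question, does anything we run contain the compromised component, requires contacting every vendor in the chain; with it, the answer is a query.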
The initiative is being conducted by the National Telecommunications and Information Administration, part of the Commerce Department, which was affected by the SolarWinds hack. But if policymakers remain complacent, they should expect to see more breaches on exponentially larger scales.
“We will definitely see more of this,” Manion said. “It’s the nature of complicated software these days. This whole supply chain is everywhere, it’s unavoidable, and it’s going to be the target for adversaries.”
As to reported indications of Moscow’s involvement, the Russian embassy in Washington has dismissed what it deemed “unfounded attempts of the U.S. media to blame Russia for hacker attacks on U.S. governmental bodies.”
“We declare responsibly: malicious activities in the information space contradict the principles of the Russian foreign policy, national interests and our understanding of interstate relations,” the embassy said in a statement reiterated to Newsweek on Tuesday. “Russia does not conduct offensive operations in the cyber domain.”
Regardless of who was behind the hack, Lin also warned of potentially worse things to come if the industry didn’t change course in its ravenous approach toward building bigger, better systems with little regard for all the ways in which they can malfunction or be manipulated.
“We seem to want our computer systems to do more and more, we want our computer systems to sing and dance and play, to have pretty pictures, to have sound, to have touch interfaces,” he told Newsweek. “But inevitably, that makes computer systems more complicated, more complex. You want to do more, you have to do more programming, you have to make the programs bigger.”
This deep-seated addiction to convenience comes with a price.
“The more complicated things are, the more things that go wrong,” Lin said. “Now the bad guy can get in at more places, and he can do more mischief.”
“The only way in the long run to deal with this is to moderate your appetite, and say, ‘I’m willing to put some limits on my appetite for functionality because of the security problems that it creates,'” Lin said.
But that’s easier said than done, he said.
“As to how you do that?” Lin asked. “That’s a judgment question. But at least somebody needs to be thinking about that and, in fact, nobody thinks about that.”