ShiftLeft Academy

Shifting Left on Software Product Liability


Experts Michel Genard and Chris Hughes say shifting left on cybersecurity is a good thing, but liability will be hard to enforce.

By Deb Radcliff

The White House National Cybersecurity Strategy is full of language around holding technology product vendors liable for cyber incidents that impact the safety of the nation’s critical infrastructure. As described by the Cybersecurity and Infrastructure Security Agency (CISA), there are 16 critical infrastructure sectors “whose assets, systems, and networks, whether physical or virtual, are considered so vital to the United States that their incapacitation or destruction would have a debilitating effect on security, national economic security, national public health or safety, or any combination thereof.”

The cyber safety of our critical infrastructure has been a cause for concern dating back to President Clinton’s term with Presidential Decision Directive 63 (published in 1998) that called for public-private Information Sharing and Analysis Centers (ISACs) to share threat intelligence that could impact infrastructure sectors. These early calls for action put the onus on the infrastructure companies, while the latest White House strategy is shifting some of the responsibility and much of the liability to software product vendors. 

This includes “expanding the use of minimum cybersecurity requirements in critical sectors to ensure national security …” (section 1.1), working “with Congress and the private sector to develop legislation establishing liability for software products and services …” (section 3.3), and building resilient systems … “including by reducing systemic technical vulnerabilities in the foundation of the Internet and across the digital ecosystem …” (section 4.1).

“No one would think of purchasing a car today that did not have seatbelts or airbags included as a standard feature, nor would anyone accept paying extra to have these basic security elements installed. Unfortunately, the same cannot be said for the technology that underpins our very way of life,” said CISA director Jen Easterly during a February speech at Carnegie Mellon University titled “Unsafe at Any CPU Speed: The Designed-in Dangers of Technology and What We Can Do About It.” 

A Good Thing Overall

Shifting security and responsibility left to the product development design stage is a good thing, but the details are in the implementation, starting with design and carrying through the entire lifecycle of the product, says Michel Genard, Chief Strategy and Product Officer at Lynx Software Technologies, which develops software platforms used in avionics, defense ground systems, satellites, and autonomous systems. 

“Customers use our software to develop critical applications deployed in critical infrastructures, so product failure is not an option. Our clients need guarantees that the system is going to behave the way it should regardless of environmental factors and that resiliency is part of the DNA of the system,” he said during our recent Zoom interview. A security strategy starts at architecture, he continued, “So think carefully about how to architect a system for safety and security. It must be extremely sound and have the best guarantee of safety and security.”

Safety regulations for the automotive industry, for example, should be translated over to software engineering and product development, Genard continued, adding, “You wouldn’t drive without using seatbelts. Nor should buyers of products have to worry if their software is safe to use or not.” 

But as Easterly said in her statements to CMU, “We find ourselves blaming the user for unsafe technology.” Easterly followed up by stating that “strong security should be a standard feature of virtually every technology product, and especially those that support the critical infrastructure that Americans rely on daily.”

Yet security is pushed onto users, especially enterprise users, often at exorbitant rates, and that’s not likely to change soon. 

“Security, like safety, costs money. So, the question will be, how do you include the cost of security in your value chain? Is the customer willing to pay more for security?” Genard asks. That is how it has been so far because of speed to market and other factors, he added. “Slowing down and absorbing the costs for better security could become a competitive disadvantage, or it could become a competitive advantage. Net-net, security should be looked at from the business, consumer, regulation, education, research, and training perspectives.” 

These elements are covered in the national cybersecurity strategy, along with incentivizing R&D and creating grant programs to help cover the costs product companies are certain to incur. 

What Qualifies as “Critical Infrastructure” Software?

Chris Hughes, CISO and Co-Founder at cybersecurity services firm Aquia, brought up the question of how to classify products as critical infrastructure. Lynx is clearly in that realm, but what about ubiquitous applications that every organization uses, critical sector or not? 

“Commercial software like Exchange from Microsoft, Slack, and other shared apps are everywhere. And a big case to consider is also open source software, which is pervasive in all commercial apps used in every subsector of the critical infrastructure,” he noted. In 2020, a Synopsys study reported that 99 percent of commercial applications contained open-source code and that 91 percent of 1,250 commercial applications audited contained outdated or abandoned open-source code. While the study didn’t say what percent of these applications were running in critical infrastructure companies, it’s easy to extrapolate that open-source code touches all industries. 

SBOMs (Software Bills of Materials) also play a role in the White House strategy, and Hughes thinks that the strategy will likely enhance the use of SBOMs once the issue of competing standards (SPDX vs CycloneDX) is worked out. CISA is also addressing this. In conjunction with the Department of Energy, it recently released SBOM sharing lifecycle guidelines covering how to disseminate and receive SBOM reports and findings, survey SBOM findings, and more.
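Whichever standard wins out, consuming an SBOM is mechanically straightforward: it is a structured inventory of components that a receiving organization can parse and cross-check against vulnerability data. The sketch below, a minimal illustration rather than a production tool, parses a hand-written CycloneDX-style JSON document (top-level field names follow the CycloneDX schema; the component list itself is invented for illustration) and extracts the component inventory:

```python
import json

# A minimal, hand-written CycloneDX-style SBOM document in JSON form.
# Top-level fields (bomFormat, specVersion, components) follow the
# CycloneDX schema; the components listed are invented for illustration.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "openssl", "version": "1.1.1k"},
    {"type": "library", "name": "zlib", "version": "1.2.11"}
  ]
}
"""

def list_components(sbom_text):
    """Parse an SBOM document and return (name, version) pairs for each component."""
    bom = json.loads(sbom_text)
    return [(c["name"], c["version"]) for c in bom.get("components", [])]

# Print the component inventory, one "name version" pair per line.
for name, version in list_components(sbom_json):
    print(f"{name} {version}")
```

In practice a receiver would feed these (name, version) pairs into a vulnerability database lookup, which is the kind of downstream use the CISA/DOE sharing-lifecycle guidance anticipates.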

As Easterly said in her Carnegie Mellon address, “The government has an important role to play in both incentivizing these outcomes and operationalizing these principles. Regulation—which played a significant role in improving the safety of automobiles—is one tool, but—importantly—it’s not a panacea.”

Regulations, Best Practices, and Safe Harbor

While the strategy talks about streamlining regulations, establishing best practices, and safe harbor, the framework currently lacks teeth at the software development layer. Hughes notes that regulations aren’t there yet and likely won’t be for a long time, considering that there is still no comprehensive federal privacy law.

“If regulations came forward saying that infrastructure product companies must produce software that aligns with NIST SSDF and leverage practices from OWASP SAMM and others, I wonder if that would be enough to protect product companies from legal liability,” he said. The key is when that might happen—and how. “Is the federal government going to set those requirements? If not, then states like New York and California will.”

In her address to Carnegie Mellon, Easterly told the audience that “Achieving this outcome [of secure-by-design] will require a significant shift in how technology is produced, including the code used to develop software. But ultimately, such a transition to secure-by-default and secure-by-design products will help both organizations and technology providers: it will mean less time fixing problems, more time focusing on innovation and growth, and importantly, it will make life much harder for our adversaries.”
