Tuesday, June 29, 2021

It turns out SBOMs are more “required” than I thought

Last Friday, NIST published the definition of “critical software” that the May 12 Executive Order required them to develop. I confess that I looked through it briefly, satisfied myself that NIST hadn’t taken the very helpful advice I’d provided them, and put it aside – promising to myself that I’d write a post this week complaining about how nobody ever listens to me.

However, this morning Miriam Baksh, NextGov’s excellent cybersecurity reporter, published a good article that made me go back and look at NIST’s definition. I realized that it actually does at least pay attention to the two concerns I raised in my post linked above – although I sincerely doubt this was because of that post (I also did submit comments to NIST on this subject, although I don’t think they were the reason, either).

But what really caught my eye was something I hadn’t noticed when I skimmed through the definition on Friday. The definition reads:

EO-critical software is defined as any software that has, or has direct software dependencies upon, one or more components with at least one of these attributes:

  • is designed to run with elevated privilege or manage privileges;
  • has direct or privileged access to networking or computing resources;
  • is designed to control access to data or operational technology;
  • performs a function critical to trust; or,
  • operates outside of normal trust boundaries with privileged access.

As far as the bullet points go, they more or less reproduce what the EO “suggested” should be addressed in the definition of critical software (i.e. the EO essentially said “NIST, you’re free to develop any definition you want, as long as it’s this one…”). But then I mentally applied my sentence diagramming skills (which I excelled at in Mrs. Llewellyn’s third grade class, I’ll have you know...) to the first two lines and realized how much NIST’s definition depends on the idea of components. Here’s what I mean:

If for the moment we drop the clause about dependencies, the definition reads “any software that has… one or more components with at least one of these attributes…” Why do the words “that has one or more components” have to be there? Why doesn’t it just say “software that has one of these attributes”? After all, the EO’s “definition” of critical software just refers to the attributes of the software itself, not of its components.

Yet NIST seems to be saying that it’s really the components of the software that contain the attributes, not the software itself. Or more exactly, they’re saying that the software is nothing but its components.

It’s true that the average software product contains lots of components. Most components are written by third parties, but the “glue” that holds them all together is written by the actual supplier (i.e. the one whose name is on the product that you buy). However, that glue can really be thought of as just other components – except these are components written by the supplier. This means the product itself is literally just a collection of components; NIST seems to take that attitude here.
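To make that idea concrete, here is a minimal sketch in Python (the component names and fields are made up for illustration, not taken from any real SBOM) of a product viewed as nothing more than a list of components, some from third parties and some written by the supplier:

  # Hypothetical inventory: every part of the product is treated as a component,
  # whether it came from a third party or from the supplier's own "glue" code.
  product_components = [
      {"name": "openssl",   "version": "1.1.1k", "author": "third party"},
      {"name": "log4j-core", "version": "2.14.1", "author": "third party"},
      {"name": "acme-glue",  "version": "4.2.0",  "author": "supplier"},
  ]

  # On this view, "the product" is just this list and nothing else.
  for c in product_components:
      print(f'{c["name"]} {c["version"]} (written by {c["author"]})')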

The phrase “or has direct software dependencies upon…” seems to drive this point home. The link (NIST’s, not mine) points to the FAQ that came with the definition, which says:

  1. What do you mean by “direct software dependencies” in the definition?

For a given component or product, we mean other software components (e.g., libraries, packages, modules) that are directly integrated into, and necessary for operation of, the software instance in question. This is not a systems definition of dependencies and does not include the interfaces and services of what are otherwise independent products.

This drives home the point that software consists of components, mostly written by third parties but some written by the supplier themselves. But by saying that software is just components, NIST is saying that all software risks reside in components. Ergo, managing software supply chain risk means managing risk for each component of the software – whether written by the supplier themselves or by a third party.
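As a rough illustration of what managing risk component by component might look like, here is a hedged Python sketch. The flat list-of-dictionaries SBOM and the lookup_vulnerabilities helper are assumptions I’m making for the example; a real SBOM would arrive in a format like SPDX or CycloneDX, and a real vulnerability lookup would query a source like the NVD:

  def review_component_risk(sbom, lookup_vulnerabilities):
      """Return the components that have at least one known vulnerability.

      sbom: a list of {"name": ..., "version": ...} records (simplified here).
      lookup_vulnerabilities: a caller-supplied function (hypothetical) that
      maps a component name and version to a list of known CVE identifiers.
      """
      findings = {}
      for component in sbom:
          cves = lookup_vulnerabilities(component["name"], component["version"])
          if cves:
              # Risk lives at the component level, so findings are recorded
              # per component, not per product.
              findings[(component["name"], component["version"])] = cves
      return findings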

And how will you, Mr/Ms Software User, find out what components are in your software, so you can manage risks in those components? You got it…you need an SBOM!

The bottom line is that it appears NIST has expanded the SBOM requirement in the EO. The EO requires software suppliers to provide an SBOM to government customers when the software meets the definition of “critical software”. However, NIST is saying that the source of risk is really the components of critical software, not the software itself.

This means that government agencies aren’t even going to be able to completely figure out their software risks without having an SBOM for each product that they use. It also means that the final determination of whether a software product is critical or not will require having a current SBOM for it.
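To see why, here is a sketch of NIST’s definition turned into a simple check over an SBOM’s components. The attribute flags are my own shorthand for the five bullets above, and the component records are assumed to carry them; nothing here is an official NIST schema:

  # Shorthand flags for the five attributes in NIST's definition.
  CRITICAL_ATTRIBUTES = {
      "elevated_privilege",         # runs with elevated privilege or manages privileges
      "privileged_network_access",  # direct or privileged access to networking/computing resources
      "controls_access",            # controls access to data or operational technology
      "critical_to_trust",          # performs a function critical to trust
      "outside_trust_boundary",     # operates outside normal trust boundaries with privileged access
  }

  def is_eo_critical(components):
      """A product is EO-critical if any component has at least one attribute.

      'components' has to come from a current SBOM; without one, this check
      simply can't be run, which is the point made above.
      """
      return any(
          CRITICAL_ATTRIBUTES & set(c.get("attributes", []))
          for c in components
      )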

Will SBOMs be widely available for most software products by the time all of this comes into effect – in 1-2 years? No. So it’s likely that software risk analysis will continue to be done mostly by identifying vulnerabilities in the product itself, not in its third-party components. But will this accelerate the need for SBOMs to become widely available? Absolutely. There’s work to be done.

Speaking of which: Tomorrow you can join the energy SBOM proof of concept, as we learn from people in the healthcare industry who have been working on their PoC since 2018. They have a lot of great lessons to teach us!

Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC. Nor are they shared by the National Telecommunications and Information Administration’s Software Component Transparency Initiative, for which I volunteer. If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com.

 
