Walter Haydock continues to write
great articles about vulnerability management. His latest is a case in point. The post is ostensibly about the
National Defense Authorization Act (NDAA) for 2023 (which passed the House a
month or two ago), but he provides good advice for any organization that’s
concerned about software vulnerabilities, which should be just about every
organization on the planet.
The provision of the Act that he
focuses on specifies contract language for DHS, including…get ready for it… that
the vendor must provide “a planned bill of materials when submitting a bid
proposal” and “A certification that each item listed on the submitted bill of
materials is free from all known vulnerabilities or defects affecting the
security of the end product or service…”
Of course! After all, what could
be simpler? All the supplier has to do is certify that not a single third-party
component in their product – and none of the “first-party” code written by the
supplier itself – contains any vulnerability at all (and if the product
includes thousands of components – still no problem!). Moreover, this applies
to all vulnerabilities, regardless of whether the CVSS score is 0 or 10 or
the EPSS (exploitability) score is 0 or 1. It will be easy as pie for any
supplier to make this certification; they just have to write “I so certify” (or
words to that effect) on a piece of paper or digital document and send it in. But
making it truthfully? Ahh, that will be harder...
Furthermore, Walter points out there’s
no provision for monitoring software for new vulnerabilities after it’s
installed. He says “…this provision appears to over-index on the (perceived)
security of a piece of software at a single point in time, without any concerns
about what happens after the software goes into operation.” What? Do you mean to
tell me that the number of possible software vulnerabilities wasn’t fixed when
Alan Turing first described (unintentionally) the idea of programmable
computers in his great 1936 paper, “On computable numbers…”
– despite the fact that the “computer” he described required an infinite paper
tape and would never finish a calculation? Walter, why didn’t you tell the House
committee that drafted the bill that new software vulnerabilities are
identified every day, if not every hour?
Of course, this isn’t a mere
omission. Whoever drafted this provision of the Act obviously didn’t understand
that, far from having a fixed security posture from birth, software develops
vulnerabilities all the time, as researchers (and attackers, who are also
performing “research”) discover snippets of code that used to be considered
benign, but now…aren’t. In 2021, around 20,000 new CVEs were identified; that works
out to more than two every hour.
After all, that’s what
vulnerability management is about: learning about newly identified
vulnerabilities that apply to the software your organization uses, then
patching or otherwise mitigating the small minority that you determine pose a
real risk to your organization, while ignoring the overwhelming majority that
don’t.
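To make that concrete, here is a minimal sketch of that triage loop in Python. The data structure, the example findings and the CVSS/EPSS thresholds are my own illustrative assumptions, not any particular scanner’s output or anyone’s official policy:

```python
# Minimal sketch of the triage step described above: keep only the findings
# worth acting on now, ignore the rest. All field names, scores and thresholds
# are illustrative assumptions, not any tool's real output or a recommended policy.
from dataclasses import dataclass


@dataclass
class Finding:
    cve_id: str      # e.g. "CVE-2021-44228" (Log4Shell)
    component: str   # the SBOM component the CVE applies to
    cvss: float      # severity score, 0.0 - 10.0
    epss: float      # estimated probability of exploitation, 0.0 - 1.0


def triage(findings: list[Finding],
           epss_threshold: float = 0.1,
           cvss_threshold: float = 7.0) -> list[Finding]:
    """Return the small minority of findings that warrant patching or mitigation.

    One illustrative policy: act only on findings that are both severe and
    reasonably likely to be exploited; everything else is tracked but ignored.
    """
    return [f for f in findings
            if f.cvss >= cvss_threshold and f.epss >= epss_threshold]


if __name__ == "__main__":
    findings = [
        # Scores here are rough illustrations, not authoritative values.
        Finding("CVE-2021-44228", "log4j-core 2.14.1", cvss=10.0, epss=0.97),
        Finding("CVE-2099-0001", "hypothetical-parser 1.2.3", cvss=3.1, epss=0.01),
    ]
    for f in triage(findings):
        print(f"Act on {f.cve_id} in {f.component}")
```

The only reason this loop is tractable at all is that the thresholds throw most findings away; a rule that treats every CVE as equally urgent, like the one in the bill, removes exactly that filter.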
Walter continues, “There appears (in
the required contract language) to be no ability on the part of the vendor
to accept risk stemming from software vulnerabilities, even when providing a
justification to the government. The vendor can only ‘mitigate, repair, or
resolve’ such issues.” Frankly, I can see some misguided corporate lawyer
putting together contractual requirements like these. But someone working for
Congress should presumably have access to the best cybersecurity advice
available. Didn’t they even think to ask someone knowledgeable whether what
they were writing made sense? Guess not.
However, I disagree with the last part of Walter’s post. He says:
At a minimum, this legislation will
make it basically mandatory to use machine-readable Vulnerability
Exploitability eXchange (VEX) reports about issues identified in software
products used by the government. Any non-automated process would break down easily
and likely run afoul of the law’s requirements. Without the widespread use
of such standards and tools, software bills of materials (SBOMs) are not likely to see
significant adoption, due to the fact that vendors will be flooded with
inquiries about false positive vulnerabilities in their products. Thus, this
bill may help to motivate the introduction of new techniques for communicating
about the exploitability of vulnerabilities in software while increasing
transparency for the entire ecosystem.
I share his obvious wish to see SBOMs widely adopted, as well as his
observation that this will never happen unless VEX documents are also available
to warn users about the huge percentage of component vulnerabilities that
aren’t exploitable (my analogy is that, if SBOMs are the wheels of software
supply chain cybersecurity, VEXes are the lubricant; the SBOM wheels will stop
turning very quickly if they aren’t lubricated with VEXes). And I agree with
him that it would be nice if this bill (it won’t be a law until the Senate
approves it, of course) led to wide use of VEXes, which might in turn make
SBOMs widely used as well.
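For readers who haven’t seen one, the sketch below shows roughly what a machine-readable VEX statement looks like. It is loosely modeled on the CycloneDX VEX format and written as a Python dictionary for readability; the product reference and the wording are made up for illustration, and anyone producing real documents should follow the actual CycloneDX (or CSAF VEX) specification rather than this sketch:

```python
# Rough sketch of a machine-readable VEX statement, loosely modeled on the
# CycloneDX VEX format. Field values and the product reference are illustrative;
# consult the real specification before producing or consuming actual documents.
import json

vex_statement = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "vulnerabilities": [
        {
            "id": "CVE-2021-44228",
            "analysis": {
                "state": "not_affected",
                "justification": "code_not_reachable",
                "detail": "The vulnerable JNDI lookup is never invoked in "
                          "this product's shipped configuration.",
            },
            # Hypothetical reference to the component entry in the product's SBOM.
            "affects": [
                {"ref": "urn:cdx:example-serial-number/1#log4j-core@2.14.1"}
            ],
        }
    ],
}

print(json.dumps(vex_statement, indent=2))
```

The machine-readable “state” and “justification” fields are what make the difference Walter describes: a customer’s tooling can suppress the false positive automatically, instead of generating yet another email to the supplier asking whether the product is affected.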
However, I really don’t think this small provision of the NDAA will ever be
implemented in practice, even if it passes the Senate in its current form and
nominally becomes law (and I hope the Senate removes it; saying nothing at all
about software vulnerabilities would be much better than requiring compliance
with nonsensical rules).
Why do I say this? Because I’ve seen before how a regulation[i] can literally
not make sense, yet still be implemented – and still be “complied with”. Would
you like to know how I think that happened? ...I didn’t think so, but I’ll tell
you anyway. It happened because the regulators and the regulated entities
developed a tacit understanding that they would all act as if the standards
were unambiguous, and therefore as if compliance could be objectively verified.
This is a variation on the “Don’t ask, don’t tell” policy that the Clinton
administration came up with to address another problem having to do with the
military.[ii]
Works like a charm. You ought to
try it. The only problem is it's corrosive to the idea that laws are supposed to be complied with as written. However, if you don't care about the rule of law, then you should support this provision in the bill!
Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC. If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com.
[i] NERC CIP version 5. If you search on that phrase in my blog, I’m sure you’ll find it mentioned in at least 400 of the (as of now) 938 posts I’ve published.
[ii] The
Soviet Union was the showplace for this sort of thing, since almost no law or
regulation could be complied with as written. There was a saying among the
workers, “They pretend to pay us, and we pretend to work.”