A couple of weeks ago, Tom Hofstetter of NERC sent a link around to the Supply Chain Working Group of the NERC CIPC. The link was to a story about a lawsuit filed by Delta Air Lines against [24]7.ai, Inc., the supplier of the chatbot software on Delta's (and other companies') web sites. Delta blames that software for a 2018 breach (which Delta reported immediately) that resulted in the loss of the credit card information of 800,000 to 850,000 Delta customers. At around the same time, Sears reported a breach of the same software on its web site.
The 2018 stories on this breach focused on its consequences, but didn't say anything about how it came about. That's understandable, since nothing had been publicly revealed about that. But Delta's lawsuit makes it clear that the cause of the breach was a classic supply chain attack: the software supplier's development process was compromised, and a back door was inserted. The lawsuit makes the case that it was the vendor's poor security practices (which belied the assertions it had made to Delta in its proposal) that led to the breach.
I’ll bet a
lot of the people who read this post are currently working (or will be soon) on
their CIP-013 supply chain cyber security risk management plan. While this
breach obviously doesn’t involve the BES or an electric utility, it still
furnishes a good illustration of how your plan can help you mitigate real
supply chain risk.
CIP-013 R1 has two parts. R1.1 requires the entity to develop a plan to "identify and assess" supply chain cyber risks (unfortunately, the drafting team left off the word "mitigate", but mitigation is also clearly required; CIP-013 makes no sense – and FERC would never have ordered it in the first place – if all it required was identifying and assessing risks without doing anything about them). R1.2 lists six objectives the entity has to meet in its plan. These are mitigations for six (actually eight) specific risks. While the NERC entity can choose whether to mitigate or accept the risks it identifies in R1.1, in R1.2 there is no choice – the entity has to mitigate all of these risks (although it's given lots of leeway in deciding how to do that).
Since none
of the parts of R1.2 deals with software development, this is a risk that
should be considered under R1.1. Again, R1.1 requires you to “identify, assess
(and mitigate)” supply chain cyber risks. There’s only one problem with this
wording: There are probably a huge (if not infinite) number of such risks. Do
you need to identify, assess and mitigate all of them?
If you
happen to have an infinite budget for supply chain cyber risk management at
your disposal, I do recommend that you try to mitigate all of these risks,
although given the relatively short human lifespan, you may want to make
provision for a successor, a successor to that successor, etc. However, there’s
no real problem with doing that, since you do
after all have an infinite budget.
But let’s
say you live in the real world and you don’t have an infinite budget – in fact,
it’s probably a lot less than you wish you had. Given that it’s unlikely you’ll
be able to get more (at least for a year or so), you should try to make sure
that you spend your resources – dollars and time – as effectively as possible.
And what does “effectively” mean in this case? It means you mitigate the most
risk possible, given your available resources. How do you do that? The three
steps are identify, assess and mitigate.
Identifying Risks
First, you
identify risks that you believe are significant. And what’s a significant risk?
One that you think has some chance of actually being realized. Consider the
risk that a hurricane will flood a Medium impact substation, and in the process
destroy a BES Cyber System that has been in service for years and for which the
supplier has gone out of business – leaving you in a big pickle when it comes to finding a replacement.
Is this a
significant risk? If the substation is in the middle of the desert in Arizona,
probably not. But if it’s near the South Carolina coast, it probably is
significant. So a utility in Arizona won’t even consider this risk, while one
in South Carolina probably will. Since both utilities – in this illustration –
have to comply with CIP-013, they will both look for risks they consider
significant and put those on their list. It would be good if CIP-013 itself provided a list of significant supply chain security risks to the BES, but it doesn't (mainly because FERC gave NERC just one year to develop the standard, get it approved, and submit it to FERC). However, NATF last week released a set of "Criteria" for BCS vendors, which I think at least provides a good starting point for vendor risks (you can get it by going here). Note that vendor risks aren't the complete universe of supply chain risks; you also need to identify risks that come through your own organization (e.g. the risk that you buy a counterfeit BCS component containing an exploitable vulnerability), as well as risks from external sources, such as the hurricane just discussed.
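To make this concrete, here is a minimal sketch (in Python, purely as an illustration) of one way to keep vendor, internal and external risks together in a single register before moving on to assessment. The field names and the sample entries are my own assumptions, not anything prescribed by CIP-013 or the NATF criteria.

from dataclasses import dataclass

# A simple register entry; vendor, internal (procurement) and external
# (e.g. natural disaster) risks all go into the same list.
@dataclass
class Risk:
    description: str
    source: str          # "vendor", "internal" or "external"
    significant: bool    # does this entity think it could actually be realized?

register = [
    Risk("Backdoor inserted into a supplier's software development process", "vendor", True),
    Risk("Counterfeit BCS component purchased with an exploitable vulnerability", "internal", True),
    Risk("Hurricane floods a Medium impact substation with an unsupported BCS", "external", True),
]

# Only the risks judged significant move on to the assessment step.
to_assess = [r for r in register if r.significant]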
Assessing Risks
Once you have a list of significant risks, are you finished? Should you then roll up your sleeves and start mitigating each one? I don't recommend it. It's likely you'll have a big list, and mitigating all of the risks on it (for instance, the NATF document alone lists 68 "supplier criteria") won't be easy, especially with the limited time and budget you have available for supply chain cyber risk mitigation.
You now need to assess these risks by assigning each one a risk score based on its likelihood and impact. Once you've done that, list them all in a spreadsheet, ranked from highest to lowest risk score, and choose the highest-ranked risks to mitigate. Where do you draw the line? You estimate your resources and then choose the risks you have the resources to mitigate. But they should always be the ones that pose the highest risk to your entity's BES assets.
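If it helps to picture this step, here is a minimal sketch (again in Python, and again purely illustrative) of assess and rank: score each risk as likelihood times impact, sort from highest to lowest, and keep working down the list until the mitigation budget runs out. The scales, scores, costs and budget are made-up assumptions, not figures from CIP-013, NATF or anywhere else.

# Each entry: (description, likelihood 1-5, impact 1-5, estimated mitigation cost)
risks = [
    ("Backdoor inserted in a supplier's development process", 3, 5, 40_000),
    ("Counterfeit BCS component with an exploitable vulnerability", 2, 4, 25_000),
    ("Hurricane floods a Medium impact substation", 1, 5, 60_000),
]

budget = 70_000  # illustrative annual budget for supply chain risk mitigation

# Rank by risk score (likelihood times impact), highest first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

# Work down the ranked list until the budget is exhausted.
selected, remaining = [], budget
for description, likelihood, impact, cost in ranked:
    if cost <= remaining:
        selected.append((description, likelihood * impact, cost))
        remaining -= cost

for description, score, cost in selected:
    print(f"score {score:>2}  cost {cost:>7,}  {description}")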
By doing that, you'll be assured of mitigating the most risk possible, given your available resources. You could always choose some lower risks to mitigate and ignore some higher ones, but that would only ensure you're not getting the highest return on your risk mitigation investment. For example, the utility in Arizona could spend a lot of money protecting their substations against flooding during hurricanes – but wouldn't it be better if they chose some risks that are much more likely to be a problem for them?
So how does the Delta breach fit into this schema? Frankly, maybe six months ago I would have assigned a fairly low likelihood – and therefore a low risk score – to the risk that someone could penetrate a supplier's software development process and insert a backdoor in the product (and I would have done this knowing full well about the Juniper breach of 2015); and I probably wouldn't have objected to a client telling me they didn't think this was a risk worth mitigating. I might even have accepted the argument that surely, after the Juniper breach, software developers had tightened up their development environment controls, so that it would be hard for an attack like that to succeed again.
Obviously,
the Delta breach shows that this particular type of supply chain attack is
alive and well. So this may be a risk that many NERC entities will decide they
should mitigate.
Mitigation
And how do
you mitigate a risk like this? You obviously can’t force a supplier to have
good security, but you can certainly ask them to commit to it – whether in an
RFP response, contract language, a letter, an email, or even a verbal
conversation. But that’s never enough. Even though Delta’s supplier committed
in their proposal to having good security practices (in fact, they said they
had them in place already), they just didn’t do it. You will need to regularly
assess the supplier, usually with a questionnaire, but – if you have reason to
suspect they won’t tell you the truth – you might have to do an audit (your
contract should always permit you to do this).
But sometimes even that isn't enough. In Delta's case, the supplier had given them further assurance of their security (in a GDPR compliance attestation) even after the contract had been signed – at the same time that they were developing the software for Delta in an insecure environment! This is why you need backup mitigations. One would be to do vulnerability assessments on any software or hardware (firmware) you purchase, before you install it – if you think the risk justifies doing this (and I know of at least one utility that does this for anything they purchase). Another would be to require – in the contract or RFP – that the supplier do a vulnerability assessment themselves before they ship any product to you; Delta's supplier obviously didn't know about the backdoor, and a serious assessment of their own product might have revealed it.
Vulnerabilities
For any risk to be realized (although I prefer the word 'threat' here), there must be at least one vulnerability that enables it. This means that, in order to mitigate the risk, all of the vulnerabilities need to be mitigated, since even one open vulnerability can allow the risk to be realized. Think of the risk of a thief breaking into your house; your doors and windows are the potential vulnerabilities that could allow that to happen. If all of your doors and windows are very well secured except for one door that's wide open, it's likely you will be robbed. You have to mitigate every vulnerability in order to mitigate the risk itself.
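A little arithmetic shows why that one open door matters so much. If the risk can be realized through any one of several vulnerabilities, a rough (and admittedly simplified) way to think about it is that the chance of the risk being realized is one minus the product of the chances that each vulnerability is not exploited. The probabilities in this small Python sketch are invented purely for illustration:

from math import prod

def breach_probability(p_exploit_per_vuln):
    # Chance that at least one vulnerability is exploited (treated as independent).
    return 1 - prod(1 - p for p in p_exploit_per_vuln)

# Four doors and windows well secured, one door left wide open.
mostly_secured = [0.01, 0.01, 0.01, 0.01, 0.90]
print(round(breach_probability(mostly_secured), 3))   # about 0.904: still nearly certain

# All five addressed.
all_secured = [0.01, 0.01, 0.01, 0.01, 0.01]
print(round(breach_probability(all_secured), 3))      # about 0.049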
In the
lawsuit, Delta provided a list of vulnerabilities that the supplier hadn’t
mitigated, which could have allowed the breach to happen: “allowing numerous
employees to utilize the same login credentials; did not limit access to the
source code running the [24/7] chat function to only those individuals who had
a clear need to access that code; did not require the use of passwords that met
PCI DSS…standards; did not have sufficient automatic expiration dates for login
credentials and passwords…; and did not require users to pass multi-factor
authentication prior to being granted access to sensitive source code.” In
other words, these were all open doors that made it much more likely the
supplier’s development process would be breached. These are all items that
should be addressed with the supplier, through contract language, letter, phone
call, email, carrier pigeon, etc. While you should have a general requirement
for a secure software development lifecycle, you should also specifically
require controls like the ones Delta lists.
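One way to do that (and this is just a sketch of my own, not language from the lawsuit or any standard) is to turn the controls Delta cited into specific questions in your supplier questionnaire, rather than relying only on a general secure-development-lifecycle requirement:

# Questions derived from the controls Delta cited; the wording is mine.
development_environment_questions = [
    "Does each developer have unique login credentials (no shared accounts)?",
    "Is access to source code limited to individuals with a clear need for it?",
    "Do password requirements meet PCI DSS (or an equivalent) standard?",
    "Do login credentials and passwords expire automatically?",
    "Is multi-factor authentication required before access to sensitive source code?",
]

# Track the supplier's answers and revisit them at each periodic assessment.
responses = {question: None for question in development_environment_questions}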
Supplier Incident Response
Perhaps the most outrageous part of the supplier's behavior was that it took them five months to notify Delta of the breach, and even then it was only by sending LinkedIn messages to a few Delta employee contacts. In fact, it seems that, as of the lawsuit's filing, the supplier still hadn't given formal notice to Delta. It's a good idea to require suppliers to create a Supplier (Vendor) Incident Response Plan, which will detail exactly how the supplier will handle an incident like this. Of course, you should have the right to review that plan and suggest any changes you think are necessary.
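As a rough sketch of what you might expect such a plan to cover, here is an illustrative (and entirely hypothetical) set of minimum expectations; the timeframes and fields are my assumptions, not requirements from NERC, NATF or any regulation:

# Entirely hypothetical expectations for a supplier incident response plan.
supplier_irp_expectations = {
    "notification_deadline": "within 72 hours of discovering a breach affecting our data or products",
    "notification_channel": "formal written notice to a named security contact, not ad hoc messages",
    "required_content": [
        "what was compromised and when",
        "which of our systems, data or products are affected",
        "remediation steps taken and planned",
    ],
    "follow_up": "a written post-incident report and periodic customer review of the plan",
}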
The NERC CIPC Supply Chain Working Group is putting the final touches on a white paper with guidelines on vendor incident response plans. I hope it will be posted to the NERC web site soon, along with five other papers that have been developed and just need final approval.
Any opinions expressed in this blog post are strictly mine
and are not necessarily shared by any of the clients of Tom Alrich LLC.
If you would like to comment on what you have read here, I
would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that
if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or
challenges like those discussed in this post – especially on compliance with
CIP-013. And Tom continues to offer a free two-hour webinar on CIP-013 to your
organization; the content is now substantially updated based on Tom’s nine
months of experience working with NERC entities to design and implement their
CIP-013 programs. To discuss this, you can email Tom at the same address.