I’m thinking of renaming this “Mariam
Baksh’s Blog”, since once again the NextGov reporter has written a
really interesting story
on cyber goings-on in Washington (my last post
was based on one of her articles, and I’ve written a few others before that).
Mariam isn’t the kind of reporter whose goal is to write the best story about something
that everybody else is also writing about. Instead, she is constantly digging
up interesting stories in places where I would never even have dreamed of looking.
The story this time is an outgrowth
of the Colonial Pipeline attack. Once the TSA issued a cyber directive to the
pipeline industry a few months after the attack, most reporters assumed the problems
were solved. After all, you solve cyber problems with regulations, right?
I might have assumed the same
thing, except a kind person sent me a redacted[i] copy of the directive,
which I wrote about in this
post. What did I think of the directive? Well, I thought the font it was
written in was attractive, but it was all downhill from there. However, I have
found a great use for the directive: I’ve been invited to speak on cyber
regulation at a seminar at Case Western Reserve University in February. At first, I
was going to talk about lessons learned from various compliance regimes: NERC
CIP (which I know most about), PCI, HIPAA, CMMC, etc. But after reviewing my
post on the TSA directive, I realized I hit the gold mine in that one: Just
about everything the TSA could have done wrong, they did. They should stick to
telling people to take their shoes off in airports. It’s the perfect
pedagogical example of how not to develop cybersecurity regulations!
And maybe they will, since Mariam’s
article points out that Rep. Bobby Rush[ii] of Illinois has
introduced legislation that would take the job of cyber regulation for
pipelines away from the TSA and vest it in a new “Energy Product Reliability
Organization”. This would be modeled on the North American Electric Reliability
Corporation (NERC), which develops and audits reliability standards for the
North American electric power industry, under the supervision of the Federal
Energy Regulatory Commission (FERC). NERC is referred to as the “Electric Reliability
Organization” in the Energy Policy Act of 2005, which set up this unusual regulatory
structure.
The best known of NERC’s standards
are the 12 cybersecurity standards in the CIP family. And clearly, these
standards have a good reputation on Capitol Hill. This was reinforced by FERC Chairman
Richard Glick’s testimony at a hearing on Wednesday, when he was asked by Rep. Frank
Pallone, “…do you think that the industry led stakeholder process established
by Chairman Rush's legislation would likewise be a successful mechanism for
protecting the reliability of the oil and gas infrastructure?” Glick replied, “I
believe so. The electricity model has worked very well … and I believe a
similar model will work with pipeline reliability.”
I don’t deny that the NERC CIP
standards have made the North American electric grid much more secure than it
would be without the standards. On the other hand, there are some serious
problems with the CIP compliance regime, which I wouldn’t want to see replicated
for the pipeline industry:
1. The standards should
be risk-based, like CIP-013, CIP-012, CIP-003 R2, CIP-010 R4 and CIP-011 R1. This
means they should not prescribe particular actions. Instead, they should require
the entity to develop a plan to manage the risks posed by a certain set of
threats, e.g. supply chain cyber threats or threats due to software vulnerabilities.
Then the entity needs to implement that plan. In drawing up the plan, it should
be up to the entity to decide the best way to manage the risks, but there will
be a single set of approved guidelines for what should be in the plan (something
that is missing with CIP-013). Prescriptive requirements, like CIP-007 R2 and
CIP-010 R1, are tremendously expensive to comply with, relative to the risk that
is mitigated. Explicitly risk-based requirements are much more efficient, and
are probably more effective, since the entity doesn’t have to spend so much
money and time on activities that do very little to improve security.
2. Auditing should be based on how well the plan followed the guidelines. Of
course, this isn’t the up-or-down, black-or-white criterion that some people
(including some NERC CIP auditors, although I believe that sort of thinking is
disappearing, thank goodness) think should be the basis for all auditing. If an
entity has missed something in the guidelines, but it seems to be an honest
mistake, the auditor should work with them to correct the problem. In fact, the
auditor should work with them in advance to make sure the plan is good to begin
with; this is currently not officially allowed under NERC, due to the supposed
risk to “auditor independence”, a term found nowhere in the NERC Rules of
Procedure or GAGAS.
3. In other words, auditors should be partners. They actually are partners
nowadays, but when they act that way, they’re officially violating the rules.
Note that they’ll never write down any compliance advice they give, and they’ll
always say it’s their personal opinion, meaning you can’t count on the next
auditor saying the same thing.
4. Auditing should also be based on how well the plan was implemented. This is
where I think auditing actually should be black & white. Once the entity has
created the plan, they need to follow it. If they decide something needs to be
changed, they should change the plan and document why they made the change. But
they shouldn’t just deviate from the plan as it’s currently written. (As far as
I know, this is how CIP-013 is audited: the entity can change the plan whenever
they want, but they need to document the change and then follow it.)
5. Identification of new risks
to address in the standards needs to be divorced from the standards development
process. When a new area of risk is identified as important, entities should
immediately be required to develop a plan to mitigate those risks and follow
that plan – this shouldn’t wait literally years for a new standard to be developed
and implemented.
6. NERC standards
development proceeds in geologic time – and because of that, a number of
important areas of cyber risk have never been addressed by CIP, since nobody
wants to go through the process of developing a new standard. For example,
where are the CIP standards that address ransomware, phishing, and APTs? These risks
have been around for at least a decade, yet a new standard has never even been
proposed for any of them, let alone written. And how long does it take for a
new standard to appear after the risk first appears? The risk caused by use of “visiting”
laptops on corporate networks has been well known since the late 1990s. When
did a requirement for “Transient Cyber Assets” take effect? 2017.
7. There needs to be a
representative body – with representatives from industry stakeholders, the
regulators, and perhaps even Congress and the general public – that meets maybe
twice a year to identify important new risks that have arisen, as well as to
identify risks that are no longer serious. If the body decides a new risk needs
to be addressed, a new standard should be created, the mechanics of which would
be exactly the same as in the other standards. Only the name of the risk and
the guidelines for the plan would differ from one standard to another. So no
new requirements need to be developed.
8. The unit of compliance
for the standards shouldn’t be a device (physical or virtual) – specifically, a
Cyber Asset, as it is in the current CIP standards. Instead, it should be a
system. As I have pointed
out several times before, putting BES Cyber Systems in the cloud (e.g.
outsourced SCADA) is now impossible, since doing so would put the entity in violation
of twenty or so CIP requirements – simply because they would never be able to
furnish the evidence required for compliance. Basing the standards on systems
(and a BCS is defined just as an aggregation of devices, nothing more) would
allow cloud-based systems, containers, and more to be in scope for CIP.
So I wish the new Energy Product
Reliability Organization good luck. I think having stakeholder involvement is
crucial to having successful cyber standards for critical infrastructure. But don’t
spoil it all by ignoring the lessons learned from the NERC CIP experience.
Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC. Nor are they necessarily shared by CISA’s Software Component Transparency Initiative, for which I volunteer as co-leader of the Energy SBOM Proof of Concept. If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com.
[i] Making the directive secret was ridiculous, and almost by itself guaranteed it would fail, as I discussed in my post on the directive. But just for good measure (and to make sure that there was no possibility at all that the directive would succeed), the TSA violated just about every other principle of good cybersecurity standards development. They're nothing if not thorough!
[ii] Rush
just recently announced his retirement, after 30 years representing Chicago’s 1st
Congressional District in the House. He served well and happens to be the only
person who ever beat Barack Obama in an election (Obama challenged him for his
House seat in 2000). He was a co-founder (with Fred Hampton) of the Illinois branch
of the Black Panther party in 1968.