Thursday, November 29, 2018

Implicit Requirements: the Remix



In October of 2015, I wrote a post about an RF CIP workshop I had just attended. Of course, Lew Folkerth of RF spoke at the workshop, and I summarized his presentation this way (I’ve edited it a little, not because Lew used language unfit for a family blog like this one, but because I said a few things that don’t make complete sense. Since I’m allegedly three years wiser at this point, I can now see the problems in what I wrote):

“Lew pointed out that there are a number of “implicit requirements” in CIP v5. These are things that the entity needs to do to be in compliance, which are not specifically identified in the actual Requirements. Lew gave the example of the implicit requirement to “(identify) any associated Protected Cyber Assets” (this requirement never appears at all in CIP v5, but the entity obviously needs to identify PCAs, since many of the requirements apply to PCAs). RF isn’t just looking for a list of the PCAs, but wants to know how the entity verified that every Cyber Asset in the ESP had been identified as either a component of a BES Cyber System or a PCA.

“Another example is identification of Electronic Access Control or Monitoring Systems (EACMS) and Physical Access Control Systems (PACS). The entity is never explicitly required to identify these, but they obviously have to do so to be in compliance - and they need to be able to show that they haven't missed any EACMS or PACS when they did their identifications.

“Of course, you can’t find out about these implicit requirements by reading the RSAWs, since they only deal with the explicitly-stated requirements (2018 note: Tom was being fairly careless here, as he too often is. He should have said something like ‘The RSAWs are constrained not to require anything other than what is in the strict language of the requirement’). A questioner asked Lew if RF would publish a list of the implicit requirements. Lew said he’d look into doing that. I certainly hope he does – it is greatly needed (another 2018 note: Lew would have liked to publish that list, but ended up not being allowed to because – of course! – the ERO, meaning NERC and the Regions, isn’t allowed to provide anything that could be considered an “interpretation” of a NERC requirement. This is a very sorry situation, and has led to repeated false starts by NERC in trying to provide guidance on CIP. The CIP people at NERC would just love to be able to provide CIP guidance, since the need for it has been great all along and continues unabated. But it simply isn’t allowed by the NERC Rules of Procedure, and literally every attempt by NERC staff members to provide guidance on CIP has ended up being withdrawn).”

Lew’s presentation was the first time I had really thought about the idea of “implicit requirements”, which is an excellent way to describe a lot of the ambiguities in the CIP v5 requirements (another might be “unstated assumptions”, but I like Lew’s term better because it emphasizes the need for the entity to document how they complied with these implicit requirements – since there is no way that an entity can properly comply with a CIP requirement itself without first complying with any implicit requirements it contains). In requirements developed since v5, this problem hasn’t repeated itself (at least not much). But since most of the v5 requirements are still in effect, we still have to deal with the problem.

I had kind of internalized the idea of implicit requirements and considered it part of the overall CIP landscape, as much as the non-implicit requirements – so I’ll admit I haven’t really thought about it much, and I never wrote a full post on the topic. But, as I mentioned in a post two weeks ago, I’ve recently been working with several entities that are fairly new to CIP compliance, and I realized I needed to initiate them into the dark secrets of implicit requirements – so that they would have a prayer of surviving in the cold world of NERC CIP audits. Here is how I put it to them (OK, I never actually said these exact words, but I consider myself as having said them. And that’s just as good as saying them, right?):

I have the greatest respect for the CIP v5 drafting team – after all, they started working in the fall of 2008 and didn’t finish their work until January of 2013, when they submitted CIP v5 to FERC (along the way, they’d also developed, and gotten approved, CIP versions 2, 3 and 4, along with a “version” in early 2010 that was too visionary for its time, but ended up mostly being incorporated into v5. That version was known as CIP-010 and CIP-011, but these standards had nothing to do with the current CIP-010 and -011).

However, in retrospect, the team did take some big shortcuts in drafting CIP v5, mainly because they were under huge pressure to get something approved by the NERC ballot body and out the door to FERC in 2012. This pressure was due to FERC’s having put the squeeze on them by rather impulsively approving CIP version 4 in April of 2012 – as discussed in this post (I regard approving CIP v4 as the worst mistake FERC has made regarding the NERC CIP standards. I discussed this in three posts in 2013, starting with this one). The drafting team most likely felt – and perhaps rightly so – that if they didn’t get v5 approved and on FERC’s desk in 2012, FERC was going to let CIP v4 come into effect on its 4/1/14 implementation date. And this would be followed two or three years later by the implementation of CIP v5, meaning NERC entities would have to go through two huge CIP transitions in 2-3 years. So they probably had no good option other than to take some shortcuts.

One of the biggest shortcuts the team took – again, looking back from 2018 - is that they didn’t bother to write explicit requirements for all of the steps that were needed to comply with some of the requirements that they drafted (I certainly didn’t think of this at the time the SDT was developing v5, even though I attended some of the drafting team meetings and participated in a number of the conversations). The effect of this is that a single requirement can contain five or more implicit requirements. You need to comply with all of the implicit requirements in order to comply with the requirement itself – but you need to figure them out entirely on your own, since none of the implicit requirements are actually written down anywhere.

Probably the most egregious example of implicit requirements can be found by examining CIP-002-5.1a R1. The “operational” parts of the requirement are R1.1-R1.3, which all mandate that the entity identify something called BES Cyber Systems. What’s a BES Cyber System? You go to the NERC definition, which just says it’s a bunch of BES Cyber Assets grouped together for “reliability purposes”. And what’s a BES Cyber Asset? That’s a very long definition, but it starts off with the term Cyber Asset. And what’s a Cyber Asset? Well, it’s a “Programmable electronic device...”, according to the NERC definition.

And what’s that? The words “electronic device” certainly have an accepted meaning, but the word “programmable” doesn’t. This leads to a problem that I discussed in a number of posts in 2014 and 2015, starting with this one: that there are a few words that are vital to understanding the CIP requirements, which have no NERC definition and no generally accepted meaning. For each of these words, the only solution – which it took me a long time to understand, although once again Lew Folkerth was ahead of me – is to do your best and come up with a reasonable definition of your own, then follow that consistently as you comply with the CIP standards.

So our attempt to comply with R1 and “identify” BES Cyber Systems has actually led us to three implicit requirements:

R0.1. Develop a definition of “programmable”.
R0.2. Identify Cyber Assets.
R0.4. Identify BES Cyber Assets, among the Cyber Assets identified in R0.2 (you’ll see why I numbered this as I did in a moment).
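The logical chain behind these implicit requirements can be sketched as a simple filtering pipeline. To be clear, everything in this sketch is a hypothetical illustration of the logic, not anything the standard or the NERC Glossary defines: the `Device` fields, the `programmable` flag, and the `impacts_bes_in_15_min` flag all stand in for determinations you would have to make and document yourself, using your own definitions.

```python
from dataclasses import dataclass

# Hypothetical illustration of the CIP-002 R1 logic. The flags below
# represent your own documented determinations, not anything NERC defines.

@dataclass
class Device:
    name: str
    programmable: bool           # per your documented definition (implicit R0.1)
    impacts_bes_in_15_min: bool  # per your documented definition of "impact the BES"

def identify_bes_cyber_assets(devices):
    # Implicit R0.2: Cyber Assets = programmable electronic devices
    cyber_assets = [d for d in devices if d.programmable]
    # Implicit R0.4: BES Cyber Assets = Cyber Assets with a 15-minute BES impact
    return [d for d in cyber_assets if d.impacts_bes_in_15_min]

inventory = [
    Device("relay-01", True, True),
    Device("hammer", False, True),      # "impacts" a Facility, but not programmable
    Device("cell-phone", True, False),  # programmable, but no real BES impact
]
print([d.name for d in identify_bes_cyber_assets(inventory)])  # ['relay-01']
```

The point of the sketch is the ordering: you can’t run the second filter until you’ve defined the terms the first one depends on, which is exactly why these are requirements, even if unstated.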

So are we finished with the implicit requirements in R1? No. Let’s look at the first part of the definition of BES Cyber Asset: “A Cyber Asset that if rendered unavailable, degraded, or misused would, within 15 minutes of its required operation, misoperation, or non-operation, adversely impact one or more Facilities, systems, or equipment, which, if destroyed, degraded, or otherwise rendered unavailable when needed, would affect the reliable operation of the Bulk Electric System.” Do you understand exactly what that means, so you’ll have no problem identifying a BES Cyber Asset if it walks in your door?

You might, but my guess is you’ll end up doing what a lot of people in the industry did: getting hung up on the phrase “…impact one or more Facilities, systems, or equipment…”. Doesn’t that strike you as a little broad? After all, if I go out to a substation and hit a transformer with a hammer, I will have impacted a BES Facility (the transformer meets the NERC definition of Facility). Does that make my hammer a BES Cyber Asset? No it doesn’t, because a BCA has to be a Cyber Asset, meaning it’s programmable. OK, let’s say I take out my cell phone and lightly tap the transformer, leaving a very small dent in it. The phone is definitely a programmable electronic device, and the definition doesn’t say “big impact”, just “impact”. This makes my cell phone a BCA, right? After all, it impacted (dented) a BES Facility within 15 minutes. Of course, the drafting team was thinking of “impact” in the sense of “electrical impact” – but they didn’t include that in the definition. Hence the ambiguity.

So one big question as people were trying to figure out CIP v5 was the question of the meaning of “impact the BES”. They needed this before they could identify BCAs, and they needed to identify BCAs before they could group them into BCS. How did they answer this question? Well, I offered my own helpful opinion on that, but it didn’t exactly gain universal acceptance and shouts of acclamation (I still like it, and in fact I think it is the “definition” that most NERC entities have implicitly used in deciding what’s a BCA).

The fact is that, just as in the question on the meaning of programmable, there is still no answer from NERC on what “impact the BES” means. Both of these questions are officially on the plate of the current CIP Modifications drafting team, but that group isn’t going to directly answer either one of them – I’m 100% sure of that. And I support them 100% in this particular non-action, since there is simply no way to develop a real Webster’s-style definition of either term.

On the other hand, the drafting team will neatly deal with both issues as part of their proposed changes to address virtualization. First, they’re going to bypass altogether the need to keep using “programmable” by relegating the term Cyber Asset to a very unimportant role (essentially, the term will only be used as part of the definition of Transient Cyber Asset. My, how the great have fallen!). Second, they are retiring the term BES Cyber Asset altogether, and the new definition of BES Cyber System incorporates language that avoids the problem posed by the phrase “impact the BES” – in fact, in some ways it follows the “definition” I came up with in 2015, although I’m not in any way claiming credit for this. Pretty cool, huh?[i]

All of this means there’s another implicit requirement buried in CIP-002 R1:

R0.3. Develop a definition of “impact the BES”.

So we have, without breaking a sweat, come up with four implicit requirements buried in CIP-002 R1 – and we’re still not done! When we get to CIP-002 Attachment 1 (which of course is part of R1), there are a whole host of implicit requirements (I honestly guess there could be as many as 50 or 100, especially when you start looking at some of the bright-line criteria, which aren’t bright at all but are loaded with unstated assumptions – i.e. implicit requirements. Maybe I’ll take a month off sometime and try to enumerate them all, just out of sheer perversity).

CIP-002 R1 is probably the CIP requirement with the most implicit requirements buried in it. But CIP-005 R1 can certainly give it a run for its money, with CIP-010 R1 a strong contender for “show”. I will leave those two as an exercise for the reader.

Let’s get back to reality (I enjoy doing that every now and then – breaks my routine). What does the fact that there are so many implicit requirements buried in the “actual” CIP requirements mean for a NERC entity? The main impact (so to speak) is that your RSAW compliance narrative for a particular requirement should go through all of the logical steps that are actually needed to comply with the requirement – if you do that, you’ll “discover” all of the implicit requirements, even though you probably won’t recognize them as such. For example, your CIP-002 R1 narrative could start with a list of the steps we went through above, although probably in reverse order.

And what will happen to you if you don’t do this? Will you get a violation? You can’t get a violation, since implicit requirements aren’t stated in the requirement itself. But your auditor isn’t just looking to nail you for violations; they also want to see whether you really understood what you were doing when you complied with a requirement. So if you just say that you “identified” your BES Cyber Systems in your CIP-002 R1 narrative, your auditor might ask “And how did you do that?” If you answer, “I dunno, I thought this computer had a 15-minute BES impact – and besides, I like its metallic gray color”, you won’t get a Potential non-Compliance finding, but your auditor may well give you an Area of Concern, then tell you to think about how you should identify your BCS and re-do the RSAW narrative to include all the steps you need to take (implied or not). Then you’ll need to go back and make sure you actually identified all of your BCS properly.

And if you discover you actually missed one or two BCS? You would now have to comply for them, of course. But I don’t see any way you can get a violation for that, since – of course – implicit requirements aren’t stated in a requirement!


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post – especially on compliance with CIP-013; we also work with security product or service vendors that need help articulating their message to the power industry. To discuss this, you can email me at the same address.

[i] The SDT has released for comment a new set of revised standards and definitions that addresses virtualization, and I will discuss these questions more when I discuss that new release.  You can get a preview of what I’ll say from this post.

Tuesday, November 20, 2018

A Great Supply Chain Security Forum



In this post from early October, I mentioned (toward the end of the post) a conference I’d just attended outside of Washington DC, called the Software and Supply Chain Assurance Forum (SSCA for short). It’s a free quarterly two-day conference sponsored by NIST, the Department of Defense, the Department of Homeland Security and the General Services Administration (GSA). It is organized by MITRE Corporation and is always held at MITRE’s offices in McLean, Virginia. I intended to write a post soon thereafter providing all the details you need to get on their mailing list, both in order to hear about future conferences and for general information about supply chain security.

Well, “soon” has proved to be a month and a half, as I got diverted by other things, especially FERC’s approval of CIP-013. But yesterday I received an email announcing their next forum December 18-19, and I decided I really need to write this post now, since some of you may want to attend it (which I can’t do this time, although I will try to attend as often as possible in the future. BTW, there’s no webcast or recording – you have to attend in person).

I asked Bob Martin of MITRE, one of the organizers, to provide a short description of the forum and its history. To be honest, I thought this was probably some effort that had just started a few years ago, when I first heard a lot of talk about supply chain security. As you’ll see, I was quite wrong on that. Face it: the Feds were doing supply chain security long before a lot of us even heard of it. Here’s what Bob said:

“The Software and Supply Chain Assurance Forum started in 2004 as the Software Assurance Forum and Working Groups, a free, DoD-focused public engagement effort to bring together the community on the myriad issues surrounding Software Assurance. The following year it became a joint effort of DHS and DoD, with a continued focus on engagement with industry, academia, and government, and was re-established as part of the Cross-Sector Cyber Security Working Group (CSCSWG) under auspices of the Critical Infrastructure Partnership Advisory Council (CIPAC).

“During the first decade, the Forum had an emphasis on working groups to accumulate and promulgate best practices on the Software Assurance aspects of Technology, Processes, Education, and Acquisition. In 2008, the co-sponsoring partnership expanded to include NIST. Then in 2014, GSA joined as the fourth co-sponsor and the Forum refocused to more directly include Supply Chain issues, including renaming itself the Software and Supply Chain Assurance (SSCA) Forum.”

I do want to point out that, although the SSCA clearly started out with almost all Federal government employees and contractors as members, it now includes a significant industry contingent. And the discussions aren’t usually on specific government topics, but supply chain security in general. Here are the topics for the December meeting: the DHS Task Force on supply chain security, software security, software testing, software identification, software certification, international supply chain risk management efforts, updates to US Federal Government policies, and more.

You can sign up for the December meeting here. If you’re neither a US citizen nor a green card holder, you need to sign up by 12/4. If you’re a US citizen or green card holder, you have until 12/11. And if you want to get on the mailing list, drop an email to Bob Martin at ramartin@mitre.org.

Here are some interesting things I learned from the September meeting. They’re just in the order of the presentations where they were stated. Note that, even when I use quotation marks, I can’t vouch that this is a verbatim transcription of what was said.  

NIST:
CSF v1.1 has a supply chain focus.
800-53 Rev. 5 has a map to the CSF, as well as to 800-161 (the supply chain security framework).
800-37 addresses Enterprise Risk Management.

The Open Group:
The O-TTPS cyber security standard is for suppliers. It’s available as a certification.

DoE:
The CITRICS program is testing ICS components for security.

FERC:
Rhonda Dunfee of FERC pointed out that CIP-010 R1.6 just deals with software security patches, not firmware (even though CIP-013 R1.2.5 can be interpreted as applying to firmware as well as software patches).

Unknown:
“Many organizations fear the auditor more than the attacker.” (how true!)
“Software quality defects are really security defects, although the converse is not necessarily true. Quality and security defects are usually dealt with in separate efforts, although it would be better if that weren’t true.”
“If you talk to a CEO about risk, they’ll assure you they have that covered. What they usually mean is financial risk. If you talk to them specifically about operational risk (including cyber), they’ll give you a blank stare.”

Edna Conway of Cisco (a well-known supply chain security expert, and fun to listen to):
“Quality and security aren’t achieved through contract language….If you are dealing with your vendor in a courtroom, you’ve failed.” (how true that is! For my first post on this idea – there will be further posts, I can assure you – go here. There is a follow-up to that post here)

Office of the Director of National Intelligence (this was the best presentation, IMHO):
“The hard part in supply chain security is getting all the right people together, not the actual measures.”
A good tabletop exercise: What are factors that would make you say no or yes to a new business relationship?
“When it comes to SCS, there is no ‘easy’ button: ‘Just tell me what I need to do.’”
The Kaspersky threat has been known for a long time. The ODNI talked with supply chain people and warned them of this. But at the time they didn’t have evidence of malfeasance, so many government organizations (and private ones) signed up with Kaspersky anyway.
You can’t trade off advantages in cost or schedule for security performance.
SCS can’t be a compliance checklist approach; there must be risk management. (how true this is! This is the point of the multiple posts – such as this one - that I’ve written on why CIP-013 R1.1 is the heart of the standard, while R1.2 is just a sideshow, although a mandatory one)
4 pillars of supply chain: cost, schedule, performance and security
“Security should be like quality. It should be expected all of the time.”
“It is not at all inevitable that added security increases cost.”

Howard Gugel, NERC (who spoke on CIP-013):
(in response to a question of whether CIP-013 addresses security risks from custom-developed software) “Those risks are covered in other CIP requirements, notably CIP-010 R3, the Vulnerability Assessment requirement.” I disagree with this, since having an every-15-month VA isn’t the same as having regular patches from a vendor, as new vulnerabilities are identified. This is one of the many risks that should be addressed – for those entities to which it’s applicable – in R1.1. But I haven’t heard NERC say much at all about R1.1 – the focus in everything I’ve heard has been R1.2, and on using contract language to comply with R1.2. IMHO, both of these emphases miss the whole point of the standard.

NCCIC (DHS):
Someone from NCCIC discussed the Russia attacks (about which I wrote a whole bunch of posts this summer, starting with this one), and pointed out that most of the entities compromised were small. This is all well and good, but that certainly wasn’t the tenor of the presentations they gave, even though perhaps they never explicitly said otherwise. It sounded like major utilities had been compromised, and it was just going to take one nod from Vladimir Putin to send the whole US into darkness. That was certainly the tenor of the major news articles, and I haven’t heard of DHS issuing any public (or private) apology for misleading the press on this matter.

This person also seemed to make yet another attempt to make sense of those contradictory briefings – as well as what was put out on the subject early this year – by saying that DHS just didn’t understand the difference between “vendors” and “utilities”. I believe we’re now on at least Version 5 of the ongoing saga “What DHS really meant in those briefings”. Unfortunately, this latest explanation is no more consistent with the actual statements than are any of the previous ones. I hope to do a post, summarizing what I understand of this sorry tale, in December.


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post – especially on compliance with CIP-013. We also work with security product or service vendors that need help articulating their message to the power industry. To discuss this, you can email me at the same address.

Thursday, November 15, 2018

Stop Making Sense: the Remix



I have been working this year with two organizations that are getting ready for CIP compliance at the Medium impact level, as well as a Canadian organization with Highs, Mediums and Lows, that only had to comply with CIP starting in 2017. These discussions have brought to my mind a number of issues that I addressed in posts and discussed with various entities, as the whole industry was getting ready for CIP v5 compliance in 2014-2016.

A very small number of these issues were actually resolved by revising the standards (although I really can’t think of any issue that was completely resolved in this way. Since changing one of the CIP standards is easily a 3-5 year process from initial idea to having an enforceable standard, it’s not surprising that this happens very rarely). A number of these issues (especially the various problems with CIP-002 R1 and Attachment 1, about which I wrote at least 50 posts, starting with this one) are no longer burning issues, since the auditors and the NERC entities came to a rough consensus on how to resolve them (even though it’s not supported in the requirements as written).

However, there are some issues (like the cloud) on which there hasn’t been any resolution of any sort. The only reason why you don’t hear complaints every day about these issues (from me as well as from others involved with CIP compliance) is that it’s quite clear that nothing can be done about them in the near term, except to do your best to work around them. In fact, I’m sure a lot of CIP compliance people have probably forgotten that these even were issues in the first place. They’re simply part of the landscape, like a mountain. If you’re going from point A to point B and there’s a mountain in between, you’re not going to spend a lot of time complaining about the fact that the mountain is there – you’ll just allow for that and make a big detour when you come near it. But the cost of having to make that detour is very real.

In my discussions with these three entities, they of course have questions on what particular requirements or definitions mean, and naturally try their best to make sense of them. For example, they might say “Surely CIP-007 R2 doesn’t require that we have to check every 35 days, for every piece of software installed within the ESP, to find out if the vendor has released a new security patch! What if the vendor has never released a security patch in ten years? Or what if the software involved is fairly insignificant and has minimal impact on the BES? It only makes sense that we should be able to devote our resources and time to what poses the most risk to the BES.”

To which I will (sagely) reply “I totally agree it makes sense that you should be able to look at risk in how you comply with this requirement (and all of the other CIP requirements as well). But what makes sense has nothing to do with CIP compliance. The only thing that matters is the strict wording of the requirement. If something is allowed by the wording, that’s what you do. If something isn’t allowed, you don’t do it. And if the wording is ambiguous (which it is very often, of course), well…that’s for another discussion.”
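And what does literal compliance with that wording actually mean in practice? The bookkeeping is mechanical enough to sketch in a few lines. This is purely a hypothetical illustration of the 35-day clock: the patch-source names and the data structure are my own invention, not anything the requirement prescribes.

```python
from datetime import date, timedelta

# Hypothetical sketch of the bookkeeping a literal reading of CIP-007 R2
# implies: every patch source gets checked at least once every 35 calendar
# days, regardless of how significant the software is or whether the vendor
# has ever released a patch.

CHECK_INTERVAL = timedelta(days=35)

def checks_due(last_checked, today):
    """Return the patch sources whose 35-day clock has run out."""
    return sorted(src for src, checked in last_checked.items()
                  if today - checked >= CHECK_INTERVAL)

last_checked = {
    "scada-hmi-vendor": date(2018, 10, 1),
    "historian-vendor": date(2018, 11, 10),
    "obscure-utility":  date(2018, 9, 15),  # no patch in ten years - still counts
}
print(checks_due(last_checked, date(2018, 11, 15)))
# ['obscure-utility', 'scada-hmi-vendor']
```

Note that the function has no notion of risk or significance: the "obscure-utility" entry comes due on exactly the same clock as the SCADA HMI software, which is precisely the point of the complaint above.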

I wrote a post about exactly this issue in 2016. I just reread it, and it’s just as relevant now as it was then (in fact, the question I discussed in the post was about the cloud. I agreed then that it would absolutely make sense if NERC entities were free to entrust their BCSI, and their BCS, too, to cloud providers that had demonstrably good cyber security practices – but of course this just wasn’t allowed by CIP. If I’m not mistaken, this is as much an open issue today as it was in 2016, in fact much more so).

However, I’m pleased to say that there’s been some slow progress on addressing this issue. I pointed out in the post that the reason why NERC entities can’t apply reason very often in CIP compliance is that most of the CIP requirements are prescriptive. This is still the case, but it’s also a fact that every significant new CIP requirement developed since CIP v5 has been what I call plan-based, and what others call objective-based: That is, the requirement mandates that the entity develop a plan to achieve a particular objective. The means of achieving it are up to the entity, as long as what you propose in your plan is – are you ready for this? – reasonable. Isn’t that amazing? Not only can you use reason in developing your plan – that is the criterion by which the plan will be judged!

These requirements take different forms, but fortunately they all come down to developing a plan to achieve an objective. The current plan-based requirements are (in order of their development):

CIP-007 R3, Anti-Malware: This was developed with CIP v5. It doesn’t specifically require a plan, but it does definitely state an objective: mitigating the risk posed by malware. You have to go through a planning process to achieve that, since you need to do this for each BES Cyber System, EACMS, PACS and PCA, even though the appropriate means of risk mitigation may be different for different devices.

CIP-011 R1, Information Protection: This was also developed with v5. It requires you to develop an information protection program (same as a plan, as far as I’m concerned) for protecting BES Cyber System Information in storage, transit, and use.

CIP-010 R4, Transient Cyber Assets and Removable Media: This was developed with v6.  The entity must develop a plan to mitigate the risks caused by use of TCAs and RM within the ESP.

CIP-003 R2, Assets containing Low impact BCS: In the v7 version due for compliance on 1/1/20, the entity needs to develop a plan for protecting assets containing Low impact BCS in five different areas.

In addition, there’s CIP-013, which is due for compliance on 7/1/20.  It is the first CIP standard (in fact, probably the first NERC standard, period) that explicitly allows consideration of risk. It does this because FERC knew, when they wrote Order 829 in 2016, that there is no way for a utility to cost-effectively mitigate supply chain security risks if they don’t develop a plan that is based on risk.[i] CIP-012, currently being balloted, is also plan-based.

And this isn’t the end. The CIP Modifications Standards Drafting Team is working on (and currently has out for comment) revised CIP standards and definitions to officially permit CIP compliance for virtualized systems (as opposed to NERC’s current “Don’t ask, don’t tell” policy for virtualization, as well as any other area where the current standards are either ambiguous or just don’t reflect reality).

One key component of this effort is rewriting the most prescriptive requirements, like CIP-007 R2 and CIP-010 R1, as objective-based. While this is being done specifically to address virtualization, I am sure all NERC entities will welcome this development, regardless of whether they have virtualized systems in their OT environment or not. That's the good news. What's the bad news? The changes the SDT is talking about now are years away from coming into effect. In the meantime, everyone still has the prescriptive requirements they love so much.


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post, including compliance with CIP-013; we also work with security product or service vendors that need help articulating their message to the power industry. To discuss my services, please email me at the same address so we can set up a time to talk.



[i] And I would take this one step further: there’s no way that a utility can cost-effectively mitigate any security risks except by developing a plan that is based on risk. So what about all the risks addressed by the current prescriptive CIP requirements? They are being effectively mitigated now, but at a much higher cost than necessary, simply because the prescriptive requirements don’t allow for any consideration of risk. More importantly, there are a number of major risks – phishing, ransomware, APTs, etc. – that aren’t addressed by CIP at all now, and there’s no appetite on FERC’s or NERC’s part to go through the long and painful standards development process to incorporate them into CIP. Suppose all of the current prescriptive CIP requirements were replaced by a single requirement mandating that the utility develop a plan to mitigate each of the risks currently addressed by the prescriptive requirements, as well as other risks not addressed at all now. The utility could then decide how to allocate its resources, based on risk, so that the maximum amount of total cyber risk was mitigated with the dollars available to spend. In my opinion, this would be the ultimate solution to CIP’s problems.

Friday, November 2, 2018

Hoover Dam

I must admit that Las Vegas isn’t my favorite city. In fact, I don’t think it makes my top 100 or even top 500. That’s why my heart sank when I heard last year that this year’s GridSecCon would be there. I was especially disappointed because all the other GridSecCons I’ve attended have been in (or near) old historic cities with nice architecture to look at: St. Paul, MN last year; Quebec City in 2016; Philadelphia in 2015; San Antonio in 2014; and Jacksonville, FL in 2013 (OK, Jacksonville itself doesn’t count as old and historic, but it’s not far from St. Augustine, which is definitely both!).

However, there was one really good thing about Las Vegas: it’s very near Hoover Dam, which I’d never seen. And since a tour of the dam was scheduled for Friday (there are always tours on the Friday morning of the conference, to power plants, substations, etc.), I signed up early.

The great thing about this tour was that it was for power industry insiders and was set up by WECC, which co-ran the conference and, of course, audits the US Bureau of Reclamation, which operates the dam. This proved very fortunate in three ways:

  1. We got a good briefing on the dam before our tour, and during the tour we had two longtime workers who encouraged us to ask “any dam question you want”. We heard a lot of interesting stories about the construction, including the poignant fact that, of the 112 deaths associated with it, the first and the last were a father and his son.
  2. You have to take an elevator down into the dam for the tour. The first elevators take you to the level where all the generators are, and we could walk among them – quite neat, of course. But there’s a second, smaller set of elevators that goes down to the waterline level, where you can walk out along the base of the dam. That was very cool indeed. This second level isn’t available on public tours.
  3. There are two elevators from the top of the dam. One is for the public tours, and it looked like it had an hour’s wait when we were there. The other is for the workers; we went on that one, thus saving a whole hour!

It’s obviously quite an engineering marvel. I love the throughput figure we were quoted: 92,000 gallons per second go through each half of the dam (there are two identical halves, one in Nevada and one in Arizona, since the state line runs down the middle of the river), meaning the total is about 184,000 gallons per second!
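For fun, the arithmetic behind that figure can be checked in a few lines. The 92,000 gallons-per-second number is simply what we were quoted on the tour, so treat it as an approximation rather than an official statistic:

```python
# Sanity check of the Hoover Dam throughput figure quoted on the tour.
# Assumption: roughly 92,000 gallons per second through each of the dam's
# two identical halves (the Nevada and Arizona sides).

GALLONS_PER_SECOND_PER_HALF = 92_000
HALVES = 2
SECONDS_PER_DAY = 60 * 60 * 24

# Total flow across both halves, per second and per day
total_per_second = GALLONS_PER_SECOND_PER_HALF * HALVES
total_per_day = total_per_second * SECONDS_PER_DAY

print(f"Total throughput: {total_per_second:,} gallons/second")
print(f"That works out to about {total_per_day / 1e9:.1f} billion gallons/day")
```

Running this confirms the 184,000 gallons-per-second total, and shows it amounts to nearly 16 billion gallons in a day, at the quoted rate.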

But what I found most impressive was how much the people we talked to believed in what the dam was meant to do. Of course, irrigation and flood control were its biggest purposes, but the second was bringing electricity to millions of people, many of whom hadn’t had it before. And a third purpose is seen in its construction period: 1931-1936, the depths of the Depression. It gave jobs to many thousands of people (almost all men, I’m sure, and virtually no minorities; there were limits to the idea of progressivism back then). In an era when government is often seen as at best ineffective and at worst destructive, this is a government project that can truly be said to have done a huge amount of good. And I suspect there are one or two more like it…

I want to thank Brandy Daniels of WECC, who organized the tour and provided the group picture.

