Wednesday, February 27, 2019

Here’s your big chance!



I’ve been saying for a while that the biggest flaw in CIP-013 is that it doesn’t provide a list of risks (or threats, as I prefer to call them, following NIST 800-30) that need to be addressed in the supply chain cyber security risk management plan required by R1.1. Because no list is provided – beyond a very high-level enumeration of three or four types of risk that must be addressed – it is up to each entity to decide a) which supply chain threats are most important to the electric utility industry, and b) which of those pose the most risk to their own BES Cyber Systems. Their plan needs to describe how they identified the highest risks, and how they will mitigate them.

The problem is that it isn’t an easy task to look through the literature and identify the most important risks the industry faces. Larger utilities might have the staff to do this type of work, but smaller utilities definitely don’t. This is why I said a year and a half ago, in relation to CIP-013 R3, that there should be an industry body tasked with identifying threats that NERC entities would consider in developing their CIP-013 plans. This body would publish periodically (at least annually) a list of cybersecurity threats to the power industry (and specifically to BES Cyber Systems). NERC entities would annually determine which of these threats posed significant risk to their BCS, and would then develop a supply chain cyber security risk management plan to mitigate those risks.

So if an industry body developed a list of supply chain cyber security threats (or risks, to use CIP-013’s term) that are important to the electric power industry, this list could well provide the starting point for the supply chain cyber security risk management plan required by CIP-013 R1.1 (which I’ll call the “CIP-013 plan” from now on). This definitely doesn’t mean that NERC entities would have to mitigate all of the threats on the list, since a) some won’t apply to some utilities and b) some will be determined to carry low enough risk to a particular utility that no mitigation is called for.

Of course, a utility with an unlimited budget for supply chain cyber security risk mitigation can simply mitigate every threat on the list. But utilities that don’t have an unlimited budget – which is to say all of them – have to spend the budget they do have where it will do the most good: they need to identify the threats that pose the highest risk, and develop a plan to mitigate those threats. In fact, I think that all of the CIP standards should work this way.
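To make that concrete, here is a minimal sketch – my own illustration, not anything prescribed by CIP-013 or NERC – of the kind of ranking exercise I have in mind: score each threat on estimated likelihood and impact, rank by the product, and put the mitigation budget against the top of the list first. The threat names and 1-5 scores below are purely hypothetical.

# Minimal risk-ranking sketch (illustrative only; not a CIP-013 methodology).
# Threat names and the 1-5 scores are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    likelihood: int  # 1 (rare) to 5 (almost certain), estimated by the entity
    impact: int      # 1 (negligible) to 5 (severe), estimated by the entity

    @property
    def risk_score(self) -> int:
        # Simple risk = likelihood x impact; an entity may weight these differently.
        return self.likelihood * self.impact

threats = [
    Threat("Compromised vendor remote access", likelihood=4, impact=5),
    Threat("Malicious code in a vendor patch", likelihood=3, impact=5),
    Threat("Counterfeit hardware in the spares channel", likelihood=2, impact=4),
    Threat("Tampering during shipping/delivery", likelihood=2, impact=3),
]

# Rank threats from highest to lowest risk; spend the mitigation budget top-down.
for t in sorted(threats, key=lambda t: t.risk_score, reverse=True):
    print(f"{t.risk_score:>2}  {t.name}")

The point isn’t the arithmetic, which any spreadsheet can do; it’s that the entity, not the standard, has to supply the threat list and the likelihood and impact estimates – and that is exactly the work a smaller utility is poorly equipped to do on its own.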

However, creating an official body to identify threats for NERC entities to consider in their CIP-013 plans, along with the required changes to CIP-013 itself, would take years to accomplish. This does nothing to solve the current problem of complying with CIP-013-1.

So if there isn’t going to be an industry body that officially tells NERC entities what threats they should consider in developing their CIP-013 plans, what’s plan B? In the middle of last year, I (and a few others) started talking about an existing industry group (i.e. not part of NERC) doing that – or rather, industry groups, since more than one would likely be involved. I was hoping that the trade associations could be convinced to identify the most important supply chain cyber security threats their members were likely to face. Of course, no NERC entity would be compelled even to read the threat list put out by their trade organization, let alone act on it. But for many (and probably most) entities, it would be a big help to have a list that gives them a good start on developing their CIP-013 plans. As it stands now, there is no obvious starting point for them.

I’m not discussing some sort of theoretical problem here. I’m now working with two NERC entities on long-term projects to implement CIP-013 compliance, and have started both projects with workshops where we discuss the issues and the best way to proceed. It has quickly become apparent that the two biggest tasks – which the entity has to take ultimate responsibility for – are to a) identify the threats they will consider in their CIP-013 plan; and b) estimate the risk that each threat poses to their environment, so they can focus their mitigation efforts on the threats that pose the highest risk. Both of these entities (one large and one medium-sized) would benefit from having an industry group that considers supply chain security threats and lets the industry know which ones it deems important for NERC entities to consider for mitigation in their CIP-013 plans. However, until a few days ago I saw no hope that any industry body might be willing to take up this task.

What changed my mind was learning earlier this week that a new group I have been nominally part of for more than a month (although I’ve only been able to attend one of their phone meetings so far) seems to see the need to identify important supply chain threats, and is going to start that effort next Wednesday. This is the NERC Supply Chain Working Group, which is “chartered” by the NERC CIPC (the industry group that oversees all NERC cyber security activities; the CIPC’s duties include following – but not having any direct role in – the development of new CIP standards and requirements).

In their agenda for their first onsite meeting under their new chairman, Tony Eddleman of the Nebraska Public Power District, the SCWG lists five papers they want to write, each one focused on a particular area of supply chain security:

  ii. Considerations for secure hardware delivery
  iii. Considerations for establishing provenance of systems and components
  iv. Considerations for threat-informed procurement language
  v. Considerations for supply chain risk management lifecycle (assessments & reassessments, external dependencies, concluding supplier relationships)
  vi. Considerations for unsupported or open-source technology

Note: I deliberately omitted item i on the team’s list, which is “Supply Chain risks related to cloud service providers”. While this is an important topic for NERC entities nowadays, I don’t consider it a CIP-013-related task. CIP-013 currently applies only to BES Cyber Systems, and the CIP standards effectively forbid entities from implementing actual BES Cyber Systems – for example, outsourced SCADA – in the cloud (at least for Medium and High impact BCS; there’s currently nothing to prevent entities from implementing Low impact BCS in the cloud). However, a growing number of NERC entities are storing information on BCS (BCSI) in the cloud, as part of outsourced services like configuration management. But CIP-013 doesn’t apply to BCSI.

Of course, these are only five areas within the universe of supply chain security threats; I’m sure that a complete overview would require ten or more additional papers. But all five are difficult topics, and I commend the group for taking them on; maybe they can be persuaded to tackle the others later.

But this is where the title of this post comes in. The SCWG is open to all, whether or not you work for a NERC asset owner (you do have to be a user of electricity, but I’m not sure how you’re reading this post if you’re not!). Their meeting next week is in Pittsburgh, in conjunction with the CIPC’s quarterly meeting there, but there will also be a webinar for those who can’t be there in person (I unfortunately can’t do either, since that is the day I’m participating in a panel at the RSA Security Conference whose topic is…what else?...supply chain security for the energy industry).

Whether or not you can attend next week’s meeting, if you would like to participate in the SCWG and have a hand in writing one or more of these papers, drop an email to Tony at tdeddle@nppd.com. Fame and fortune surely await you[i]!


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post – especially on compliance with CIP-013. To discuss this, you can email me at the same address.

[i] However, I’m not liable if you don’t earn fame or fortune from this. What you will earn is a good feeling that comes from helping a) the industry and b) your own organization as they address the issue of supply chain security, which I believe is easily the biggest worldwide cyber security threat of our time.

Sunday, February 17, 2019

Are you going to RSA?



If you’re attending the RSA Security Conference 2019 in San Francisco in 3 weeks, I want to remind you that I’ll be participating in a panel titled “Supply Chain Security for Critical Energy Infrastructure” at 8:00 AM on Wednesday, March 6. The panel I was on at last year’s conference sparked a good discussion (which you can listen to here). Since the panelists are the same (only our moderator is different; this year it’s Sharla Artz of UTC), I think it will again be quite interesting.

I just searched the conference web site for other events on supply chain security and the electric power industry. Our panel is one of the few addressing the power industry, but there are a number of other panels and presentations that touch on supply chain security in one way or another. So you might justify coming to this year’s conference by the fact that you’ll get a leading-edge update on this topic.

BTW, if you’re thinking of coming but are despairing of finding a hotel room, I want to point out that there are still a lot of Airbnbs available, at prices that don’t seem much different from what they were a couple of months ago. Not only will you save a LOT of money, but you just might end up staying in a part of town you would find interesting (there are a lot of those in San Francisco, of course). For example, I got a great Airbnb for last year’s conference that was a couple of blocks from Golden Gate Park. I got hooked on running there every morning – including through a redwood grove and the wonderful gardens. So this year I’m staying near there again. Can’t wait!


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post – especially on compliance with CIP-013; we also work with security product or service vendors that need help articulating their message to the power industry. To discuss this, you can email me at the same address.

Sunday, February 10, 2019

“Curiouser and curiouser, cried Alice…”


Imagine what might happen if the following news was announced:

“The Ukraine State Intelligence Service stated in its just-released Worldwide Threat Assessment that Moscow is now staging cyberattack assets to allow it to disrupt or damage the Ukraine’s civilian and military infrastructure during a crisis.

“It specifically noted the Russian planting of malware in the Ukraine electricity grid. Russia already has the ability to bring the grid down “for at least a few hours,” the assessment concluded, but is ‘mapping our critical infrastructure with the long-term goal of being able to cause substantial damage.’”

And what if this news came only a few weeks after a Wall Street Journal article quoted the Technical Director of Security Response of Symantec Corp. as saying “…about two dozen Ukrainian utilities were breached. Hackers penetrated far enough to reach the industrial-control systems at eight or more utilities”?

Don’t you think this would cause a big stir? After all, in 2015, when the Russians staged a successful attack on three Ukrainian distribution utilities, causing about a five-hour outage that affected hundreds of thousands of people, the news hit the US power industry like a thunderclap. Top security professionals from the Department of Homeland Security, the NERC E-ISAC, SANS, DoE and other organizations immediately jumped on planes and headed to the Ukraine to investigate this. DHS held briefings in many American cities. Reports were published detailing what had happened down to the minute.

This was considered a watershed for the power industry worldwide (the first reported loss of load due to a cyber attack), and – while many industry observers gloated that the Russians could never be so successful in the US, due to much stronger cyber security controls here and also due to the NERC CIP standards! – many others weren’t so sure, and said the Ukraine situation was more a case of “There but for the grace of God go I.”

Yet the 2015 attacks were on just three distribution utilities. Since the attacks described above breached two dozen utilities and penetrated the control systems of eight of those, it’s a very good assumption that malware was planted that could lead to a far more serious outage. Don’t you think there would be a much bigger response to these new reports? More specifically, don’t you think there would be another big investigation, for two reasons? First, out of simple goodwill toward the Ukrainian people, since they face a huge and ruthless foe? And second, out of concern that whatever attacks the Russians are conducting in the Ukraine are tests for attacks they could use on power grids worldwide?

At this point, you’re supposed to say “I would certainly think so!” And I agree with you 100%.

Well, the quotes above were actually published, the first in the Times and the second in the Journal. But there were a couple small differences between what I’ve quoted above and the actual quotes. One is that the country in question was the US, not the Ukraine. The other is that the agencies that wrote the 2019 Worldwide Threat Assessment were the FBI and CIA. I wrote about the NYT article in this post and the WSJ article in this one.

Yet where is the outrage? Where are the frenzied press releases and briefings? And where are all of the investigators rushing to find out what happened? Does anyone know where they are? I hope we don’t have to put them on milk cartons.

Let’s be clear. The Times quoted the 2019 Worldwide Threat Assessment put out by the FBI and CIA as saying

  • Moscow is now staging “cyberattack assets” (which presumably include malware) to allow it to disrupt or damage our civilian and military infrastructure during a crisis.
  • Malware has been implanted in the US grid that could be used today to cause outages.
  • Perhaps most ominously, Russia is mapping our critical infrastructure with the long-term goal of being able to cause substantial damage.

At the same time, Symantec, which has collaborated with DHS in investigating the Russian attacks in the US, is saying very specifically that at least eight US utilities have been penetrated at the control system level, meaning malware is almost certainly planted in all of them. Hopefully the eight utilities don’t include Southern Cal Edison, PG&E, ConEd, Commonwealth Edison, CenterPoint and other utilities serving major metropolitan areas. But even if they’re all small distribution-only coops in the middle of North Dakota, eight US utility control networks penetrated is still eight more than are known to have been penetrated previously. And as we know, utility control centers are by their very nature connected to other utility control centers as well as to Regional Transmission Organizations like PJM. The infection might very well spread.

Here’s another quotation from the January WSJ article: “In briefings to utilities last summer, Jonathan Homer, industrial-control systems cybersecurity chief for (the Department of) Homeland Security, said the Russians had penetrated the control-system area of utilities through poorly protected jump boxes. The attackers had ‘legitimate access, the same as a technician,’ he said in one briefing, and were positioned to take actions that could have temporarily knocked out power.” Again, Mr. Homer wasn’t saying that outages were caused, but the fact that the Russians were “positioned” to do that almost certainly means they’ve planted malware in control systems operated by at least two utilities (since he used the plural).

Of course, none of these reports should just be taken at face value. Some of the people quoted may not have fully understood what they were saying; e.g. they may have meant “small generating plants” when they said “utilities”, etc. And I don’t know what kind of power expertise the FBI and CIA have, but it’s possible they may be misinterpreting data they’ve received. So there’s reason to be skeptical of these reports.

But here’s an idea: If we’re skeptical of these reports, why don’t we…you know…investigate them to determine whether they’re accurate or mistaken? Yet I’ve heard literally nothing about any investigation. Nor have I heard the slightest bit of outrage expressed – by the Federal government, the power industry, you name it – that the Russians are taking such deliberate steps to potentially cripple the US economy and our military capabilities. And DHS has amply documented that they are taking those steps, whether or not they’ve actually penetrated control networks. They’re trying really hard.

This lack of a response is more than passing strange. I would very much like to see one (or more) of the following organizations investigate this (they’re not in any particular order):

  1. The NERC E-ISAC
  2. FERC
  3. Idaho National Lab
  4. SANS
  5. DoE’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER)
  6. Dragos, Inc. (who did a great job of investigating the malware used in the second Ukraine attacks, and due to that and other smart moves has become almost an ICS security institution, much to their credit)
  7. Hercule Poirot
  8. James Bond
  9. Judge Judy
  10. Sam Spade

In other words, I would like to see somebody get to the bottom of this and let us know what happened. And of course, if it turns out that malware has actually been implanted, wouldn’t it be kind of a good idea to…you know…let utilities know about it – so their cyber staff might just mosey over to their control systems, to see if the malware might be sitting there, too? Why would they want to do this, you ask? Well, curiosity for one reason – it would certainly be interesting to know if your employer was a member of the first group of US utilities ever to be breached at the control system level. But also – and this might sound silly to you – it did occur to me that utilities might actually want to remove malware that’s implanted in their control networks. But they would need to know what to look for, since it’s not likely the Russians named the files Malware1, Malware2, etc. This is of course the main reason why we need an investigation, and I find it literally incomprehensible that one wasn’t launched, at least after the Worldwide Threat Assessment in January.

As I pointed out in my previous post on this, there are really two investigations in question now. The immediate one is the one I just described – a technical investigation by experts. The second would probably be a criminal investigation. It is only needed if the reports of Russian penetration of utility control centers turn out to be true, and if it turns out that somebody deliberately tried to suppress them last summer, when Jonathan Homer of DHS first made them and people at DHS soon put out at least three mutually contradictory stories minimizing what the Russians had achieved. I certainly hope this second investigation isn’t needed – but again, unless we do the first investigation, we’ll never know whether the second one is needed, will we?

Curiouser and curiouser, indeed!


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com

Sunday, February 3, 2019

My quote in the WSJ



Yesterday’s online Wall Street Journal carried a very good article by Rebecca Smith, titled “Duke Energy Broke Rules Designed to Keep Electric Grid Safe”, on what I can finally call the Duke Energy penalty of $10MM for NERC CIP violations. The last paragraph (which was cut out of the print edition) reads:

“The state of compliance is pretty rotten,” said Tom Alrich, a utility consultant who helps utilities audit their security programs. He added that he knows Duke spends a lot of money on its critical infrastructure protections. “I really doubt they are much more insecure than anyone else,” he said.

While I said those words, I do want to point out that they were part of a much longer discussion with Rebecca on Friday, which I believe she will expand on in a future article. Because I think my comments above are likely to be misunderstood without context, here’s a summary of the argument I was making (although I’ll admit that it was only partway into our discussion that I began to understand what I now believe to be the biggest issue in the Duke penalty; the neatness of my exposition below – if it is neat – is only with the benefit of hindsight!).

  • I think the cyber security of the North American Bulk Electric System is very good, and that is in large part due to the CIP standards; I have had this opinion for years.
  • But there is a big problem with CIP (actually, there are four or five, but this is the biggest): The requirements don’t follow a risk-based approach, in which the entity first identifies the most serious cyber security risks they face, as well as the most important systems that need to be protected. This would allow NERC entities to allocate their scarce cyber security resources toward mitigation of the biggest risks, and to the systems that most need to be protected.
  • The CIP-013 and (to some extent) CIP-014 standards follow this risk-based approach, as do individual requirements like CIP-010-2 R4, CIP-011-2 R1 and CIP-007-6 R3. On the other hand, there are very prescriptive requirements (the two worst offenders being CIP-007 R2 and CIP-010 R1) that require an inordinate amount of resources to comply with. And even with a lot of resources available, it is literally inevitable that a large organization like Duke will regularly suffer multiple failures on these requirements. I know all of them do.
  • Of course, Duke evidently had more than their share of failures; this was without a doubt due in part to the acquisitions they have made of late, which almost always lead to compliance problems. And NERC rightly points to management failures as the main cause of the problems, which probably explains the size of the fine. As we know, sometimes it takes a shocking event to get management’s attention.
  • NERC also points out (in the second paragraph on page 12 of part 1; due to the high security settings on the PDF, I can’t copy any text from the document) that the risk to the BES from Duke’s 127 CIP violations is due much more to their sheer quantity, and to the fact that many of them were of long duration and were repeated, than to the number that were “serious” (13 were serious, while the rest posed either “minimal” or “moderate” risk).
  • As further evidence, “Jason R” (whom I know well, but I’m withholding his name at his request – not an unusual request in the utility industry!) pointed out in a comment posted yesterday on my last post that a quick scan of the requirements that Duke violated didn’t indicate any violations of requirements that apply only to High impact assets. Since High impact Control Centers are the crown jewels of the BES, this implies that the violations were concentrated in Medium impact substations or generating stations. An attack on any one of these (or even a number of them) would be much less serious than an attack on a single High impact Control Center. This supports NERC’s statement in the previous bullet point.[i]

But don’t get me wrong. I’m certainly not saying that Duke is being unjustly punished for minor violations that everybody else does all the time. They are definitely a very serious violator of NERC CIP, and their penalty reflects that fact.

But here’s the rub: What does this mean for security? Was (or is) Duke a ripe plum just waiting for a halfway-competent Russian-sponsored hacker, or maybe even a script kiddie in Finland, to take over and hold maybe five or ten percent of the Eastern Interconnect hostage (perhaps until the US recognized the Donbass area of the Ukraine as an independent country)? Or would destructive software like Shamoon or NotPetya have trashed their control centers and darkened major cities for days? No, I don’t think so. This was a major compliance miss by Duke, but I’m sure (and NERC says as much, multiple times) that the chance of any cascading outage (or even just a local outage) was always very remote.

So what will be the effect of Duke’s penalty? Obviously, they’re going to have to devote even more resources to CIP compliance than they already do (and I’m sure they already are). They especially need to beef up their compliance management and oversight processes, which NERC says are the big problem. This will definitely make Duke much more CIP-compliant.

Now, let’s think about the overall security of the grid (or at least Duke’s portion of it); will that be enhanced to the same degree as Duke’s compliance posture? Not a chance, and here’s why. Any NERC CIP compliance professional will tell you that a large percentage of the money and time their organization spends on CIP compliance doesn’t go to enhancing the security of the grid, but to required activities that consume large amounts of time, way out of proportion to whatever security benefit they provide.

My poster child for this is CIP-007 R2.2, which requires the entity to check, every 35 days, with every vendor of any software or firmware installed on any device within a Medium or High impact Electronic Security Perimeter, to see if a new security patch has been released since the last check – and this regardless of whether the system is extremely critical or just mildly important, or whether the vendor has released a single security patch in the last twenty years. And the entity has to document – again, for every version of every piece of software or firmware installed on any device in a Medium or High impact ESP – that all of this was done every 35 days, since a NERC rule is that if you didn’t document it, you didn’t do it. For a large entity like Duke, this probably amounts to hundreds or even thousands of software and firmware packages (and each different version thereof, of course) that need to be checked every month, along with documentation that each one was checked.
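To put rough numbers on that burden, here is a minimal back-of-the-envelope sketch – my own illustration in Python, not anything from NERC or Duke. The only figure taken from the requirement itself is the 35-day interval; the inventory sizes are hypothetical.

# Back-of-the-envelope sketch of the CIP-007 R2.2 documentation burden.
# Illustrative only: the 35-day interval comes from the requirement; the
# inventory sizes below are hypothetical.
from datetime import date, timedelta

CHECK_INTERVAL_DAYS = 35  # maximum allowed interval between patch-source checks

def checks_per_year(tracked_versions: int) -> int:
    """Minimum number of documented patch-source checks per year."""
    checks_per_item = -(-365 // CHECK_INTERVAL_DAYS)  # ceil(365 / 35) = 11
    return tracked_versions * checks_per_item

def next_check_due(last_check: date) -> date:
    """Latest date the next check can happen without risking a violation."""
    return last_check + timedelta(days=CHECK_INTERVAL_DAYS)

if __name__ == "__main__":
    for n in (200, 1000, 3000):  # hypothetical counts of tracked software/firmware versions
        print(f"{n:>5} tracked versions -> at least {checks_per_year(n):,} documented checks per year")
    print("Last checked 2019-02-01 -> next check due by", next_check_due(date(2019, 2, 1)))

At 1,000 tracked versions that works out to at least 11,000 check-and-document events a year, whether or not a single one of those checks ever turns up a patch that matters.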

I’ve asked a number of CIP people to estimate what percentage of every dollar they spend on NERC CIP compliance actually goes to cyber security; their estimates have ranged from 25% to 70%. Think about it. They’re saying that the absolute best case is that 70% of spending on CIP compliance actually goes to enhance BES security, and the worst case is that only 25% does! I honestly think the median is probably around fifty percent.

So it’s clear: Duke will spend a boatload of time and money on enhancing their compliance posture, and while this will all go to improve compliance, probably only about half of it will go to enhancing their security posture. Moreover, even if 100% went to security, that still wouldn’t make them secure, since there are so many cyber security threats (and they’re unfortunately growing all the time) that aren’t addressed in CIP at all. Four big ones I can think of without breaking a sweat:

  1. Phishing, the ultimate cause of the Ukraine attacks, and the main focus of the current Russian attacks on the US grid. It is very hard to even think of a major cyber attack in the last few years that didn’t start with phishing.
  2. Ransomware, which has already forced a couple of US utilities to pay ransom, but which fortunately hasn’t yet penetrated control networks. In this category I also include NotPetya, at $10 billion in damages by far the most destructive cyber attack ever (of course, courtesy of our good friend Mr. Putin).
  3. Machine-to-machine access to BES Cyber Systems. This is specifically exempted from control by CIP now. This threat will be mitigated to some degree when CIP-013 comes into effect on July 1, 2020. Meanwhile, it’s what worries me most about the Russian attacks on vendors – that they could take over vendor-operated devices that communicate directly with BCS. It’s very hard to see how an entity could cut off a well-planned MtM attack before it could cause serious damage.
  4. Vulnerabilities in custom software, running on BCS, that was developed by or for the NERC entity. Since there aren’t typically regular patches made available for custom software, it isn’t covered at all by CIP-007 R2 – and this threat also won’t be addressed in CIP-013. When government supply chain security professionals heard, in a presentation by Howard Gugel of NERC at the Software and Supply Chain Assurance conference in McLean, VA last September, that vulnerabilities in custom software aren’t addressed in any way in any of the CIP standards, including CIP-013, they were incredulous.

Of course, I’m sure utilities are spending money and time mitigating these threats (and many others not addressed at all in CIP), because they know it’s important. But here’s the root of the problem: Money doesn’t grow on trees, even for big utilities like Duke. Ultimately, if Duke has to spend a lot more money on NERC CIP compliance than they have been, at least some of it will come from mitigation of the cyber risks that aren’t addressed at all in CIP. And to the extent that these risks are more important than some of those addressed by CIP (e.g. does the risk of not identifying a new security patch within a month, for a few low-risk pieces of software, justify the enormous amount of resources currently being spent by NERC entities on compliance with CIP-007 R2.2? Especially when that same amount of resources might go to, say, additional anti-phishing training, or code review of custom software to find vulnerabilities?), this will probably – dare I say it? – result in an actual weakening of Duke’s cyber security posture rather than a strengthening of it. And there’s nothing I just said that doesn’t apply to every other NERC entity with Medium or High impact BES assets, even those that have so far been fortunate enough to avoid $10MM fines.

I’ll admit that what I’ve just written goes far beyond what Rebecca and I discussed on Friday! Hopefully she’ll be able to explore this topic further in a future article. And whether or not she does, I will certainly do that in future posts (as I already have in a number of previous posts, attacking different sides of the problem). You have been warned.


Any opinions expressed in this blog post are strictly mine and are not necessarily shared by any of the clients of Tom Alrich LLC.

If you would like to comment on what you have read here, I would love to hear from you. Please email me at tom@tomalrich.com. Please keep in mind that if you’re a NERC entity, Tom Alrich LLC can help you with NERC CIP issues or challenges like what is discussed in this post – especially on compliance with CIP-013; we also work with security product or service vendors that need help articulating their message to the power industry. To discuss this, you can email me at the same address.

[i] The WSJ article specifically points to violations of CIP-005 R2 as serious (i.e. it seems some Medium or High impact BES Cyber Systems weren’t protected by two-factor authentication and/or encryption of Interactive Remote Access). If these violations were widespread (the document is so heavily redacted that it’s impossible to tell whether a few systems were unprotected or a thousand), I would agree this was a serious vulnerability (hopefully it’s corrected now). This is especially a concern in light of the ongoing Russian cyber attacks, which are in part trying to use IRA as a means of accessing Electronic Security Perimeters.