By Jeffrey Wheatman

In March 2022, the US Securities and Exchange Commission (SEC) proposed new rules for disclosure of cybersecurity incidents and cybersecurity risk management for publicly traded companies as defined under the “Securities Exchange Act of 1934.” This supplemented and partially replaced previous guidance from 2018, which is a long, long time ago in the context of cybersecurity. After more than a year of comments and discussion, the rule was finalized this morning by a close vote of 3-2. This is a very important set of rules and everyone should pay close attention, but for those who don’t want to read the whole document, here is a high-level summary.

The rule has three key parts –

  1. Disclosure of Cybersecurity Incidents on Current Reports
  2. Disclosures about Cybersecurity Incidents in Periodic Reports
  3. Disclosure of a Registrant’s Risk Management, Strategy and Governance Regarding Cybersecurity Risks

Disclosure of Cybersecurity Incidents on Current Reports

This rule requires companies to file a Form 8-K (a report of unscheduled material events or corporate changes at a company that could be of importance to shareholders or the SEC) when a cybersecurity incident is deemed material, disclosing the following:

  • When it was discovered and whether it is still ongoing
  • A brief, non-technical description of the incident
  • Whether data was stolen, modified, accessed, or used in an unauthorized manner
  • What impact it had (this is where materiality comes in)
  • Whether it has been contained or is still ongoing

The requirement is to file the 8-K within four business days of the incident being determined to be material. There are situations where, if the US Attorney General determines there is a national security issue, a 30-day extension may be granted.

Comments

This was generally supported as critical for informing investors, though some commenters indicated that it might tip off the bad actors. But let’s face it: the bad actors are almost always ahead of the defenders, and they already talk to each other.

To paraphrase a famous sports coach: ‘I don’t care if the other guy knows my strategy; if we execute, we will win.’ In other words, if we do a good job, it doesn’t matter if the bad actors know we are responding.

Some commenters thought four days was too tight a deadline. While I get this, the clock starts ticking when materiality is determined, not when the incident is discovered. The process should align with the requirement, and preparation should begin as soon as a major incident is discovered.

Disclosures about Cybersecurity Incidents in Periodic Reports

This is similar to the first requirement but applies to quarterly (10-Q) and annual (10-K) reports and adds further disclosure requirements.

  • Has the incident had a material impact on the company’s operations or financial condition?
  • Will there be future impacts on the company’s operations or financial condition?
  • Has the incident been remediated or is it ongoing?
  • Have any changes been made to the cybersecurity program as a result of the incident?

Comments

Again, this was generally supported as critical for informing investors, but a fair number of respondents were concerned about a lack of clarity as to where the line falls between incidents that must be reported currently and those reported periodically. There is also some justified concern that, without further guidance, management teams and boards will be overwhelmed by the sheer number of incidents that may need to be reported. I agree with these concerns and think further guidance from the SEC is needed.

And finally,

Disclosure of a Registrant’s Risk Management, Strategy and Governance Regarding Cybersecurity Risks

This may be the most important part. This requirement essentially says companies need to be able to articulate what the heck they are doing for cybersecurity and cyber risk management. No longer is this a problem that gets pushed down the org chart.

There are quite a few requirements in here to unpack –

  • Is there a cybersecurity risk assessment program?
  • Are third parties engaged to do risk assessments (consultants/auditors/etc.)?
  • Is the company managing the risks posed by third parties, especially those with access to PII?
  • Does the company have a process-based program (prevent, detect, respond)?
  • Does it have a continuity and resiliency program in place to minimize impact?
  • Has it implemented continuous improvement (triggered by past issues)?
  • Are cyber risks treated like all other business risks and built into strategic planning?

Comments

While most comments supported this requirement, shockingly, some pushed back. Frankly, I am dismayed that anyone could take issue with a requirement to implement the basics for managing what is increasingly one of the biggest risks we face.

In particular, some commenters argued that disclosing whether companies do these things gives attackers information that makes successful attacks easier. That sounds to me like an ostrich sticking its head in the sand and believing that makes it safer.

Bottom line, it’s about to get real.