Digital points of failure: A shared need for cyber resilience
Our world depends on digital information systems — and they're often vulnerable. These systems don’t run themselves but require diligence and protection. One of the most high-profile examples last year was the CrowdStrike incident, which led to widespread computer outages and a potentially costly lawsuit over flight delays.
Accidental errors and disruptions can have enormous effects, and intentional malicious acts can be even more devastating. That's why cyber resilience is so important. A single incident can reach beyond one organization and affect entire industries.
The CrowdStrike issue
On July 19, 2024, computer systems across the U.S. and around the world were crippled. CrowdStrike, a cybersecurity software company, had pushed an update to one of its security products, and that update contained a bug that crashed Windows computers and brought down entire information systems. Some businesses, including several airlines, were brought to a standstill.
Once the company released a fix, the installation process was time-consuming and laborious, requiring individual attention for every affected computer.
Systems were impaired, and businesses lost money. Some reports indicate billions of dollars were lost in total due to this accidental failure.
Litigation has ensued, with Delta Air Lines and CrowdStrike suing each other in October 2024. In general, each alleges the other was negligent, deficient and responsible for Delta's significant outage and the resulting damages. (CrowdStrike did accept responsibility for the bug and certain consequences of the outages.)
Lessons for our cyber world
The first lesson for everyone is that we are reliant upon this digital world. Every person, business and government agency relies upon digital information systems. They are not an option or a convenience but a necessity. They make life easier, faster and better, but they also require effort to configure, secure and maintain. Failure can mean inconvenience or even disaster.
Consider dams and their lakes, feats of engineering and ingenuity that create beautiful waterways to use and enjoy, and even generate electricity. However, they require continual upkeep, and their failure can be catastrophic. Similarly, think of our cars, which also require work to design, build and maintain.
We have created a cyber world that can fail, too, and just as spectacularly as a dam.
Some assume information systems will keep running by themselves, or they relinquish responsibility for them to others. That is a mistake.
Accidents and Murphy’s Law
Murphy’s Law states that anything that can go wrong, will.
This is not a “law” that lawyers or scientists would recognize but merely an ironic and pessimistic joke. It is a caution to optimists who assume everything will fall into place. Fortunately, things go right much of the time, sometimes by luck, but that becomes increasingly unlikely as systems become more complex.
Things generally move towards disorder and dysfunction all on their own. It takes energy and effort to build, organize and keep things running and standing, whether dams, cars or information systems.
Our computers and network connections are increasingly complicated. They require effort and good management because mistakes can happen even with the best intentions, competence and diligence. Add in negligence (some individuals and organizations do not meet the standards of skill and care), and the chances of failure increase greatly.
Malicious actors
Now consider a person or group that wants bad things to happen and is willing to devote time, effort and resources to achieving it. Malicious actors seek out existing flaws and weaknesses and can cause far more devastation than Murphy's Law ever could.
They are the main reason we need security features, security software, organizational cybersecurity programs, and our profession in general. They are the reason we need to be constantly vigilant and adapt to new attacks.
Some malefactors just want to profit from run-of-the-mill cybercrime. That is a big problem, yet small in comparison to threats from adversary nation-states. Some countries have the motivation and resources to plan a national disruption by attacking our critical infrastructure. They certainly plan, and the only uncertainty is whether they will ever attempt such an attack. The U.S. recently announced a major incident involving Chinese state-sponsored actors breaching the U.S. Treasury Department.
There are laws regarding the conflict between nation-states, which encompass “cyber conflict.” Some countries disregard these laws in small and large ways.
When a country plans an infrastructure cyberattack but does not act on it (yet), the question is why. Often, it is the likely consequences and retaliation that stop them, rather than respect for the rule of law and the rights of others.
Whatever the calculus or game theory, and whatever our personal views on which nations are adversaries, the reality is that other nations plan to breach our data and systems and to shut off our information systems, electricity, communications and infrastructure. They even seek to influence us and our actions.
Lessons learned
The key takeaway around software is that updates should be vetted diligently and rolled out in stages rather than pushed everywhere at once. If an error escapes earlier checks, a staged rollout allows it to be detected early enough that the remaining deployments can be paused before the damage spreads.
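To make the idea concrete, here is a minimal sketch of a ring-based (staged) rollout. The ring names, thresholds, soak times and helper functions are hypothetical and illustrative only; they do not describe how CrowdStrike or any particular vendor ships updates.

```python
# Hypothetical sketch of a staged (ring-based) update rollout.
# Ring names, thresholds and the telemetry source are illustrative only.

import time

ROLLOUT_RINGS = [
    ("canary", 0.01),  # ~1% of endpoints first
    ("early", 0.10),   # then ~10%
    ("broad", 0.50),   # then about half
    ("full", 1.00),    # finally the whole fleet
]

MAX_CRASH_RATE = 0.001   # pause if more than 0.1% of updated hosts crash
SOAK_TIME_SECONDS = 60   # illustrative; real soak periods run hours or days


def deploy_to_ring(update_id: str, fraction: float) -> None:
    """Push the update to the given fraction of the fleet (stub)."""
    print(f"Deploying {update_id} to {fraction:.0%} of endpoints")


def observed_crash_rate(update_id: str) -> float:
    """Return the crash rate reported by updated hosts (stub telemetry)."""
    return 0.0  # in practice, query real crash/health telemetry here


def staged_rollout(update_id: str) -> bool:
    """Expand the rollout ring by ring, pausing if health checks fail."""
    for ring_name, fraction in ROLLOUT_RINGS:
        deploy_to_ring(update_id, fraction)
        time.sleep(SOAK_TIME_SECONDS)      # wait for telemetry from this ring
        rate = observed_crash_rate(update_id)
        if rate > MAX_CRASH_RATE:
            print(f"Pausing rollout at ring '{ring_name}': crash rate {rate:.3%}")
            return False                   # stop before the bug reaches everyone
    print("Rollout completed across all rings")
    return True


if __name__ == "__main__":
    staged_rollout("example-update-001")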
The broader lesson is that cyber resilience and protection need to be on everyone's mind because we are all stuck in this cyber world whether we like it or not. Cybersecurity professionals are ambassadors for this message, which includes empowering others to play their important roles in managing and governing information assets.
Our work matters: it protects against both accidental failure and deliberate sabotage. We can redouble our motivation to protect, and we can persuade others of the importance of cybersecurity and solid information governance.