Learning to live with risk

With the ongoing digitalization of marine and ports operations comes increased dependence on computer technology. Information security consultant Patricia Aas shares some perspectives on accepting, identifying, and managing vulnerability in computer systems in our increasingly digital age.

Photo by Rawpixel on Unsplash

Patricia Aas has her curiosity to thank for casting her into the international spotlight. “I was wondering about how the Norwegian elections were run, so I started asking questions. I never intended to become a public figure,” she says.

She did become a public figure though, literally overnight. “When I learned that Norway was going to count a large percentage of votes only electronically, I began to wonder how they could be so sure nothing could go wrong.” Her inquiries led to Norwegian officials making a historic turnaround just days before the election: all votes would also be counted by hand. The story made headlines in Norway, and around the world.

“There are so many things that can go wrong with computer systems, but we get lulled into complacency by our own success,” Aas says. Electoral systems may be one example, but the same principles apply to other industries, and maritime is no exception. “Once a thing has worked for a while, we just assume it will keep working. There needs to be a way to detect if something fails, and there needs to be a plan for how to fix it if it does.”

The absence of such plans often has nothing to do with technology at all, Aas says, but more with differences in understanding. “It can be difficult to discuss computer security with decision makers if they don’t have IT experience. Either they don’t understand, or they claim the expert is being too cautious. That makes it hard to get support for proper security measures.”

Taking the blame out of security

Aas observes that although we have become a computerized society, most people have never been taught how IT systems work. On the other side are the specialists who are better informed about computer risk, and she believes these experts must be allowed to raise risk issues without consequence.

“The personal cost to the bearer of bad news can be huge. You become a constant burden, a noisemaker, and this can damage careers. Businesses want positive input from their employees, so there is really never a scenario where you are a hero when you bring bad news.”

One way to reverse this trend is to stop playing the blame game, Aas says. The IT industry has begun to employ what they call the “blameless post mortem”, where all stakeholders take a time-out to figure out what went wrong in a system failure, without assigning blame to teams or individuals. “Blame does not solve problems. Instead we need continuous learning and a larger understanding of responsibility. It’s more important to ask, ‘What can we do next time?’”

For only a few?

With only a small segment of the population aware of how computers work, the rest are often left to rely on the experts for the “truth” – not always a healthy situation, Aas believes. She likes to relate the story of Martin Luther, who challenged the “experts” back when biblical texts and sermons were in Latin, and who, with his translation of the Bible into German, helped people begin constructing their own understanding of the order of things.

Aas looks forward to the same level of mutual enlightenment in IT – across all industries. “Some efforts are being made, like Code Clubs in the UK. We have also considered offering coding classes for executives. This is not about programming, or being a programmer, but simply understanding what coding is. I think this kind of investment would impact the bottom line, if we started dealing with risk through informed decisions, on a par with other business decisions.”

Understanding risk

More aspects of business are exposed in the age of Big Data and cloud computing, Aas points out, and many exposures are not detected. “We do not understand the inherent risk. This requires an understanding of IT as a function that underpins the entire business. It also requires systems thinking, and accepting that relationships between things influence each other. There is never one single cause for an incident.”

Acknowledging risk may require resources, but the alternative can be far worse, Aas believes. “System failures impact not just earnings, but brands. A company’s entire trust base could be wiped out in a single major incident.”

Far from preaching doomsday, though, Aas is merely advocating awareness. “It’s easy to start depending on a system without realizing how dependent you are, and dependence is vulnerability. With the proper awareness it is possible to mitigate vulnerability and avoid disaster, but it starts with learning to accept computers as things that might fail.”
