Cyber security and human behavior

How can companies protect themselves from hackers and the cyber security errors of their own people?


IC

ABB In Control (IC): You are a behavioral economist who has studied how leaders’ perceptions of risk can bias their cyber security investment decisions. In the power generation industry, plant operations teams are often more aware than their leadership of the cyber security challenges or weaknesses at the plant. What advice would you give operations teams seeking greater investment from company leadership in their plant cyber security programs?

AB

Alex Blau (AB): The biggest challenge that operations teams face is learning how to justify their investment request in terms that are meaningful to their leadership. One important insight from behavioral science is that people are often motivated by information that pulls at their emotions – something called an 'affect bias.' If the justification given does not speak to the risks that senior leadership cares about, the answer will often be 'no.' As an example, imagine that a chief information security officer (CISO) goes to the chief financial officer (CFO) because they want to replace a set of legacy servers. If the CISO simply says that the legacy servers are old and are no longer safe to use, the CFO may not appreciate why it’s important to replace them. If the CISO had instead explained that these servers are used for the organization’s accounting system, and that if the servers go offline the organization would lose all of its accounting information, the CFO might appreciate the real risk to the organization and approve the investment. If senior leadership is not completely bought in or fully knowledgeable about cyber and technical risks, it’s the responsibility of the operations team to take a step back, understand how those risks map meaningfully onto the risks senior staff care most about, and connect the dots for their decision-makers.

IC

Showing a return on investment (ROI) for cyber security is not easy. Why is this, and how can chief information security officers (CISOs) best deal with it?

AB

Trying to justify cyber investments in terms of ROI is the wrong approach. Cyber investments are made to avoid attacks, but rarely will we know when an attack was avoided – that just looks like business as usual, right? So, how can we put a value on the things we avoid without knowing we avoided them? Instead, I think it’s better to think about investment in cyber as an insurance policy. We shouldn’t think about cyber investment as a way to increase revenue, but rather as a way to avoid potential costs. What companies can do is consider the costs that could be incurred through a disruption to operations brought on by a cyberattack, and then make some informed assumptions about the likelihood of such an attack. With this information, you can get closer to how much you should spend to avoid that outcome, and that might be the most important thing to consider.
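
As a rough illustration of the cost-avoidance framing Blau describes, the back-of-the-envelope estimate below is a minimal sketch; every figure in it (attack likelihood, outage length, daily cost, cleanup cost) is a hypothetical assumption rather than anything from the interview.

```python
# Hypothetical "cyber as insurance" estimate.
# All figures are illustrative assumptions, not data from the interview.

annual_attack_likelihood = 0.15      # assumed chance of a disruptive attack in a given year
expected_outage_days = 3             # assumed length of the operational disruption
cost_per_outage_day = 250_000        # assumed cost of lost operations per day (USD)
recovery_and_cleanup_cost = 400_000  # assumed incident response and replacement costs (USD)

loss_if_attacked = expected_outage_days * cost_per_outage_day + recovery_and_cleanup_cost
annualized_expected_loss = annual_attack_likelihood * loss_if_attacked

print(f"Loss if an attack succeeds: ${loss_if_attacked:,.0f}")
print(f"Annualized expected loss:   ${annualized_expected_loss:,.0f}")

# A control that costs less per year than the expected loss it removes is easier
# to justify in the cost-avoidance terms described above.
```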

IC

Using risk management best practices, can you provide some guidance on how a power generation company should build a resilient cyber security program that invests adequately in people, process and technology?

AB

Really, it comes down to a shift in mental models. Investment in cyber security shouldn’t be seen as a risk mitigation process – there’s no amount of infrastructural investment that will prevent cyberattacks forever, and there is no way to ‘measure twice and cut once’ when it comes to cyber programs. Instead, organizations should focus on risk management, which I see as having two key components. The first is maintenance – building and maintaining the existing infrastructure, keeping all the systems in check, ensuring staff adhere to the various policies and practices established by the cyber program, and so on. The second is stress testing – always searching for ways to break, undermine and otherwise disrupt the cyber security program. Both components are necessary, but if you’re only focused on maintaining, you’ll be more likely to get caught unprepared. To stay one step ahead of the hackers, you have to find the vulnerabilities before they do and remediate them. Adopting a break-it and fix-it culture will keep cyber teams diligent in ensuring the resilience of the cyber program.

IC

Having achieved such cyber security resilience, how should that company ensure it maintains best practice? How can it stay at the leading edge and avoid complacency?

AB

Cyber security resilience is always short-lived. There will always be another hacker out there who will figure out how to get around what you’ve built. In that sense, no company should ever be satisfied with their level of cyber security. If you think it’s possible to buy all the right hardware and software, then relax and watch as attackers fall by the wayside, you’ll be sorely mistaken. That is why I always advocate for a break-it and fix-it approach. If you’re constantly asking, ‘How can I break this and build it stronger?’ you’ll never be complacent about the quality of your cyber security system.

IC

According to your company’s research, 70-80 percent of the costs resulting from cyberattacks are due to human error. How can companies best protect themselves from the errors and negligence of their own people?

AB

Unfortunately, there isn’t a silver bullet here, but one thing organizations should do is implement robust awareness programs that aren’t limited to a single day of training once a year. While awareness training can help to increase employees’ knowledge of cyber risk, knowledge alone does not guarantee behavior change. Instead, awareness programs should focus on providing ongoing feedback about specific user behavior, so that employees are informed in real time about errors they make and ways to fix their behavior. A number of software solutions already exist to help provide feedback to users about phishing; these usually involve sending fake phishing emails to employees and seeing who opens them and their attachments. Those individuals who fall for the fakes are given feedback about their behavior, along with specific information about what to look out for in the future to avoid this vulnerability. The same principle can be applied to other prioritized vulnerabilities that organizations face.
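
To make that feedback loop concrete, the sketch below shows, under purely hypothetical assumptions, the kind of bookkeeping such simulated-phishing tools perform: who clicked, who receives immediate coaching, and the campaign click rate. The employee records and the send_feedback step are illustrative, not the behavior of any particular product.

```python
# Minimal sketch of a simulated-phishing feedback loop.
# The campaign data and the feedback message are hypothetical illustrations,
# not the workings of any specific commercial tool.

from dataclasses import dataclass

@dataclass
class PhishResult:
    employee: str
    opened_email: bool
    clicked_link: bool

def send_feedback(result: PhishResult) -> None:
    # In a real deployment this would trigger an immediate, specific coaching message.
    print(f"{result.employee}: you clicked a simulated phishing link. "
          "Check the sender address and hover over links before clicking.")

campaign = [
    PhishResult("employee_a", opened_email=True, clicked_link=True),
    PhishResult("employee_b", opened_email=True, clicked_link=False),
    PhishResult("employee_c", opened_email=False, clicked_link=False),
]

# Give behavior-specific feedback to those who fell for the fake,
# and track the click rate so the program can measure change over time.
for result in campaign:
    if result.clicked_link:
        send_feedback(result)

click_rate = sum(r.clicked_link for r in campaign) / len(campaign)
print(f"Campaign click rate: {click_rate:.0%}")
```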

IC

Cyber security experts are in demand. There is a shortage of skilled people and competition to hire them. What is your advice to companies that lack skilled personnel?

AB

Well, the economist in me says you should simply offer to pay them more! But I think that building an internal culture that supports cyber teams and treats them as a core asset across the business is a great starting point. There are too many examples of cyber professionals being treated more like technical consultants than key operational staff. I can imagine that a young and talented cyber expert would much prefer working in an organization that sees them as an integral part of the organization’s risk management strategy, as opposed to just another member of the IT staff.



Alex Blau is a Vice President at ideas42 and co-author of Deep Thought: A Cybersecurity Story, a ‘true-crime’-style short story that dramatizes human behavioral threats in cyber security. He has extensive experience applying insights from behavioral science to solve design and decision-making challenges in a broad array of domains. His current areas of focus at ideas42 are cyber security, financial inclusion, public safety and A/B testing (comparing two versions of a web page to see which performs best). www.ideas42.org
Alex Blau, Vice President at ideas42, a US-based behavioral science research organization, looks at cyber security from a behavioral perspective.
