Council Post: How Cybersecurity Experts Can Nudge Users To Make Safer Choices

Perry Carpenter is Chief Evangelist for KnowBe4 Inc., provider of the popular Security Awareness Training & Simulated Phishing platform.  

As a cybersecurity professional myself, I know that experts in this industry can often feel frustrated over the actions — or inactions — of employees they depend on to help keep their organizations’ data and systems safe.

We spend a lot of time educating employees about the importance of complex passwords and the dangers of choices such as “123456.” We tell teams not to click on emails that, while compelling, could represent security risks. And yet, what do many employees do? They make mistakes — because they’re human.

Workers might have the knowledge they need to make wise decisions, but that knowledge and intention don’t always translate into making the right decision or doing the right thing. That’s what I call the “knowledge-intention-behavior gap.” From my perspective, it’s a fundamental reality for human beings: Even the right knowledge and intention might not translate into the desired action.

So, what can cybersecurity professionals do to combat this? You must start by understanding behavioral economics.

What Behavioral Economics Reveals About Human Behavior

Daniel Kahneman is a Nobel Prize-winning psychologist known for his contributions to the field of behavioral economics, a method of applying psychological concepts to help us understand why people make the economic decisions they do.

One of the foundational tenets of behavioral economics is that humans tend to use mental shortcuts, also called “heuristics,” when facing complex decisions. These heuristics can be helpful because they let us process information quickly, but they can also lead people to make poor decisions.

Kahneman described two types of thinking: System 1 and System 2. According to Scientific American, “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.” System 2 thinking, on the other hand, “allocates attention to the effortful mental activities that demand it, including complex computations.”

Applying Behavioral Economics To Cybersecurity

So, what does this mean from a cybersecurity standpoint? To ensure that people will exhibit good security behaviors (or make good security decisions), I believe cybersecurity professionals need to recognize that many users act on autopilot, so building behaviors that naturally rely on System 1 thinking is key. (Security teams can also try to get people to slow down and intentionally engage in System 2 thinking, but I’ve found that getting people to consistently engage System 2 can be challenging.)

By working with System 1 thinking — or the idea that we make decisions on autopilot — we can begin to build systems and processes to help people act in ways that can protect, rather than potentially damage, our security systems. This will require a multi-pronged strategy that includes both technology and behavioral guardrails. Importantly, this is less about purchasing security technologies and more about adopting a mindset that is aware of human nature. It’s about looking for technologies and tools that will naturally encourage the right behaviors.

Some key questions to help you head down the right path include:

• What precise behaviors, if adopted, would have the greatest security benefit for your organization?

• Is this a group of behaviors, or is this a single behavior?

• Is this a behavior that we have the appetite to take on right now?

After answering these questions, you can begin to tackle the most critical issues by working with System 1 thinking and giving users a “nudge.” By “nudge,” I mean changing how you present a choice to help team members make safer cybersecurity decisions.

A simple example of this is what happens when you choose a new password. If the system is not satisfied with a short or simplistic password, you’ll often see a message such as “Not Great,” “Too Weak,” or “Too Short.” We’ve all seen it: That little prompt, or nudge, is what encourages you to choose a stronger or smarter password.
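As a minimal sketch of how such a nudge might work under the hood, here is an illustrative strength check in Python. The `password_nudge` function name, the length and character-class thresholds, and the tiny common-password list are all assumptions for illustration; real strength meters (such as the open-source zxcvbn library) use far more sophisticated estimates:

```python
import re


def password_nudge(password: str) -> str:
    """Return a short 'nudge' message for a candidate password.

    Thresholds here are illustrative only, not a production policy.
    """
    # A tiny sample of frequently breached passwords.
    common = {"123456", "password", "qwerty", "letmein"}
    if password.lower() in common:
        return "Too Common"
    if len(password) < 8:
        return "Too Short"
    # Count how many character classes appear: lowercase,
    # uppercase, digits, and symbols.
    classes = sum(
        bool(re.search(pattern, password))
        for pattern in (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]")
    )
    if classes < 3:
        return "Too Weak"
    return "Strong"
```

The key design point, from a nudge perspective, is that the feedback arrives at the exact moment of choice, steering the user’s automatic System 1 response without forcing a deliberate detour.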

Of course, finding the right “nudge” can quickly become complex. But I hope this small example gets you thinking about how you can turn autopilot thinking into an advantage. Consider human behavior when designing prompts and nudges to push people down the pathways of doing the right thing.

The key takeaway is that we can leverage human behavior, through principles of behavioral economics, to encourage safer security practices.


