30 years of security awareness: 11 lessons learned
A few months ago, as I watched an online presentation about how social engineering works, I was saddened to hear the presenter, a respected practitioner, make recommendations about how to prevent social engineering that were nearly identical to those I made in a paper I delivered at the USENIX UNIX Security Symposium back in 1995.
Apparently, little has changed in the last 24 years, at least in the mind of the average security practitioner. There are things that work, but in general security professionals seem to have not made any significant inroads beyond, essentially, "Tell users not to do that."
I believe that most security awareness measures are generally ineffective. However, over the last 30 years I have come across security awareness methods that consistently work to improve awareness—as well as things that don't work.
What follows is a sampling of the lessons I've learned. I'll be talking about them in more detail during my upcoming RSA Conference keynote.
1. Social engineering is not the opposite of security awareness
Although I got my start in social engineering, I came to realize that my social engineering findings were not especially useful for awareness efforts. I will admit that I told audiences interesting stories about how I pillaged their organizations, and then made recommendations that sounded impressive given those stories, but the recommendations were rarely effective in making a significant dent in the problem.
Just as telling people to patch vulnerabilities ignores the organizational challenge of consistently implementing security patching throughout an organization, telling people, “Here are the tricks social engineers use, so don’t fall for them,” is both limited and fails to address the immense difficulty of changing longstanding behaviors across potentially hundreds of thousands of employees.
Although my social engineering engagements did uncover vulnerabilities that clients needed to mitigate, it is significantly more challenging to change user behaviors than it is to discover that user awareness is poor. I can discover that without doing any social engineering at all.
2. Security awareness is a risk-reduction strategy
Over the years, I have been asked to participate in debates about the usefulness of security awareness training. Despite the fact that I am critical of traditional awareness efforts, I vehemently defend awareness as a critical security countermeasure.
While there will almost always be a user who "fails," the far larger number of users who do not fall prey to attacks or make mistakes yields a tremendous return on investment. Consider what would happen if 20 out of 20 users clicked on a phishing message instead of one out of 20.
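To make the return-on-investment point concrete, here is a minimal back-of-the-envelope sketch in Python. The two click rates come from the 20-out-of-20 versus one-out-of-20 comparison above; the organization size, campaign frequency, and cost-per-compromise figures are purely illustrative assumptions, not data from any real program.

```python
# Back-of-the-envelope comparison of expected phishing losses with and
# without an effective awareness program. Only the click rates come from
# the comparison above; every other number is an illustrative assumption.

EMPLOYEES = 1_000            # assumed organization size
CAMPAIGNS_PER_YEAR = 12      # assumed phishing campaigns reaching users
COST_PER_CLICK = 500         # assumed average loss per compromised user, in dollars

def expected_annual_loss(click_rate: float) -> float:
    """Expected yearly loss given the fraction of users who click."""
    return EMPLOYEES * CAMPAIGNS_PER_YEAR * click_rate * COST_PER_CLICK

no_awareness = expected_annual_loss(20 / 20)   # everyone clicks
with_awareness = expected_annual_loss(1 / 20)  # one in twenty clicks

print(f"Without awareness: ${no_awareness:,.0f}")
print(f"With awareness:    ${with_awareness:,.0f}")
print(f"Loss avoided:      ${no_awareness - with_awareness:,.0f}")
```

Even with these made-up inputs, the point holds: the many users who do the right thing, not the one who fails, determine the value of the program.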
3. Behind every stupid user is a stupider security professional
It is easy to blame users for the damage they may have caused. But the reality is that for users to cause damage, security professionals must have failed to account for the damage users can create. In many cases, security professionals fail to understand that the average person does not have the same base of knowledge that they do.
Also, for users to create damage, the potential attack must be able to reach them, which likely involves several layers of technological failures. Then the systems in place have to fail to mitigate users' actions. Security professionals may like to claim that the user is the weakest link, but failure to account for the likely damage makes security professionals the weakest link.
4. Don't lose sight of the real goals of awareness
While awareness professionals like to think their goal is a more aware user, the goal should be to reduce losses. It doesn't matter if users know why they are doing the right thing, as long as they are doing it.
You teach children to brush their teeth because it is the thing to do. They don't need a science lesson in tooth enamel, decay, and the bacteria that break down sugars into acids.
It's the same for security behaviors. Focus on what to do, and when users ask why, there is nothing wrong with answering, "Because it’s part of your job."
5. Reliance on bro-science vs. behavioral science
I have seen awareness practitioners attempt to improve their methods by looking into, and applying, fields such as psychology and mental models. Too often, though, the way they put those ideas into practice is "bro-science."
The problem with mental models and traditional psychology is that they are trying to understand and influence individuals, while security awareness practitioners should be trying to understand and influence the organization. It is literally impossible to simultaneously address the learning styles of all individuals in an organization. This means science-minded awareness practitioners should be studying sociology and applied behavioral science.
Unfortunately, most practitioners who yearn for a scientific basis for their work don't understand the difference.
6. Don't treat security practices as a 'should' rather than a 'must'
"We can’t blame the user" seems to be the mantra for many security teams, who also don't want to mandate security practices to users.
This is asinine.
When people don't properly fill out their timecards, they don't get paid. If they watch pornography on their computers, they get fired. Practices that can result in significant damage to the organization's network and information should be treated the same way. Yet security practices have become a should instead of a must.
7. Gamification does not mean a game
Getting back to the bro-science aspects of awareness, there has been a lot of discussion about gamification. Gamification is the application of gaming principles to accomplish a business goal. But gaming principles do not mean making up a game and calling it gamification.
True gamification is about creating a reward structure to encourage good behaviors. It is in essence applied behavioral science, where awareness drives behaviors and behaviors create the culture.
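As a rough illustration only, and not a prescription, here is a minimal sketch of what a reward structure might look like in code. The behaviors, point values, and reward thresholds are all invented for the example; the point is that rewards attach to observable secure behaviors, not to playing a game.

```python
# Minimal sketch of a gamification-style reward structure: points are
# awarded for secure behaviors users actually perform. Behaviors, point
# values, and thresholds below are invented examples.

from collections import defaultdict

POINTS = {
    "reported_phish": 10,
    "completed_training": 5,
    "passed_simulated_phish": 2,
}

# Ordered from highest to lowest threshold.
REWARD_THRESHOLDS = [(50, "gift card"), (20, "team recognition")]

scores = defaultdict(int)

def record_behavior(user: str, behavior: str) -> None:
    """Credit a user for a secure behavior they performed."""
    scores[user] += POINTS.get(behavior, 0)

def current_reward(user: str):
    """Return the highest reward tier the user has reached, if any."""
    for threshold, reward in REWARD_THRESHOLDS:
        if scores[user] >= threshold:
            return reward
    return None

record_behavior("alice", "reported_phish")
record_behavior("alice", "completed_training")
print(scores["alice"], current_reward("alice"))
```

The mechanics matter less than the principle: reward the behaviors you want to see often enough that they become habits, and the habits become culture.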
8. Learn from the safety field
The safety field actually studies the behaviors that create improvement. There is a large body of research in the field, since workplace injuries cost a great deal of money.
Practitioners in the safety field know that 90% of workplace injuries result from the environment—conditions that are conducive to causing injuries. The remaining 10% results from carelessness or a failure to follow procedures.
People focused on solving the human aspects of security should look at their job in a similar manner. Someone must look at the computing environment surrounding the user and figure out how to mitigate the opportunity for users to cause damage. This includes both the technology and the processes that provide users with an opportunity to create damage.
When that damage cannot be prevented proactively, awareness must kick in to let users know specifically how to perform their job functions.
9. Culture is the best awareness tool
You can have the greatest awareness program in the world, but if a user walks into the workplace and everyone is doing the wrong thing, that user will eventually do the wrong thing as well. Likewise, in the absence of any awareness program, if everyone behaves securely, a new person will walk into the organization and mimic those secure behaviors.
Create a secure culture, or any individual awareness efforts will be moot.
10. Security awareness has massive impact
Successful awareness efforts are rarely seen, since there is nothing to see, while awareness failings are easy to spot. But every phishing message that goes unclicked is an awareness success. And every time a user properly handles information, or checks an outgoing email address, it's a security awareness success.
There are ample opportunities for users to make mistakes, but the actual number of failings is small.
11. Awareness alone doesn't solve the human security problem
Awareness and security professionals alike need to look at the problem beyond awareness to include the processes that allow users to create damage and the technology, or lack thereof, that gives users the opportunity to make mistakes and cause damage.
Time to apply those lessons
These are just a sampling of the lessons from my session. They come from experiences that I intend to share to put these observations in context. Ideally, by sharing my personal transformation in how I approach awareness, I can help you address the overall human security problem.
For more on these lessons and more, see my keynote, "Lessons Learned From 30 Years of Awareness Efforts," at RSA Conference 2019. I'll go into more detail on the tips above and have a few other recommendations as well.