It’s a common cybersecurity cliché that humans are the weakest link in an organization’s defenses. All the tools, technologies and processes mean nothing when a wayward click can give attackers access to vulnerable systems.
However, despite this, organizations often overlook the human element when securing their operations and fail to build a strong security culture. Their employees lack proper training or use unfriendly systems, making errors more likely and increasing the potential for users to circumvent controls with dangerous workarounds.
With people so critical to security, there’s one group that doesn’t overlook them: cybercriminals. They will exploit any weak point to penetrate or evade defenses. Social engineering techniques, which manipulate people into giving hackers access, drive the vast majority of cyberattacks – 99%, according to Proofpoint’s Human Factor report.
With so many breaches traced back to individual mistakes, a strong security culture is as important as good cybersecurity tools.
For award-winning cyber-anthropologist Lianne Potter, speaking at DIGIT’s Scot-Secure 2022 conference on 23 March, the human element is the foundation of cybersecurity. And it’s a factor that security teams overlook at their peril.
Weak links
Common as the cliché is, saying that people are the weakest link in cybersecurity, Potter warns, can actually weaken an organization’s security culture.
“It always makes me cringe when people call other human beings the weakest link,” she says. “Terms used by security teams – insider threat, least privilege, zero trust – don’t endear us to others.
“So how are we going to get people on board if we use terms like that?”
This is what prompts Potter to warn that the biggest threat to cybersecurity is the inability of cybersecurity practitioners to work well with other teams.
“It’s that kind of thinking that keeps people from engaging with us,” she explains. “We need people on the first line of defense because they see the things we can’t – they have day-to-day visibility.”
Build a culture of security
Potter uses lessons from the social sciences to identify gaps in security cultures and areas for improvement.
She notes that when establishing a security program, it is essential to get the organization’s onboarding process right, as it is often the security team’s first point of contact with a colleague and a perfect opportunity to embed security from the start. After all, most staff will only engage with the security team when they join or during an incident.
One technique Potter uses is telling people stories about all the times she’s been hacked.
“It makes the new person realize that if it can happen to someone who lives and breathes security, it could happen to anyone.
“It means that when something happens to them, people come to us with questions. And it opens that door in a way that simply telling them what they should and shouldn’t do never achieves.”
Emotional response
Fear, uncertainty and doubt have long had a place in propaganda and manipulation. For cybercriminals, these are key emotions to manipulate in their victims. For cybersecurity professionals, dispelling them is vital.
That said, fear can be a powerful tool to build security. It emphasizes the seriousness of cybersecurity and the potential consequences of a breach.
But it can also erode trust between staff. After all, if people are worried about getting in trouble for making a mistake, they are likely to cover up their actions or delay reporting.
The security team’s ability to cooperate with other groups within an organization is critical to building a culture of security. And above all, it requires everyone to trust each other.
“One of the hardest things to do when you have a security program in place is to get people to trust you enough to report things,” says Potter. “You have to embrace a culture that says it’s okay to report, that you’re not going to get in trouble if you report things.
“The last thing you want as a security professional is for someone to click on something and sit on it. Once you have a culture where that happens, a culture where people are immobilized by fear, you’ve lost and the cybercriminals have won.”
The question is how to build a strong security culture, where everyone is trusted, accountable and invested.
“The security team is the best place to start,” says Potter. “There’s no point creating a new security culture if your security team isn’t up for the journey.
“And to do that, we need to look at how they engage with each other. Do they support each other’s efforts as a team? Because if that doesn’t happen, how can we expect everyone else to support us?”
Give and take
Reciprocity is a principle Potter uses to establish trust and cooperation between teams. It involves doing favors to establish a relationship with a team and, in turn, receiving similar favors down the line.
“In security terms, I’ve given gifts of access before,” she explains. This involves approving the use of tools, data or systems, especially where previous requests have been refused.
“If I grant access to a tool that was previously denied, the other team will want to do something in return. And that could be something as simple as helping me with a vulnerability management analysis.”
With greater cooperation comes greater adherence to security. When everyone is on the same page, understands the threats, and is confident in their roles and abilities, people are happier to be held accountable for their actions.
“Accountability for security is the gold standard,” says Potter. “But people don’t often volunteer to be held accountable on a RACI matrix; it’s one of the biggest struggles in security. That’s because accountability often means that, at some point, you are going to be blamed for something or will have to explain your actions.
“If you don’t give people the psychological safety to ask for help, to challenge things, or the tools they need, then when something eventually goes wrong, they’re left having to justify their actions on their own.
“And if you don’t give people an environment where they feel they can say, ‘I did this for that reason,’ then it’s no wonder nobody wants to be accountable.
“But that’s where reciprocity comes in,” she continues. “If you start building those relationships slowly, you see more people wanting to be accountable, because they know that even if something happens, and maybe they made a bad decision, they can justify it.”
Learn more
Another technique used by Potter is participant observation. Typically, this involves integrating into another team and observing how they operate.
“The idea is that you become so grounded that you actually become part of their world and they don’t see you as an outsider anymore,” she says. “And then you start to see all the pain points and the blind spots.”
This helps the security team form a realistic picture of typical organizational behavior.
“What happens too often is that security policies are based on a lot of best practices,” Potter warns. “Which is great, you need it. But you also need to back that up with your understanding of what it’s really like to live and breathe and work in your organization across different departments.
“Sometimes the best practices aren’t necessarily the best fit for your business needs. So you need a security team that can take a broader strategic view.”
Tools to develop a security culture
So where do tools and technology fit into a good security culture? While they are undoubtedly powerful, they should be used correctly, in a way that supports their users.
“I’m all for tools that take out the boring stuff. I’m all for automation,” Potter says. “But it has to be done with a contextual understanding.”
Take phishing emails, for example. Simplicity and ease of use are key to reducing the time and thought required to deal with a potential attack.
“People will always choose the path of least resistance,” Potter says. “Make something more complex and people will either avoid it or put it off. You want to remove as many barriers as possible to people reporting phishing emails.
“What I try to encourage is a reporting scheme that works by simply forwarding an email. Anything with more steps won’t happen.
“It’s almost like training people to be a little more automated. I want the user to think, ‘I’m not sure about this email, but I want to move on,’ and drop that email straight into the phishing reporting system.
“We as a function need to take ownership of security issues and avoid putting so much emphasis on explaining ‘why’ we do things as a security team – after all, we are the only ones who care; everyone else has other priorities. All we should care about is that our colleagues know how to respond, or how to avoid trouble in the first place.
“The why is often of little interest to ordinary mortals… unless you have a good story to tell! And cybersecurity is certainly full of them!”
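The one-step scheme Potter describes, forwarding the suspect email and nothing more, can be sketched in a few lines of Python. This is a minimal illustration of the idea, not her team’s actual tooling; the reporting address and mail server shown are hypothetical.

```python
from email import message_from_bytes
from email.message import EmailMessage

REPORT_ADDRESS = "report-phishing@example.com"  # hypothetical reporting mailbox

def build_phishing_report(raw_suspect_email: bytes, reporter: str) -> EmailMessage:
    """Wrap a suspect message, untouched, as an attachment on a report email."""
    report = EmailMessage()
    report["From"] = reporter
    report["To"] = REPORT_ADDRESS
    report["Subject"] = "Suspected phishing (one-click report)"
    report.set_content("Suspected phishing reported; original message attached.")
    # Attach the original as message/rfc822 so its headers survive intact
    # for the security team to analyze.
    report.add_attachment(message_from_bytes(raw_suspect_email))
    return report

# Sending is one more line once a mail server is configured, e.g.:
#   smtplib.SMTP("mail.example.com").send_message(report)
```

The point of the design is the user-facing surface: one action, no forms, no extra steps, which is exactly the “path of least resistance” Potter wants reporting to sit on.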
Scot-Secure 2022 | Cyber Security Conference
The 8th Annual Scot-Secure Summit will take place on March 23 at Dynamic Earth in Edinburgh and will be streamed live on our virtual conference platform.
The program will focus on promoting cybersecurity best practices; examining current trends, top threats and offering practical advice on improving resilience and implementing effective security measures.
To learn more and register for the event, click here.