Stop blaming the user for cybersecurity failings
Panellists look at ways of instilling an organisation-wide security culture
There is an unfortunate tendency to blame the user for IT security gaffes such as falling for a phishing email or writing passwords on a Post-it note. But what this finger-pointing really exposes is the inadequacy of an organisation's cybersecurity provisions and awareness training.
The speakers in a panel debate during Computing's Enterprise Security and Risk Management Live event all agreed on this point: organisations need to create a proper security culture.
According to Dr Louise Bennett, management committee member of the Information Assurance Advisory Council (IAAC), IT security needs to be treated much more like corporate health and safety, with clear lines of responsibility.
"Right from the time that you employ someone, you need to make clear ′this is what we think your responsibility is towards the overall security of our company′," she said. "You need people to understand what they are responsible for, what the IT department is responsible for, what the management team is responsible for, and what you're expecting your provider or be responsible for. You need absolute clarity on that."
For Andjela Djukavonic, analyst in technology risk consulting at KPMG, coaching people on the risks is key. Social engineering attacks through Facebook and LinkedIn are a particular risk, but awareness training should not be limited to cyber.
"We should be doing these regular tests, sending out test phishing emails and also seeing if someone can walk into your office with no-one asking who are you and what are you doing here," she said.
When someone fails such a test, treat them as adults, not as naughty children, urged Dan Cuthbert, global head of cybersecurity research at Santander Group.
"Security has had a very negative approach, it's the ′Department of No′. I think this is actually terrible. At Santander we remove that negative language and instead say: this is why we do this or don't do this," he explained, adding that people will not be motivated to speak up if they′re made to feel they′ve been stupid.
"With ′Security says no′ the effect is for people to say ′well Security can deal with it then′. We should fundamentally change message, and the biggest thing is to stop saying ′no′."
Once people have an understanding of the risks, they will be more likely to retain that knowledge, but there is much debate about the best way to train staff in security best practice. All panellists agreed that a tailored approach is necessary, and Tarun Samtani, group data protection officer for retailer Boden Group, argued for a rewards system, ranging from a thank-you letter from senior management to financial rewards for reporting valid issues.
"Say there′s someone who has reported five issues and no-one else has, we give them a hundred-pounds voucher, saying this is your award for the month because you′ve done such good work," he said.
Gamification can be an effective approach, Samtani went on, with users asked to complete levels, as in a video game.
"You define where you want your messages and culture to be, then at level one you can have some basic messaging and threats, making sure you don't click on phishing emails, then increase the levels as time goes on to reflect the improvement in their learning. That makes them feel more responsible."
Personalised training that makes people aware of the sort of thing they are likely to see in their role is good, Djukavonic said, but the messaging should be kept simple, and there should be a clear 'next step' for a user who is suspicious: "It's knowing who to talk to about it and where to send the email on to so it can be checked."
For Cuthbert, it's about finding ways to get people to wise up to social engineering, particularly in certain roles, and to stop them trusting people they've never met. He alluded to recent attacks in which money was stolen from a number of banks via the Swift bank transfer network.
"North Korea really likes us. Kim has put together one of the best attacking teams. He's very good at targeting key individuals in an organisation who have access to Swift, and he does so via LinkedIn," Cutherbert said.
"LinkedIn is a huge problem because it's the nature of sharing connecting and abusing trust relationships. So we′ve had to go off to our key Swift people and say this is what is possibly going to happen and here are the tell-tale signs. That person you think is a friend could be somebody sitting in Pyongyang."