Why are people still getting phished?

By Matthew Burbidge
Johannesburg, 27 Aug 2020
Suelette Dreyfus, academic specialist, University of Melbourne.

We were not long into 2020 when there were already predictions of a sharp increase in cyber crime. Eight months later, these predictions have proved correct, with an estimated sevenfold increase in spear-phishing attacks since the pandemic began and the subsequent shift to working from home.

Why do people still click on phishing mails, and what lessons can security professionals learn?

Suelette Dreyfus, an academic specialist at the School of Computing and Information Systems at the University of Melbourne, provided some signposts for security teams this week. And, it appears, IT teams have plenty of catching up to do, especially when it comes to persuading staff to toe the security line.

Speaking from Australia at this year’s ITWeb Security Summit, she quoted from a study run by the University of Bath at a public sector organisation in the UK with over 50 000 employees. She says the organisation was particularly suited to a research trial; it was both public facing and handled sensitive information. All these staff were sent at least two simulation phishing e-mails, which were personalised (‘Dear Mary’, for example). The mails invited the staff member to click on a link to ‘the UK’s most secure e-mail platform’ to ‘access and read your secure mail’.

As it happened, just under 20% of the staff members clicked on the link – a fairly typical figure – after which they were directed to an internal cyber educational site.

This, said Dreyfus, reflected the organisation’s ‘no blame culture’, about which there is some debate in the cyber security industry.

“Some have the view that those people who don’t take cyber security seriously should suffer severe penalties. There’s a number of managers who have been asked to pay the cost of the outcome; a decision (which they made) through ignorance, or laziness or busyness.”


Nevertheless, she believes a non-threatening culture tends to produce more advantageous outcomes, one of which is that senior managers get better visibility into what’s actually happening in their organisation.

With a no-blame culture, “People are more willing to report an incident (or mistake) honestly.”

She says if enterprises know what phishing methods are likely to succeed in their organisation, then they can concentrate on those, and educate their staff.

Do you know the sender?

Another study took place in an organisation in the engineering and management sector, with more than 10 000 employees. People were asked what factors made them more – or less – suspicious about a potential phishing mail they’d received. As might be expected, they said they were less suspicious if they were familiar with the sender of the message, or had been expecting the mail.

Asked what factors made it more likely that staff would respond to a phishing mail, one response was ‘operating within a pressured cognitive context at the time the message was received’. In other words, they were busy.

It’s also very important to make it easy for a staff member to report a suspect mail.

One respondent said they had thought ‘we’re meant to report all spam’, but then struggled to find the relevant information on the company’s intranet.

Users care deeply about timely and reliable feedback about phishing e-mails from the IT department’s security team.

“This makes them feel that reporting actions aren’t a waste of time.”

One respondent said: “If you’re not getting feedback at all, then you’ll stop forwarding them. It makes you think: ‘Are they (IT) paying any attention?’”

“These feedback loops are super important if you want the staff to continue to engage on your cyber security agenda. If it’s effortless, people are more likely to do it. Frictionless design actually counts,” said Dreyfus. “Cyber security has to wrap around the processes of the human, not impose from the top down.”

The mail tsunami

Interestingly, staff members who regularly received external e-mails found it more difficult to determine their authenticity.

Surely, she said, people getting a lot of external mails would ‘wise up’? In real life, any awareness of phishing was often overwhelmed by the constant ‘tsunami’ of mails.

As one participant said, they get 200 to 300 mails a day, and ‘knowing when to click something, or not to click something, is quite hard. We get purchase orders coming through, and we have to click on it to open it as an attachment’. Another said: “If I’m very busy, I might just click on it by accident.”

The office environment is also perceived as being more secure, and so, goes the thinking of the employee, there is less chance they’ll encounter a suspicious mail. This can lead to a false sense of security, or: ‘Someone else will deal with it; it’s someone else’s risk’. This, in the midst of a pandemic, should lead IT to encourage employees to follow good security practices across all their devices, both those belonging to the company and their own.

She said that banners and warning pop-ups had also proved effective.

Specificity is also important, and the study showed that if security teams sent alerts about a particular threat – from a particular address, for example – employees usually listened.

“What cyber security culture do you have in your organisation? Is there a fear of potential negative repercussions?”

One person in the study said: “I know people who haven’t wanted to report things because they felt they would get into trouble for clicking on something.”

As Dreyfus says, IT needs to decide what kind of environment it wants in the wider organisation.

“You can have truth, or blame, but you can’t have both.”