Phishing Trilogy Part 3: The “Carrot and Stick” Approach

What’s the best way to fight phishing attacks? Is it punishing users or rewarding good behaviour?

By Emily Wang

This is Part Three of the Phishing Trilogy. See the series introduction here:

Part 1 – From awareness to habits

Part 2 – A multi-layered defence

The ‘carrot and stick’ approach

People often scoff at phishing victims and put the blame on them. This “blame culture” contributes to the real issue of slow reporting of phishing compromises, which has a direct and material effect on organisations.

Studies collectively show that falling for a phishing email is far from rare and that the number of victims is growing. The real question is how to mitigate it. This article covers the discussion around the “carrot and stick” approach. The two are not mutually exclusive and are most effective when used together in a mix that suits your business.

Carrot

The consensus in the awareness-training domain is not to blame users. We should encourage them to report any suspicious activity, particularly if they are the ones who triggered the breach.

Since an attacker needs only one person in the whole organisation to click a single malicious link, a zero click rate is impractical. However, a single person reporting the incident allows the security and IT teams to review it and quickly stop the phishing campaign from spreading and causing further damage.

The Cyber Security Breaches Survey published by the UK government (Department for Digital, Culture, Media and Sport, 2019) found that the most disruptive attacks were more likely to be spotted by employees than by software, as was the case for 63 per cent of businesses. This aligns with findings from previous years. We should therefore recognise the importance of staff vigilance and the power of empowering employees.

Stick

Another school of thought is to punish people who repeatedly fall for phishing attacks. For example, Paul Beckman, CISO at the US Department of Homeland Security, considered a policy of revoking the security clearance of employees who repeatedly fail anti-phishing tests. Needless to say, this controversial idea received a lot of criticism. One study found that the perceived severity of consequences did not predict behaviour (Downs, Holbrook, & Cranor, n.d.).

Studies also show that training focused on prohibiting behaviours or attitudes can often have the opposite effect, whereas training that emphasises positive effects can, and does, change behaviour (Robinson, 2011).

What is your mix?

This table outlines the differences between the two approaches. It is essential to understand your business to pick the right mix.

Be mindful about leaning too heavily on the “stick” approach. The ripple effects can strain employee morale, leading to anxiety and distrust. In the worst case, it can lead to grudge attacks. Reports show that insider threats are prevalent and cause more serious damage than external attacks (Tripwire, 2017).

It is our advice to develop an approach that balances the carrot and the stick. Taking into account the responsibility of the role and its importance in your organisation will help you determine the appropriate balance. For example, an IT admin would be expected to be much more vigilant about phishing than a clerk at your logistics desk. It may well be appropriate for the IT admin to agree, as part of their employment agreement, to a policy with a sliding scale of consequences for phishing breaches, whereas that would not be appropriate for the clerk.

Food for thought

Regardless of what stance you take on these approaches, it is important to consider the following:

– Ask your HR, legal and management to contribute

  • What are the legal or contractual requirements?
  • What is the company’s policy on rewards and penalties?
  • What culture is the company trying to build?

– Be consistent with your approach

  • For example, if enforcement is going to be implemented, senior management need to follow the policy as well and act as role models.

– Understand that people make mistakes and don’t blindly blame your staff

  • As discussed, aiming for zero click-rate is unreasonable. Therefore, we need to acknowledge honest mistakes can happen.

– Ensure that you have an incident-handling process in place, covering who to report incidents to and how.

  • Your staff need to know the proper process so they can comply with the company’s policies.

For more details on phishing and user awareness, contact Emily Wang or the Cybersecurity Advisory Practice.

References

Department for Digital, Culture, Media and Sport. (2019). Cyber Security Breaches Survey 2019. London. Retrieved from https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/791940/Cyber_Security_Breaches_Survey_2019_-_Main_Report.PDF

Downs, J. S., Holbrook, M., & Cranor, L. F. (n.d.). Behavioral Response to Phishing Risk.

Robinson, L. (2011). How the Science of Behavior Change Can Help Environmentalists. Retrieved from https://www.triplepundit.com/story/2011/how-science-behavior-change-can-help-environmentalists/81401

Tripwire. (2017). Insider Threats as the Main Security Threat in 2017. Retrieved November 19, 2018, from https://www.tripwire.com/state-of-security/security-data-protection/insider-threats-main-security-threat-2017/

Phishing Trilogy Part 2: A multi-layered defence

By Emily Wang

This is Part Two of the Phishing Trilogy. Read Part One here.

Part One of this trilogy, “From awareness to habits”, showed how modifying habits can help combat phishing attacks. However, it is unrealistic to expect that no-one will click a malicious link just by changing people’s email behaviour. In fact, some argue that a “zero click” goal is harmful (Spitzner, 2017). No matter how much training is provided, people will make mistakes.

This is evident from many of our phishing simulation reports, where a few people ignored the education page after falling for a simulated phishing email. They realised their mistake as soon as they clicked the link and reflexively closed whatever popped up. This doesn’t in itself show that awareness training is futile; like many other defensive tools, it should be used to reduce risk even though risk cannot be eradicated completely.

The three pillars

Let us not forget the three pillars of cybersecurity: people, process and technology. Using them together is like building a three-legged stool; if any leg is too short, the stool is unbalanced.

Google recently announced that none of its 85,000+ employees had been phished since early 2017 (Krebs, 2018). What is their secret? Google requires all staff to log in with a security key: an inexpensive USB device that provides two-factor authentication. The user logs in with something they know (their password) and something they have (their security key). It is a perfect example of aiding a person with technology and process measures, or, as security experts like to call it, defence in depth.
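To make the two-factor idea concrete, here is a simplified sketch, not Google’s actual implementation: real security keys use FIDO/U2F public-key cryptography, whereas this illustration stands in a shared-secret HMAC so the example is self-contained, and all class and variable names are hypothetical. A login must present both a correct password and a fresh challenge signed by a registered device:

```python
import hashlib
import hmac
import os
import secrets


def hash_password(password: str, salt: bytes) -> bytes:
    # Derive a password hash; the server never stores the plain password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)


class SecurityKey:
    """Stands in for the USB device: it holds a secret and signs challenges."""

    def __init__(self, device_secret: bytes):
        self._secret = device_secret

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()


class Server:
    def __init__(self, password: str, device_secret: bytes):
        self._salt = os.urandom(16)
        self._pw_hash = hash_password(password, self._salt)
        self._device_secret = device_secret  # registered at enrolment

    def login(self, password: str, key: SecurityKey) -> bool:
        # Factor 1: something you know.
        if not hmac.compare_digest(hash_password(password, self._salt), self._pw_hash):
            return False
        # Factor 2: something you have; the key must sign a fresh challenge,
        # so a phished password alone is not enough to get in.
        challenge = secrets.token_bytes(32)
        expected = hmac.new(self._device_secret, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(key.sign(challenge), expected)


secret = os.urandom(32)
server = Server("correct horse battery staple", secret)
assert server.login("correct horse battery staple", SecurityKey(secret))
assert not server.login("phished password", SecurityKey(secret))
assert not server.login("correct horse battery staple", SecurityKey(os.urandom(32)))
```

The point of the second factor is visible in the last two lines: a stolen password fails without the device, and the device fails without the password.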

A multi-layered approach

The UK National Cyber Security Centre’s guidance on defending against phishing (National Cyber Security Centre, 2018) splits the mitigations into four layers:

  • Layer 1: Make it difficult for attackers to reach your users
  • Layer 2: Help users identify and report suspected phishing emails
  • Layer 3: Protect your organisation from the effects of undetected phishing emails
  • Layer 4: Respond quickly to incidents

Take layer 1 as an example: we can defend ourselves from all three angles, namely people, process and technology.
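On the technology angle, one common layer-1 control is publishing email anti-spoofing records (SPF, DKIM and DMARC) so that receiving mail servers can detect and discard mail that impersonates your domain. As a hypothetical illustration (the record string and function names below are made up for the sketch, and a real check would fetch the record from DNS), a minimal parser that reports what a DMARC record tells receivers to do might look like this:

```python
# Hypothetical sketch: parse a DMARC TXT record and report how strictly
# the domain tells receivers to treat mail that fails authentication.
# The example record below is illustrative, not fetched from DNS.

def parse_dmarc(record: str) -> dict:
    """Split a record like 'v=DMARC1; p=quarantine; rua=...' into tag/value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags


def spoofing_protection(record: str) -> str:
    tags = parse_dmarc(record)
    if tags.get("v") != "DMARC1":
        return "no valid DMARC record"
    policy = tags.get("p", "none")
    return {
        "none": "monitoring only: spoofed mail is still delivered",
        "quarantine": "spoofed mail goes to spam/junk",
        "reject": "spoofed mail is refused outright",
    }.get(policy, f"unknown policy '{policy}'")


example = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
print(spoofing_protection(example))  # spoofed mail goes to spam/junk
```

Moving from `p=none` to `p=reject` is a typical hardening path: monitor first, then quarantine, then reject once legitimate mail sources are all authenticated.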

Many controls can be placed in your organisation at different layers. To implement countermeasures holistically, we need to consider your organisation’s constraints and what is suitable for your employees. At Datacom, we look at how to help customers reduce risk across all six areas. Importantly though:

Don’t wait until it’s too late and don’t rely on just one defence mechanism.

For more details on phishing and user awareness, contact Emily Wang or the Cybersecurity Advisory Practice.

References

Krebs, B. (2018). Google: Security Keys Neutralized Employee Phishing. Retrieved from https://krebsonsecurity.com/2018/07/google-security-keys-neutralized-employee-phishing/

National Cyber Security Centre. (2018). Phishing attacks: defending your organisation. Retrieved from https://www.ncsc.gov.uk/phishing

Spitzner, L. (2017). Why a Phishing Click Rate of 0% is Bad | SANS Security Awareness. Retrieved November 18, 2018, from https://www.sans.org/security-awareness-training/blog/why-phishing-click-rate-0-bad

Phishing Trilogy: Building a “Human Firewall”

Security and phishing

By Emily Wang

Security is a vast field. Often it is mysterious, difficult and confusing. Frequent use of industry jargon among experts and in reports creates a barrier to discussion and understanding. What is a SOC? What is a botnet? Which types of malware should we actually pay attention to? And why are we spending so much money and effort on something that may or may not happen?

Interestingly, people do know about phishing. They may not understand the logic behind it or the term itself, but most are familiar with those annoying emails asking for their details to claim a big prize.

These emails have been around for a long time. One of the first widespread phishing emails was the Love Bug in 2000, when people all around the world received emails titled “ILOVEYOU” (https://en.wikipedia.org/wiki/ILOVEYOU).

The email body had a single line: “Kindly check the attached LOVELETTER coming from me”. Many were eager to find out who their secret crush was and opened the attached file. The attachment unleashed a worm which overwrote the victim’s image files and sent a copy of itself to every contact in the victim’s Outlook address book.

Since the Love Bug almost two decades ago, the tactics and delivery of phishing have remained largely the same. People know all about it, yet they still fall for it.

Phishing continues to be one of the most common and effective cybersecurity threats. It accounted for more than 50 per cent of Office 365-based threats in 2017 (Microsoft Security, 2018). In New Zealand, there was a 55 per cent increase in phishing and credential harvesting in the fourth quarter of 2017 (CERT NZ, n.d.), 76 per cent of organisations say they experienced phishing attacks in 2017 (Wombat Security, n.d.) and, by the end of 2017, the average user received 16 malicious emails per month (Symantec, 2018). These scams cost organisations $676 million in 2017 (FBI, 2017). This raises the question:

How is this still a thing?

We will look at this issue from three angles: what motivates the attackers, why victims fall for it, and how organisations perceive their own security programmes.

What motivates attackers:

  • Phishing is cheap, scalable and easy to carry out. Attackers favour this kind of “low-hanging fruit”: an attacker can easily send phishing emails to 10,000 people, and even if just 1 per cent click a link, the attack succeeds with 100 people.
  • A successful phishing campaign is generally the entry point for other attacks. Verizon reported that 92.4 per cent of malware is delivered via email (Verizon, 2018).
  • The United Nations Office on Drugs and Crime estimated that 80 per cent of cybercrime comes from organised activity (Malby et al., 2013). Most organisations can’t expect employees to compete with organised criminals and stay vigilant 100 per cent of the time.
  • Social media platforms such as Facebook and LinkedIn have made it much easier for criminals to collect organisational and individual information.
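The attacker’s arithmetic in the first bullet above is worth making explicit: the payoff scales linearly with volume while the cost barely rises, which is exactly why the “low-hanging fruit” framing holds. The numbers below are the ones from the text:

```python
# The attacker's arithmetic: even a tiny click rate yields many
# victims at scale, since sending more emails costs almost nothing.
def expected_victims(emails_sent: int, click_rate: float) -> int:
    return round(emails_sent * click_rate)


print(expected_victims(10_000, 0.01))   # 100 victims from a 1% click rate
print(expected_victims(10_000, 0.001))  # even 0.1% still nets 10
```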

Why victims fall for it:

  • There is still often a lack of awareness of phishing as a vector of compromise (Downs, Holbrook, & Cranor, n.d.).
  • Today’s ubiquitous technology creates constant interruption and leads to habitual multitasking; both behaviours are linked to more frequent risky behaviour (Murphy & Hadlington, 2018). This is especially true for inherently multitasking jobs, such as call centre work.
  • Clicking on links provided in emails is part of everyday behaviour. Some may require us to log in with credentials. By targeting this process, legitimate looking phishing attacks often catch us when we are not fully paying attention (CERT NZ, n.d.).
  • Spotting phishing emails is not always a straightforward task, especially when it comes to the well-researched and targeted “spear-phishing” email.
  • It is no longer about spotting bad grammar and spelling mistakes. Malicious emails are often polished, sometimes exceeding employees’ own copywriting skills, and look like they come from an organisation or person you trust and recognise.
  • We are optimistic. The optimism bias is an age-old human trait essential to our well-being, but in cybersecurity it causes problems, for example the mentality of “no one is interested in attacking me”. Because of this optimism, we tend to underestimate risks and engage in unnecessarily risky behaviour. When we receive emails designed to infect our machines with malware, we don’t necessarily treat them with the suspicion and wariness they deserve.

Why organisations fall for it:

  • This same optimism bias also applies at the organisational level.
  • One PwC (2018) report found that executives were overconfident in the robustness of their security initiatives. Some 73 per cent of North American executives believed their security programmes were effective.
  • Organisations often opt for a “tool-first” approach. While tools are necessary, investing in technology before people can be troublesome. Spending millions on technology can certainly make you feel safe. However, cyber threats often aren’t technologically driven but result from how human brains work: our curiosity, ignorance, apathy and hubris are often our vulnerabilities (Disparte & Furlow, 2017). Balancing technological measures with human-centred defences is therefore crucial to preparing for and preventing future cyber-attacks.
  • Investing in people can seem more ambiguous than investing in tools. A sceptical executive could reasonably ask what the ROI of developing a training programme is, and question the value of taking people out of their regular jobs to be trained.

Phishing on steroids today

Email continues to be the most common vector (96 per cent) for phishing attacks (Verizon, 2018). More recently, the scams have spread to social media, messaging services and apps.

With the rise of social media, phishing attacks are on steroids: it has become much easier for attackers to harvest personal information and compose legitimate-looking, tailored emails (spear-phishing). Social media has also become a phishing channel in its own right.

People are more likely to click on a link from friends or family. When an attacker harvests one social network credential, they can reach out to new “friends and family” and compromise even more accounts through the wonders of the network effect.

Mobile phishing is also on the rise now that smartphones and Bring Your Own Device (BYOD) policies are ubiquitous at work. This includes checking email on a phone as well as “smishing”: phishing via SMS or instant-messaging platforms such as WhatsApp, Facebook Messenger and Instagram, where the link arrives in a message.

The number of people falling for phishing attacks on mobile devices has increased by 80 per cent every year since 2011 (Lookout, n.d.). Our devices often connect outside traditional firewalls and so have less protection.
Lookout reported that 56 per cent of its users received and tapped on a phishing URL while on a mobile device.
Attackers will no doubt continue to leverage new and popular services as they become available to break this human defence line.
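To see what an 80 per cent annual increase compounds to, a quick back-of-the-envelope calculation (the rate and 2011 baseline are taken from the text above; the year 2018 is chosen only because it matches the report’s timeframe):

```python
# Compound the 80% yearly increase in mobile phishing from a 2011
# baseline of 1.0 to show how quickly the problem scales.
RATE = 0.80
BASELINE_YEAR = 2011


def growth_factor(year: int) -> float:
    """Multiple of the 2011 level after compounding 80% per year."""
    return (1 + RATE) ** (year - BASELINE_YEAR)


for year in (2012, 2015, 2018):
    print(year, round(growth_factor(year), 1))
# By 2018 that is roughly 61 times the 2011 level.
```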

 

 

Building a “human firewall”, making New Zealand digitally safe

Datacom’s goal is simple – to make New Zealand digitally safe.

The National Plan to Address Cybercrime clearly states that New Zealand businesses, other organisations and the overall economy would be affected if our nation fails to develop the capability to address cyber-attacks (Department of the Prime Minister and Cabinet, 2015).  

Experts believe we are at the beginning of the next “cyber-arms race”. While continuous investment in defensive security, such as protecting our strategic infrastructure and electricity grid, is undeniably important, growing cybersecurity awareness among every one of us is equally critical to our national cyber defence.

After all, we are more connected than ever: each of us is either part of the problem or part of the solution. The worst-case scenario becomes even worse once we live in smart cities with self-driving cars, surrounded by a myriad of Internet of Things devices. We cannot slow the rate of technological innovation, so we must speed up our collective preparedness.

 

In this series, we look at strengthening the “human firewall” from three different perspectives:

In part 1, we explore the “Why”. Why do we fall for phishing attacks from a psychological perspective, and how could we form and change our habits to protect ourselves and our organisations?

In part 2, we look at the “What”. Given the difficulties around defending against phishing from the human perspective alone, what are the components of a multi-layered defence system that can increase organisational resilience?

In part 3, we investigate the “How”. Specifically, how could we effectively run user awareness training and phishing simulations, and how do we balance “the carrot and stick”?

For more details on phishing and user awareness, contact Emily Wang or the Cybersecurity Advisory Practice.

 

References

CERT NZ. (n.d.). Quarterly Report: Highlights. Retrieved from https://www.cert.govt.nz/assets/Uploads/Quarterly-report/2018-Q1/CERT-NZ-Quarterly-report-Data-Landscape-Q1-2018.pdf

Disparte, D., & Furlow, C. (2017). The Best Cybersecurity Investment You Can Make Is Better Training. Retrieved November 19, 2018, from https://hbr.org/2017/05/the-best-cybersecurity-investment-you-can-make-is-better-training

Department of the Prime Minister and Cabinet. (2015). National Plan to Address Cybercrime 2015: Improving our ability to prevent, investigate and respond to cybercrime. Retrieved from https://dpmc.govt.nz/sites/default/files/2017-03/nz-cyber-security-cybercrime-plan-december-2015.pdf

Downs, J. S., Holbrook, M., & Cranor, L. F. (n.d.). Behavioral Response to Phishing Risk.

FBI. (2017). 2017 Internet Crime Report. Retrieved from https://pdf.ic3.gov/2017_IC3Report.pdf

Lookout. (n.d.). Mobile phishing 2018: Myths and facts facing every modern enterprise today. Retrieved from https://info.lookout.com/rs/051-ESQ-475/images/Lookout-Phishing-wp-us.pdf

Microsoft Security. (2018). Microsoft Security Intelligence Report, Volume 23.

Murphy, K., & Hadlington, L. (2018). Is Media Multitasking Good for Cybersecurity? Exploring the Relationship Between Media Multitasking and Everyday Cognitive Failures on Self-Reported Risky Cybersecurity Behaviors. Cyberpsychology, Behavior, and Social Networking, 21(3), 168–172. https://doi.org/10.1089/cyber.2017.0524

PwC. (2018). The Global State of Information Security Survey 2018. Retrieved November 19, 2018, from https://www.pwc.com/us/en/services/consulting/cybersecurity/library/information-security-survey.html

Malby, S., Mace, R., Holterhof, A., Brown, C., Kascherus, S., & Ignatuschtschenko, E. (2013). Comprehensive Study on Cybercrime. New York. Retrieved from https://www.unodc.org/documents/organized-crime/UNODC_CCPCJ_EG.4_2013/CYBERCRIME_STUDY_210213.pdf

Symantec. (2018). ISTR Internet Security Threat Report Volume 23. Retrieved from https://www.symantec.com/content/dam/symantec/docs/reports/istr-23-2018-en.pdf

Verizon. (2018). 2018 Data Breach Investigations Report 11th edition. Retrieved from http://www.documentwereld.nl/files/2018/Verizon-DBIR_2018-Main_report.pdf

Wombat Security. (n.d.). State of the Phish 2018. Retrieved from https://www.wombatsecurity.com/hubfs/2018 State of the Phish/Wombat-StateofPhish2018.pdf?submissionGuid=4a794784-d44b-479f-b070-474f5df4fa0a