Phishing Trilogy Part 3: The “Carrot and Stick” Approach

What’s the best way to fight phishing attacks? Is it punishing users or rewarding good behaviour?

By Emily Wang

This is Part Three of the Phishing Trilogy; see the series introduction here:

Part 1 – From awareness to habits

Part 2 – A multi-layered defence

The ‘carrot and stick’ approach

People often scoff at the victims of phishing attacks and put the blame on them. This “blame culture” contributes to a very real problem: slow reporting of phishing compromises, which has a direct and material effect on organisations.

Studies collectively show that falling for a phishing email is far from rare, and the number of victims is growing. The real question is how to mitigate it. This article discusses the “carrot and stick” approaches: they are not mutually exclusive, and they are most effective when combined in a mix that suits your business.

Carrot

The consensus in the awareness-training domain is not to blame users. We should encourage them to report any suspicious activity, particularly if they themselves are the source of the breach.

Since an attacker needs only one person in the whole organisation to click a single malicious link, achieving a zero click rate is impractical. However, if even one person reports the incident, the security and IT teams can review it and quickly stop the phishing campaign from spreading and causing further damage.

The Cyber Security Breaches Survey published by the UK government (Department for Digital, Culture, Media and Sport, 2019) found that the most disruptive attacks were more likely to be spotted by employees than by software; this was the case for 63% of businesses. It also aligns with the findings of previous years. Hence, we should recognise the importance of staff vigilance and understand the power of empowering employees.

Stick

Another school of thought is to punish people who repeatedly fall for phishing attacks. For example, Paul Beckman, CISO at the US Department of Homeland Security, considered a policy of removing employees’ security clearance if they repeatedly failed anti-phishing tests. Needless to say, this is a controversial idea and it received a lot of criticism. One study found that the perceived severity of consequences did not predict behaviour (Downs, Holbrook, & Cranor, n.d.).

Studies also show that training focused on prohibiting behaviours or attitudes can often have the opposite effect, whereas training that emphasises positive outcomes can and does change behaviour (Robinson, 2011).

What is your mix?

This table outlines the differences between the two approaches. It is essential to understand your business to pick the right mix.

Be mindful of leaning too heavily on the “stick” approach. The ripple effects can strain employee morale, creating anxiety and distrust; in the worst case, it can lead to grudge attacks. Reports show that insider threats are prevalent and can cause graver damage than external attacks (Tripwire, 2017).

Our advice is to develop an approach that balances the carrot and the stick. Taking into account the responsibility of each role and its importance in your organisation will help you determine the appropriate balance. For example, an IT administrator would be expected to be far more vigilant against phishing than a clerk at your logistics desk. It may well be appropriate for the IT administrator, as part of their employment agreement, to accept a policy with a sliding scale of consequences for phishing breaches, whereas that would not be appropriate for the clerk.

Food for thought

Regardless of the stance you take on these approaches, it is important to consider the following:

– Ask your HR, legal and management teams to contribute

  • What are the legal or contractual requirements?
  • What is the company’s policy on rewards and penalties?
  • What culture is the company trying to build?

– Be consistent with your approach

  • For example, if enforcement is going to be implemented, senior management need to follow the policy as well; they need to be role models.

– Understand that people make mistakes, and don’t blindly blame your staff

  • As discussed, aiming for a zero click rate is unreasonable. Therefore, we need to acknowledge that honest mistakes can happen.

– Ensure that you have an incident-handling process in place, covering, for example, who to report incidents to and how

  • Your staff need to know the proper process so that they comply with the company’s policies

For more details on phishing and user awareness, contact Emily Wang or the Cybersecurity Advisory Practice.

References

Department for Digital, Culture, Media and Sport. (2019). Cyber Security Breaches Survey 2019. London. Retrieved from https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/791940/Cyber_Security_Breaches_Survey_2019_-_Main_Report.PDF

Downs, J. S., Holbrook, M., & Cranor, L. F. (n.d.). Behavioral Response to Phishing Risk.

Robinson, L. (2011). How the Science of Behavior Change Can Help Environmentalists. Retrieved from https://www.triplepundit.com/story/2011/how-science-behavior-change-can-help-environmentalists/81401

Tripwire. (2017). Insider Threats as the Main Security Threat in 2017. Retrieved November 19, 2018, from https://www.tripwire.com/state-of-security/security-data-protection/insider-threats-main-security-threat-2017/

Phishing Trilogy Part 2: A multi-layered defence

By Emily Wang

This is Part Two of the Phishing Trilogy; read Part One here.

Part One of this trilogy, “From awareness to habits”, showed how modifying habits can help combat phishing attacks. However, it is unrealistic to expect that no one will ever click on a malicious link simply by changing people’s email behaviour. In fact, some argue that a “zero click” goal is harmful (Spitzner, 2017). No matter how much training is provided, people will make mistakes.

This is evident in many of our phishing simulation reports, where a few people ignore the education page after falling for a simulated phishing email: they realise their mistake as soon as they click the link and reflexively close whatever pops up. This doesn’t in itself show that awareness training is futile; like many other defensive tools, it should be used to reduce risk, even though the risk can never be completely eradicated.

The three pillars

Let us not forget the three pillars of cybersecurity: people, process and technology. Using them together is like building a three-legged stool: if any leg is too short, the stool becomes unbalanced.

Google recently announced that none of its 85,000+ employees had been successfully phished since early 2017 (Krebs, 2018). What is their secret? Google requires all staff to use security keys to log in. A security key is an inexpensive USB device that provides the second factor in two-factor authentication: the user logs in with something they know (their password) and something they have (their security key). It is a perfect example of supporting people with technology and process measures, or, as security experts like to call it, defence in depth.
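Google’s internal set-up is not publicly documented in detail, but security keys of this kind typically implement the FIDO2/WebAuthn standard, in which the browser asks the key to sign a challenge issued by the server. The TypeScript sketch below is a minimal, hypothetical illustration of that second factor; the /webauthn/challenge and /webauthn/verify endpoints and their payload shapes are assumptions made for the example, not Google’s implementation.

```typescript
// Minimal browser-side sketch of a WebAuthn "security key" login step.
// The backend endpoints and payloads are hypothetical placeholders.

async function signInWithSecurityKey(username: string, password: string): Promise<boolean> {
  // Factor 1: something you know (the password), checked by the server.
  const challengeResp = await fetch("/webauthn/challenge", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, password }),
  });
  if (!challengeResp.ok) return false; // wrong password: the key step is never reached

  // The (hypothetical) server returns a random challenge and the user's
  // registered credential ID, both base64url-encoded.
  const { challenge, credentialId } = await challengeResp.json();

  // Factor 2: something you have (the security key) signs the challenge.
  const assertion = (await navigator.credentials.get({
    publicKey: {
      challenge: base64urlToBytes(challenge),
      allowCredentials: [{ type: "public-key", id: base64urlToBytes(credentialId) }],
      userVerification: "preferred",
      timeout: 60_000,
    },
  })) as PublicKeyCredential | null;
  if (!assertion) return false;

  // Simplified: a real flow would also send authenticatorData, clientDataJSON
  // and the signature so the server can verify them against the stored public key.
  const verifyResp = await fetch("/webauthn/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, id: assertion.id }),
  });
  return verifyResp.ok;
}

// Helper: decode base64url strings into the byte arrays WebAuthn expects.
function base64urlToBytes(value: string): Uint8Array {
  const base64 = value.replace(/-/g, "+").replace(/_/g, "/");
  const raw = atob(base64);
  return Uint8Array.from(raw, (c) => c.charCodeAt(0));
}
```

The point of this design is that a phished password alone is useless to an attacker: without the physical key present to sign the site-specific challenge, the login cannot complete.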

A multi-layered approach

The UK National Cyber Security Centre’s guidance on defending your organisation against phishing attacks (National Cyber Security Centre, 2018) splits the mitigations into four layers:

  • Layer 1: Make it difficult for attackers to reach your users
  • Layer 2: Help users identify and report suspected phishing emails
  • Layer 3: Protect your organisation from the effects of undetected phishing emails
  • Layer 4: Respond quickly to incidents

Take layer 1 as an example: we can defend ourselves from all three angles, namely people, process and technology.
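On the technology angle, for instance, a mail gateway can use sender-authentication results (SPF, DKIM and DMARC) to keep spoofed messages away from users in the first place. The TypeScript sketch below is a simplified, hypothetical gateway rule: the header parsing and the reject/quarantine thresholds are illustrative assumptions, not a description of any particular product.

```typescript
// Simplified sketch of a layer 1 "keep it away from users" control.
// It reads an Authentication-Results header (RFC 8601) and quarantines or
// rejects mail that fails sender authentication. Thresholds are illustrative.

type AuthVerdict = "pass" | "fail" | "none";

interface AuthResults {
  spf: AuthVerdict;
  dkim: AuthVerdict;
  dmarc: AuthVerdict;
}

// Pull spf/dkim/dmarc verdicts out of a header such as:
// "mx.example.com; spf=pass smtp.mailfrom=a.com; dkim=fail; dmarc=fail"
function parseAuthenticationResults(header: string): AuthResults {
  const read = (method: string): AuthVerdict => {
    const match = header.match(new RegExp(`${method}=(pass|fail)`, "i"));
    return match ? (match[1].toLowerCase() as AuthVerdict) : "none";
  };
  return { spf: read("spf"), dkim: read("dkim"), dmarc: read("dmarc") };
}

// Hypothetical gateway decision: reject clear DMARC failures, quarantine
// anything that passes neither SPF nor DKIM, and deliver the rest.
function routeMessage(results: AuthResults): "deliver" | "quarantine" | "reject" {
  if (results.dmarc === "fail") return "reject";
  if (results.spf !== "pass" && results.dkim !== "pass") return "quarantine";
  return "deliver";
}

// Example usage with a spoofed-looking message.
const header = "mx.example.com; spf=fail smtp.mailfrom=bank.example; dkim=none; dmarc=fail";
console.log(routeMessage(parseAuthenticationResults(header))); // "reject"
```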

Many controls can be placed in your organisation at different layers. To implement countermeasures holistically, we need to consider your organisation’s constraints and what is suitable for your employees. At Datacom, we look at how to help customers reduce risk across all six areas. Importantly, though:

Don’t wait until it’s too late and don’t rely on just one defence mechanism.

For more details on phishing and user awareness, contact Emily Wang or the Cybersecurity Advisory Practice.

References

Krebs, B. (2018). Google: Security Keys Neutralized Employee Phishing. Retrieved from https://krebsonsecurity.com/2018/07/google-security-keys-neutralized-employee-phishing/

National Cyber Security Centre. (2018). Phishing attacks: defending your organisation. Retrieved from https://www.ncsc.gov.uk/phishing

Spitzner, L. (2017). Why a Phishing Click Rate of 0% is Bad | SANS Security Awareness. Retrieved November 18, 2018, from https://www.sans.org/security-awareness-training/blog/why-phishing-click-rate-0-bad