Thursday, January 4, 2024

HOW TO STRENGTHEN YOUR MENTAL DEFENSE AGAINST SOCIAL ENGINEERING?

A few months back, Uber's internal databases were hacked by a notorious hacking group. Their modus operandi was to gain access to Uber's data through "social engineering". Social engineering attacks use psychological manipulation to deceive people into making security mistakes or giving away sensitive information, and the tactic is becoming increasingly common in the world of cybercrime. In this case, the hacker, posing as a corporate IT worker, allegedly convinced an Uber contractor to reveal the password to Uber's systems (there is also speculation that it was obtained from the dark web). The hacker then repeatedly sent login approval requests. Call it fatigue or a momentary lapse, but the contractor eventually accepted one of the authentication requests, which gave the hacker access.


These types of cyber attacks serve as a reminder that humans continue to be the weakest link in cybersecurity. Cybersecurity teams invest significant resources in educating employees to prevent cyber threats. However, conventional security awareness training programs assume that giving employees knowledge of the right behaviors will lead them to take the correct actions. But humans are fallible, and their actions often defy this logic.

Why do humans fall prey to cyber attacks (via tactics like social engineering, phishing, etc.)?

A while back, I read Daniel Kahneman's book 'Thinking, Fast and Slow', in which he reasons that in most day-to-day decision-making scenarios, individuals opt for the option that appears easiest to them. He calls this "System 1 thinking". Think of it as your mind operating on autopilot, simply accepting automatic decisions. In cybersecurity scenarios, as the genesis of Uber's cyberattack indicated, relying on System 1 thinking leads to ineffective decisions.

How can one avoid falling into the System 1 thinking trap?

As Kahneman argues in his book: by embracing System 2 thinking. System 2 is the mind's slower, analytical mode, where reason dominates. System 2 thinking can help individuals avoid cyber attacks by promoting more deliberate and analytical thinking about online security.

As an example, System 1 thinking gives rise to the Frequent Exposure Bias, whereby people come to believe falsehoods simply because they are repeated often. To combat this bias, one can learn to pause before acting and ask:
"Is this the best option or just the option I've been frequently exposed to?"

Isn't this the type of thinking that could help minimize social engineering attacks? And what if your security tools acted as a forcing function to help you embrace System 2 thinking?
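
To make that idea concrete, here is a minimal Python sketch of one way a forcing function could look: a number-matching style approval prompt. This is my own illustration, not Uber's control or any vendor's actual implementation; the function name, messages, and two-digit challenge are assumptions. The point is simply that the approver must actively retrieve a code from the genuine login screen instead of tapping "Approve" on autopilot.

```python
import random


def request_mfa_approval(login_location: str, login_device: str) -> bool:
    """Hypothetical number-matching MFA approval (illustration only).

    Instead of a one-tap "Approve" button that can be accepted on autopilot
    (System 1), the approver must type the two-digit code that is displayed
    only on the genuine login screen, forcing a deliberate pause (System 2).
    """
    # In a real flow this code would appear on the login screen, not in the
    # push notification, so someone who did not start the sign-in cannot know it.
    challenge = f"{random.randint(0, 99):02d}"
    print(f"[login screen] Your sign-in code is: {challenge}")

    print("\n[phone prompt] New sign-in request:")
    print(f"  Location: {login_location}")
    print(f"  Device:   {login_device}")
    print("Enter the 2-digit code shown on the screen you are signing in from.")
    print("If you did NOT start this sign-in, press Enter and report it.")

    answer = input("Code: ").strip()
    return answer == challenge


if __name__ == "__main__":
    # Simulated check: an attacker spamming approval requests cannot supply
    # the code, and the extra question nudges the approver to stop and think.
    approved = request_mfa_approval("Unknown location", "Unrecognized device")
    print("Access granted" if approved else "Request denied / reported")
```

The design choice here is the small amount of friction: a repeated one-tap prompt invites the fatigue that caught the Uber contractor, while a prompt that requires context from the real login screen makes the mismatch obvious to a user who never initiated a sign-in.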

Do share your thoughts.

(Sharing my sketchnote book summary) #cybersecurity #cyberrisk #socialengineering

LinkedIn post: https://www.linkedin.com/posts/anujmagazine_cybersecurity-cyberrisk-socialengineering-activity-7024918266108149760-Y7eF/?utm_source=share&utm_medium=member_desktop

