Securing our online presence: the threats and some solutions.

tedwellstood
Global Intersection
10 min read · Aug 10, 2016

Part Two: Defensive solutions — Staff Training.

The defence of IT systems falls into three distinct areas: technical, process and personnel. Each should be given equal weighting, but seldom is. Technical solutions are by far the most relied upon to protect any system. Processes come next, with businesses producing policies, procedures and the like, but with little enforcement behind them. Last comes personnel: to my mind this is the most important facet of defence, and the one most likely to be ignored by individuals and businesses alike. I will deal with the technical and process areas in subsequent articles; this piece is dedicated to my biggest gripe: staff training.

There are many ways to break into an IT system, but the majority of these pathways require a good deal of technical knowledge to exploit successfully and remain undetected. Rather than waste precious time and effort breaking through multiple layers of highly technical defences undetected (firewalls, network intrusion prevention systems and so on), threat actors will often use far simpler techniques. A very common method is to trick internal staff into installing malicious software on the actor's behalf, thus bypassing the security technology altogether. This method of attack, known as phishing, falls within the scope of social engineering.

Social engineering can be defined as the art of gaining access to buildings, systems or data by exploiting human psychology, rather than breaking in or using technical hacking techniques (1). Social engineering is not a new technique; it has been used for generations. In earlier times its practitioners were known as confidence tricksters or con artists and, make no mistake, they are truly malicious people (2). As social engineering is an attack on the human psyche, it is not a problem for the IT department to solve, although they do end up fixing the issues these attacks cause.

All staff are vulnerable to social engineering, be they the newest admin assistant or the CEO (and everyone in between); this includes IT staff. “If you want to hack a corporation fast, social engineering techniques work every time, and more often than not it works the first time” (3).

The social engineer will take advantage of normal human behaviour to manipulate their target:

Curiosity: In one study, researchers at the University of Illinois dropped 300 USB drives around the University of Michigan campus. Of these drives, 98% were moved, 45% were connected to networked computers, and one was connected only six minutes after it was dropped. The drives were labelled “confidential” or “final exam solutions”, or had house keys attached. Each contained a ‘virus’ that informed the finder they were part of a study and asked them to complete a survey; luckily, in this case, the drives contained nothing truly malicious (4). However, in 2008 a US soldier found a USB drive in a parking area outside a US base, one of many left there by a foreign intelligence service. The soldier plugged the drive into the US military’s central command network, uploading a worm that scanned computers for data, created backdoors and linked to external malicious servers. It took the Pentagon 14 months to clean its systems of this worm (5).

Carelessness, arrogance: It’s not unusual for those who think they’re invulnerable to become the biggest security risk. One CEO hired a security specialist to break into his systems, telling the specialist not to bother with him personally as he guarded his secrets with his life. Needless to say, the CEO succumbed to the specialist, via social media and a charity he’d previously dealt with, without even realising he’d been hacked. When informed of the success, the CEO complained that unfair tactics had been used. The specialist merely responded that hackers are malicious people and wouldn’t think twice about using such methods (6).

Authority: Humans, on the whole, don’t like confrontation. When an angry CEO rings the help desk demanding a password reset because they can’t access their account and need immediate access for a board meeting, the help desk will normally cede very quickly. The social engineer, pretending to be that CEO, now has access to all of the CEO’s data and systems. Access rights can be elevated, backdoors set and viruses injected; indeed, any malicious intent can now be enacted in a matter of seconds.

Persuasion, influence and the need to be helpful: A professional security penetration tester bought a Cisco shirt from a charity store, which helped convince a target’s receptionist that he was there on a technical support visit. He was allowed into the building without further checks. Meanwhile, a colleague waited in the smoking area outside the building used by the target’s staff; he simply had to wait for a staff member to come out for a smoke, strike up a conversation and follow them back in, tailgating through the unsecured door. Together they left infected CDs and USB drives in places they knew would be found, and stole information from vacant desks. At no time were they challenged inside the building (7).

Fear, flattery, greed, timing (and many more): Phishing uses all these techniques and more to gain access to a company’s systems. Emails tell recipients that their bank account has been hacked and to click on a link to reset their access, that they’ve won a lottery, or that they’ve been selected to take part in a survey of successful businesses; in fact, any ruse that encourages a user to click on a link will be used. When phishing is aimed at specific targets it’s known as spear phishing, and when aimed at ‘C’-level executives it’s known as whaling; the techniques, however, remain the same. The Verizon 2015 Data Breach Investigations Report states that it takes just 82 seconds from the release of a new phishing campaign until the first user falls victim, and that 50% of targets click on the link within the first hour (8).
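One common giveaway in these link-based ruses is a mismatch between the domain a link appears to show and the domain it actually points to. As a minimal sketch of that single heuristic (real mail filters combine many such signals; the example addresses and function names here are purely illustrative):

```python
# Illustrative sketch: flag a link whose visible text names one domain
# but whose actual target points somewhere else. This is one heuristic
# among many, not a complete phishing filter.
from urllib.parse import urlparse


def domain_of(url: str) -> str:
    """Extract the host part of a URL, dropping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host


def looks_suspicious(display_text: str, href: str) -> bool:
    """True when URL-like visible text names a different domain
    than the link's real destination."""
    # Treat bare display text like "www.mybank.com" as a URL.
    shown = domain_of(display_text if "//" in display_text
                      else "http://" + display_text)
    actual = domain_of(href)
    return shown != "" and shown != actual


# Visible text says the bank; the link goes elsewhere -> suspicious.
print(looks_suspicious("www.mybank.com", "http://evil.example/reset"))   # True
# Text and target agree -> not flagged by this check.
print(looks_suspicious("www.mybank.com", "http://www.mybank.com/login"))  # False
```

This check only applies when the visible text itself looks like a link; a “Click here” button needs other signals entirely, which is exactly why trained, suspicious users remain the better detector.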

STAFF TRAINING

Technology can only go so far in the protection of IT systems; ultimately it is the human at the end of the network who is the weakest link. Indeed, people have been the weak link in virtually all of the penetrations of the US military network, with technology actually creating a false sense of security (9). A US study has shown that implementing technology doesn’t actually lower the number of incidents, and that the focus needs to be on staff: “security is a people issue” (10). As users sit at the centre of any IT system, with training they can act as the system’s ‘early warning sensors’, reporting suspicions or admitting to [perceived] mistakes (11). These human sensors are the most effective method of detecting phishing attacks, more so than almost any technology (12).

In 2015, 1% of employees were responsible for 75% of security issues on enterprise systems (13). By giving users the correct training, they will have the knowledge and understanding necessary to recognise when something is wrong and to trust their internal alarm bells (14). Analysis has shown that changing user behaviours through staff security awareness training can reduce security-related risks by approximately 60% (15).

“Staff education is one of the simplest and most effective security investments that can be made. To ignore this is tantamount to reckless operating and negligence”. (16)

Internal staff, inadvertently or by choice, are the biggest risk to any organisation. All staff within any business that uses a computer network need some basic security training; for example, an annual mandatory online security awareness course. This course should cover their personal online presence, their habits, the threats, social engineering, general IT security, the company’s security policy and the consequences of non-compliance. An understanding of current and emerging threats, and how they may be used, would help secure the culture change necessary to reduce or even prevent internal security breaches. Ideally, this training should be linked to system access, so that no access to company systems and files is granted until the training has been successfully completed.
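Linking training to system access can be as simple as a date check at login time. A minimal sketch, assuming a hypothetical record of each user’s last completed course (the record layout, usernames and one-year validity period are all illustrative assumptions, not any real product’s API):

```python
# Hypothetical sketch: gate system access on completion of the annual
# mandatory awareness course. All names and data here are illustrative.
from datetime import date, timedelta

# Illustrative training records: user -> date of last completed course.
training_records = {
    "asmith": date(2016, 3, 1),
    "bjones": date(2014, 11, 20),  # stale: completed over a year ago
}

TRAINING_VALIDITY = timedelta(days=365)  # annual refresh requirement


def access_permitted(user: str, today: date) -> bool:
    """Grant access only if awareness training was completed
    within the last year; no record means no access."""
    completed = training_records.get(user)
    if completed is None:
        return False
    return today - completed <= TRAINING_VALIDITY


print(access_permitted("asmith", date(2016, 8, 10)))  # True: trained recently
print(access_permitted("bjones", date(2016, 8, 10)))  # False: training lapsed
```

In practice this check would sit in the identity provider or single sign-on layer, so the gate applies uniformly across company systems rather than per application.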

Technical IT engineers are but one line of defence for corporate systems. Training in trade silos (networks, servers, applications) is essential to gain the depth of knowledge required to support each technical field. However, security threats often cross specialist boundaries, so in addition to their specific training, all technical staff should complete a standardised level of technical security training, regardless of specialisation. This is in addition to the staff security awareness training, not in lieu of it.

Staff awareness training needs to cover not only the threats that exist but also the change required in organisational culture, to one where security is a normal part of life. Unless users have the desire and motivation to protect their assets, any training will fail to have the necessary effect (17). It is critical that this training and culture change be driven from the very top of the organisation; without the board and ‘C’-level executives taking an active role, the training will fail (18). Rather than thinking they are above such training, executives need to model good behaviour and live by the very principles they expect of their staff. Management must also trust and stand by their staff. If a staff member makes a genuine mistake they should not be penalised for it (unless, of course, they are a habitual offender); rather, let others know what happened so the issue isn’t repeated. Staff will not report or help prevent security incidents if they live in fear of management reprisals. In the same vein, should a staff member recognise a potential threat and report it before it becomes an issue, they should be rewarded for their diligence.

The culture change will be a challenge for most people; as it is new, it will likely be treated with apprehension, so care needs to be taken over its design and delivery. The message should be made personal: staff are protecting their family, their kids and their home life first, then the organisation. Avoid technical jargon, stick to plain English, use repetition to get the message across, and give people phrases to take away, for example SCAM (19):

Suspect: uninvited requests

Challenge: everything you suspect

Authenticate: check the originator is genuine

Manage: any incidents or breaches effectively

The training should not be just a series of lectures or briefings; make it interactive and cement the knowledge through action. Teaching skills and knowledge through doing is not a new technique and has been proven over centuries; the idea is captured in a saying attributed to Confucius, around 450 BC: “I hear and I forget. I see and I remember. I do and I understand.” This training must be delivered at least annually, to every member of staff, and it needs to be updated regularly to reflect changes in the threat landscape. There should also be impromptu checks to make sure the training is becoming part of the culture; for instance, crafted phishing emails sent by security staff, or attempts to physically enter premises with false credentials or persuasion.
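Those impromptu phishing checks are only useful if the results are measured over time. A small sketch of scoring one simulated campaign (the event-log format, action names and usernames are assumptions made for illustration):

```python
# Illustrative sketch: summarising an internal simulated-phishing exercise.
# The log format and usernames here are invented for the example.
from collections import Counter

# Each event: (user, action); actions are "clicked", "reported" or "ignored".
events = [
    ("asmith", "reported"),
    ("bjones", "clicked"),
    ("cwu", "ignored"),
    ("dlee", "reported"),
    ("efox", "clicked"),
]


def campaign_summary(events):
    """Compute click/report rates and list users needing refresher training."""
    counts = Counter(action for _, action in events)
    total = len(events)
    return {
        "click_rate": counts["clicked"] / total,
        "report_rate": counts["reported"] / total,
        "users_to_retrain": sorted(u for u, a in events if a == "clicked"),
    }


print(campaign_summary(events))
# Across successive campaigns, a falling click rate and a rising report
# rate suggest the awareness training is taking hold.
```

Note the aim is retraining, not punishment: per the point above about management reprisals, naming and shaming clickers would discourage the very reporting the exercise is meant to encourage.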

It has been shown that with the appropriate change in security culture and with training, end users will have the knowledge, understanding and healthy suspicion necessary to detect and prevent potential threats; in effect, they become human firewalls. Without this training, the use of social engineering techniques will continue to rise and cause ever more issues for individuals, governments and businesses alike.

Footnotes:

(1) CSOonline. (2012). The Ultimate Guide to Social Engineering. Retrieved from http://www.csoonline.com/article/2130996/identity-access/cso-s-ultimate-guide-to-social-engineering.htm?nsdr=true

(2) Radcliffe, J. (2015). How to excite your staff about social engineering and awareness. Retrieved from: http://www.brighttalk.com/webcast/11399/178053

(3) Heary, J. (2009). Top 5 Social Engineering Exploit Techniques. Retrieved from: http://www.pcworld.com/article/182180/top_5_social_engineering_exploit_techniques.html

(4) Marotti, A. (2016). Don’t use that USB drive you found. Retrieved from: http://www.stuff.co.nz/technology/digital-living/79173568/don’t-use-that-usb-drive-you-found

(5) Singer, P.W. and Friedman, A. 2014, Cybersecurity and Cyberwarfare, What Everyone Needs to Know, Oxford University Press, New York.

(6) CSOonline. (2012). The Ultimate Guide to Social Engineering. Retrieved from http://www.csoonline.com/article/2130996/identity-access/cso-s-ultimate-guide-to-social-engineering.htm?nsdr=true

(7) Goodchild, J. (2012). Social Engineering: The Basics. Retrieved from: http://www.csoonline.com/article/2124681/leadership-management/security-awareness-social-engineering-the-basics.html

(8) Verizon. (2015). 2015 Data Breach Investigations Report. Retrieved from: http://www.verizon.com

(9) Winnefeld Jr., J.A., Kirchhoff, C. and Upton, D.M. (2015). Cybersecurity’s human factor: Lessons from the Pentagon. Harvard Business Review. HBR Reprint R1509G, 87–95.

(10) White, G. L. (2015). Education and prevention relationships on security incidents for home computers. The Journal of Computer Information Systems, 55(3), 29–37.

(11) Rashid, S. and Chaudhary, Z. (2015). Data breach, your biggest nightmare. Retrieved from: http://www.brighttalk.com/webcast/11811/176457

(12) Verizon. (2015). 2015 Data Breach Investigations Report. Retrieved from: http://www.verizon.com

(13) Looking Glass Cyber Solutions. (2016). Information Security Threat Landscape: Recent Trends and 2016 Outlook. Retrieved from: http://info.cyveillance.com/rs/cyveillanceinc/images/CYV-WP-LandscapeInformationSecurity.pdf

(14) Winnefeld Jr., J.A., Kirchhoff, C. and Upton, D.M. (2015). Cybersecurity’s human factor: Lessons from the Pentagon. Harvard Business Review. HBR Reprint R1509G, 87–95.

(15) Brink, D.E. (2014b). The last mile in IT security: changing user behaviours. Aberdeen Group.

(16) Rashid, S. and Chaudhary, Z. (2015). Data breach, your biggest nightmare. Retrieved from: http://www.brighttalk.com/webcast/11811/176457

(17) Shropshire, J. D., Warkentin, M. and Johnston, A. C. (2010). Impact of negative message framing on security adoption. The Journal of Computer Information Systems, 51(1), 41–51.

(18) Parrish, A. and Harris, D. (2015). Building a human firewall starts with security training. Retrieved from: http://www.brighttalk.com/webcast/9063/155191

(19) Radcliffe, J. (2015). How to excite your staff about social engineering and awareness. Retrieved from: http://www.brighttalk.com/webcast/11399/178053
