Submitted to: Brig Dr. Imran Rashid
Submitted by: Hina Mukhtar, Sehrish Sajid, Nabeela Bibi
1. If a user forgets to apply security patches to software, or applies patches to the wrong
software. [1]
2. If users fail to upgrade system resources; for example, software/hardware resources
that are not capable of maintaining the proper security level. [1]
3. If a user mistakenly elevates privileges. Elevation of privilege means raising the
level or state of a user/resource beyond what was assigned. [2]
4. If there is a communication gap between employees of an organization; for
example, a network analyst forgets to share vulnerability scan results with the network
administrators or security specialists. [3]
5. If a user forgets to run an antivirus or vulnerability scanner. [4]
6. Accidental disclosure (e.g., via the internet): for example, sensitive information posted
publicly on a website, mishandled, or sent to the wrong party via email, fax, or mail.
7. Clicking on a malicious link or code (UIT-HACK, malware/spyware): an
outsider's electronic entry acquired through social engineering (e.g., a phishing email
attack, or a planted or unauthorized USB drive) and carried out via software such as
malware or spyware. [9]
8. Inadequate procedures or directions, and poor communication of data flows.
9. Insufficient working conditions can also pose a serious threat, owing to distractions,
insufficient resources, poor management systems, and inadequate security practices.
10. Improper or accidental disposal of physical records, or lost, discarded, or stolen
non-electronic records, such as paper documents.
11. Passwords used by users are not strong enough. [22]
12. Accessing organization-provided accounts from outside the organization. [23]
13. Portable equipment no longer in the possession of employees: a lost, discarded, or stolen
data storage device, such as a laptop, PDA, smartphone, portable memory device, CD,
hard drive, or data tape, can pose a serious threat.
14. Systems or procedures are not compliant with the organization's policy because
employees neglect to follow it or do not take it seriously.
15. Employees do not clear up ambiguities regarding security concerns and treat
them dismissively.
16. Peripheral-device hardware Trojans represent real and significant risks; if a user
connects a USB device from an unknown vendor or person to a system, the system can be
compromised [10].
17. An individual's friendliness, conformity, and sympathy, exploited through impersonation
and decoying, can lead them to trust an attacker and leak information.
18. Users tend not to pay attention to the source, grammar, and spelling in a phishing email,
instead focusing disproportionately on urgency cues [11].
19. Users may miss deception cues in the browser's address and status bars [12][13].
20. High cognitive load (high subjective mental workload) imposed by managers or the
organization can narrow attention. Workplace stressors (e.g., organization-imposed time
pressures) that contribute to higher levels of subjective mental workload tend to degrade
human performance, for example by narrowing visual attention so that important cues
indicating malicious activity may be missed, and by reducing the cognitive resources
needed for effective job performance [14].
21. Lack of knowledge, memory failure, or faulty judgment or risk perception are potential
factors in unintentional-threat risk. For example, knowledge or memory deficits may
underlie an inability to recognize the design inconsistencies that distinguish real from
fake error messages [15].
22. Users ignore the organization's warning notices against phishing attempts [16].
23. Human decisions tend to be biased and are not purely logical, and an individual may
devote insufficient cognitive resources for correct reasoning and judgment. An example
of decision making bias occurs when individuals tend to think that threats are highly
unlikely (e.g., they underestimate the abilities of social engineering attackers and
overestimate the defensive capabilities of organizational security systems), and
consequently ignore such threats [17].
24. Annoyance with popup messages may lead users to click on fake popups [18]. Users may
also lack awareness of potential risks involved in clicking fake popups.
25. Unintentional distribution and/or transportation using network access [30]:
disclosure of information by accidentally using reply-to-all on a mailing list;
unintentional publishing of business-confidential information on a new project by a
trusted machine builder.
26. Unintentional use of an information system resulting in errors [30]:
inaccurate data entry, resulting in errors in financial systems.
27. Use of authorized network access to accidentally install malicious software:
installing a virus on a networked server using local admin rights [30].
28. Unintentional use of unauthorized network access:
sharing passwords with fellow insiders as a solution for business continuity during
vacations; creating workarounds for unsupported system actions; accidentally
acquiring information left unattended by an insider (e.g., a USB stick, documents,
or on-screen content) [30].
29. Data copied to an insecure device [31].
30. Malicious code: sensitive information obtained through social engineering (e.g., a planted
USB drive or a phishing attack) in combination with malware or spyware. [31]
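The weak-password risk noted in item 11 can be illustrated with a minimal strength check. This is a sketch only: the 12-character minimum and the three-of-four character-class rule are assumptions chosen for illustration, not an organizational standard.

```python
import re

def is_strong(password: str, min_length: int = 12) -> bool:
    """Illustrative strength check: minimum length plus character-class
    diversity. Thresholds here are assumptions, not a policy requirement."""
    if len(password) < min_length:
        return False
    # Count how many character classes appear: lowercase, uppercase,
    # digits, and everything else (symbols).
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) >= 3

print(is_strong("password"))         # too short, one class -> False
print(is_strong("Tr0ub4dor&3xtra"))  # long, four classes -> True
```

In practice, length and screening against known-breached passwords matter more than composition rules, but the sketch shows how such a policy can be mechanized.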
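The unknown-vendor USB risk in item 16 is commonly mitigated with default-deny device allowlisting: any device whose vendor is not explicitly approved is blocked. The vendor IDs below are hypothetical placeholders, not a real approved list.

```python
# Hypothetical allowlist of approved USB vendor IDs (placeholder values).
ALLOWED_VENDORS = {0x0781, 0x046D}

def usb_device_permitted(vendor_id: int) -> bool:
    """Default-deny check: permit a device only if its vendor ID is on
    the approved list; everything unknown is rejected."""
    return vendor_id in ALLOWED_VENDORS

print(usb_device_permitted(0x0781))  # approved vendor -> True
print(usb_device_permitted(0x1234))  # unknown vendor -> False
```

Real deployments enforce this at the operating-system level (e.g., device-control tooling) rather than in application code, but the default-deny logic is the same.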
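The imbalance described in item 18, where users weigh urgency cues over the message source, can be sketched as a simple indicator check that does the opposite: it inspects the sender domain and flags urgency language. The urgency word list and the claimed-domain comparison are illustrative assumptions, not a production phishing detector.

```python
from email.utils import parseaddr

# Assumed, illustrative list of urgency cues commonly abused in phishing.
URGENCY_WORDS = {"urgent", "immediately", "verify", "suspended", "expire"}

def phishing_indicators(from_header: str, body: str, claimed_domain: str) -> list:
    """Flag two cues users tend to miss: a sender-domain mismatch and
    the presence of urgency language in the body."""
    flags = []
    _, addr = parseaddr(from_header)
    domain = addr.rsplit("@", 1)[-1].lower() if "@" in addr else ""
    if domain != claimed_domain.lower():
        flags.append("sender domain does not match claimed organization")
    if any(word in body.lower() for word in URGENCY_WORDS):
        flags.append("urgency cues present")
    return flags

print(phishing_indicators(
    "IT Support <help@it-support.example.net>",
    "Your account will be suspended. Act immediately.",
    "example.com"))
```

Such heuristics only supplement user training; attackers can spoof headers, so real filters combine many signals (SPF/DKIM results, link targets, reputation).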
[9] Greitzer FL, Strozer J, Cohen S, Bergey J, Cowley J, Moore A, Mundie D. Unintentional
insider threat: contributing factors, observables, and mitigation strategies. In 2014 47th Hawaii
International Conference on System Sciences 2014 Jan 6 (pp. 2025-2034). IEEE.
[10] Clark J, Leblanc S, Knight S. Risks associated with USB hardware Trojan devices used by
insiders. In 2011 IEEE International Systems Conference 2011 Apr 4 (pp. 201-208). IEEE.
[11] Vishwanath, A., Herath, T., Chen, R., Wang, J. and Rao, H.R., 2011. Why do people get
phished? Testing individual differences in phishing vulnerability within an integrated,
information processing model. Decision Support Systems, 51(3), pp.576-586.
[12] Erkkila, J., 2011, May. Why we fall for phishing. In Proceedings of the SIGCHI conference
on Human Factors in Computing Systems CHI 2011 (pp. 7-12). ACM.
[13] Dhamija, R., Tygar, J.D. and Hearst, M., 2006, April. Why phishing works. In Proceedings
of the SIGCHI conference on Human Factors in computing systems (pp. 581-590). ACM.
[14] Sharek, D., Swofford, C., & Wogalter, M. “Failure to Recognize Fake Internet Popup
Warning Messages.” Proceedings of the Human Factors and Ergonomics Society 52nd Annual
Meeting, 2008, pp. 557-560.
[15] Downs, JS, MB Holbrook, & LF Cranor. “Decision Strategies and Susceptibility to
Phishing.” Symposium On Usable Privacy and Security (SOUPS), July 12-14, 2006, Pittsburgh,
PA, USA
[16] Mohebzada, J. G., El Zarka, A., Bhojani, A. H., & Darwish, A. “Phishing in a University
Community.” International Conference on Innovations in Information Technology (IIT), 2012,
249-254
[17] Sandouka, H. Cullen, A.J., & Mann, I. “Social Engineering Detection Using Neural
Networks.” In IEEE International Conference on CyberWorlds (CW’09). 2009, 273-278.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5279574
[18] Sharek, D., Swofford, C., & Wogalter, M. “Failure to Recognize Fake Internet Popup
Warning Messages.” Proceedings of the Human Factors and Ergonomics Society 52nd Annual
Meeting, 2008, pp. 557-560.
[20] Greitzer, F.L., Strozer, J.R., Cohen, S., Moore, A.P., Mundie, D. and Cowley, J., 2014,
May. Analysis of unintentional insider threats deriving from social engineering exploits. In 2014
IEEE Security and Privacy Workshops (pp. 236-250). IEEE.
[21] Colwill, C., 2009. Human factors in information security: The insider threat–Who can you
trust these days?. Information security technical report, 14(4), pp.186-196.
[22] Warkentin, M. and Willison, R., 2009. Behavioral and policy issues in information systems
security: the insider threat. European Journal of Information Systems, 18(2), pp.101-105.
[23] Kagal L, Finin T, Joshi A. A policy based approach to security for the semantic web.
In International Semantic Web Conference 2003 Oct 20 (pp. 402-418). Springer, Berlin,
Heidelberg.
[24] Lim, K., NEXTLABS Inc, 2017. Protecting Information Using Policies and Encryption.
U.S. Patent Application 15/421,358.
[26] Agrafiotis, I., Erola, A., Goldsmith, M. and Creese, S., 2017. Formalising Policies for
Insider-threat Detection: A Tripwire Grammar. JoWUA, 8(1), pp.26-43.
[30] Cornelissen, W., Spil, A.A.M. and Nunes Leal Franqueira, V., May 6, 2009.
Investigating Insider Threats: Problems and Solutions.
[31] Homoliak, I., Toffalini, F., Guarnizo, J., Elovici, Y. and Ochoa, M., 2019. Insight into
Insiders and IT: A Survey of Insider Threat Taxonomies, Analysis, Modeling, and
Countermeasures. ACM Computing Surveys, 52(2). SUTD.