Study of Security Awareness Training
Steve Kruse, Security Principal @ RSA
Bill Pankey, Consultant @ Tunitas Group
Paradox
Explanations
Conclusions
Innovation Norway, Feb 04, 2010
http://www.gocsi.com
CSI / FBI Computer Crime Survey
• Annual study of infosec events & security practice in US
– Estimates losses from computer crime
– Type and frequency of exploits
– Survey of security practice / response to threats
• Commonly used baseline for
– Risk assessments
http://i.cmpnet.com/v2.gocsi.com/pdf/CSISurvey2007.pdf
2007 CSI Finding: Minimal UAT* Budgets
[Chart: % of security budget spent on UAT: 48%, 22%, 6%]
2007 security budgets average between 3-5% of total IT budget
*UAT: Security Awareness Training
Counter Indications
• Conventional wisdom among US security practitioners is that end users are the greatest security threat:
– 85% of security breaches involve end users ~ Forrester
– End user web surfing is the primary source of network infection ~ Symantec
– Email-borne compromise of end user accounts has been the initial entry vector for 100% of 'advanced persistent threats' (APT) ~ Mandiant
– ….
The Paradox
“You can’t service pack the end user” - Mike Nash, Microsoft
• Is there an inverse Pareto principle at work: spend the least on the biggest problem?
– High recognition of end user involvement with security breaches
– Relatively low spending on user security awareness training
• Is the security industry guilty of 'risk management' malpractice?
• Study resolved to understand / explain this apparent paradox
http://tinyurl.com/UATsurvey
2009 Survey of Security Awareness & Training
• 57 questions
– UAT budgeting & rationale
– UAT metrics & accountability
– UAT practices
• Respondents recruited from:
– UseNet / yahoo UAT interest groups
– Metricon audience
• Disclaimer:
– No sampling control
– Suspect bias in self-report survey results:
• More, rather than less, interest in UAT than among typical security practitioners
• Natural tendency to over-estimate the quality of their security program
Sample validates CSI Finding
• Survey sample places only slightly greater emphasis on UAT than the CSI study
– 55% less than 2% of security budget
– 13% greater than 8% of security budget
• Differences are either not significant or indicate a very slight increased emphasis on UAT
Security UAT budget in context
• True cost of UAT is typically masked: security budgets typically are not burdened with the cost of the users' time spent on UAT
– Only 1 respondent reported that UAT training time is charged against the security budget
• User training time is likely to be the most costly component of UAT
– 93% of respondent companies require UAT for all employees at hire; 75% annual refresher training
Security UAT budget in context (2)
• Security program's UAT expenditure is either very efficient or very wasteful of company (human) resources
– UAT budget includes preparation, delivery and management costs for a significant expenditure of corporate resources
• Normalized UAT management cost metric:
security program UAT budget / (total # of user training hours * loaded labor rate)
– Measure of 'efficiency'? or 'inattention'?
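A minimal sketch of how the normalized metric above might be computed, assuming hypothetical headcount, training-hour, labor-rate and UAT-budget figures (none of these numbers come from the survey):

# Minimal sketch: normalized UAT management cost metric (all figures hypothetical)
employees = 5000                # staff completing UAT each year (hypothetical)
hours_per_employee = 1.5        # average UAT hours per employee per year (hypothetical)
loaded_labor_rate = 75.0        # average loaded hourly labor rate, USD (hypothetical)
uat_program_budget = 60000.0    # security program's UAT budget: prep, delivery, management (hypothetical)

# Value of user time consumed by UAT -- usually NOT charged to the security budget
total_training_hours = employees * hours_per_employee
user_time_cost = total_training_hours * loaded_labor_rate

# Normalized metric from the slide:
#   security program UAT budget / (total # of user training hours * loaded labor rate)
normalized_cost = uat_program_budget / user_time_cost

print(f"Total user training hours: {total_training_hours:,.0f}")
print(f"Value of user time consumed: ${user_time_cost:,.0f}")
print(f"Normalized UAT management cost: {normalized_cost:.2f}")
# A low ratio may signal 'efficiency' -- or 'inattention' to a large corporate expenditure.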
Questions about the rationale for UAT
• Is there formal documentation of anticipated UAT benefits? If so, where?
– Security plan: 65%
– Security policy: 42% (pro-forma?)
– UAT business case: 35%
– Individual campaign proposal: 5%
• But often there is no accountability for benefit statements:
– 60% of respondents report no management review or approval of the above
UAT Rationale
• What is used to justify the commitment of users' time to UAT?
– Regulatory requirement: 73%
– Unspecified 'security benefit': 65%
– External auditor report: 45%
– Expected increase in user accountability: 15%
• Basis for rejection of requests for increase in mandated UAT (100% of respondents)?
– Unspecified management priorities: 70%
– Weak business case: 16%
Lack of UAT Effectiveness | Performance Measurement
• Few companies tracked meaningful measures of UAT performance
– Training completion / compliance rate: 100%
– (User) Behavioral / attitude measures: 13%
– Correlation w/ security incident metrics: 7%
• Management satisfaction determined by:
– CSO / CISO: 67%
– CIO: 25%
Illusory UAT Management
“You can’t manage what you don’t measure” - Drucker
• 60% claim success of UAT to
– Reduce security incidents
– Address root causes of security breaches
– Increase compliance with security policy
• While focusing on cost,
– Rate of training completion
• But avoiding collection of UAT performance / effectiveness metrics
• Focus on activity rather than benefit realization.
Few Meaningful UAT Metrics
– Security Metrics, Jaquith
• % of staff completing security awareness training? (6)
• Correlation of tailgating rates w/ training latency
– Metrics for IT Service Management, ITSM
• % of staff not at optimal training level? (10)
• Poll of callers to service desk
– SP800-55 Performance Measurement Guide for Information Security, NIST
• % of employees in security roles receiving specialized security training (1)
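As a rough illustration of the 'correlation with training latency' style of metric cited above (e.g. tailgating rates vs. time since training), the following sketch computes a plain Pearson correlation; every data value is a fabricated placeholder, not survey data:

# Illustrative sketch: correlate an observed violation rate with training latency
training_latency_days = [30, 90, 180, 270, 365]   # days since last UAT, per group (hypothetical)
violation_rate = [0.8, 1.1, 1.9, 2.4, 3.0]        # violations per 100 staff, same groups (hypothetical)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(training_latency_days, violation_rate)
print(f"Correlation of violation rate with training latency: r = {r:.2f}")
# A strong positive r suggests awareness decays and refresher training matters;
# r near zero suggests training is not what is driving the behavior.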
Do UAT metrics obscure the security objective?
• Common UAT metrics available to establish industry baselines may 'miss the point'
• Implicitly assume the effectiveness of training, i.e., the results of management
– Without the relevance, credibility & appropriateness of the training, the 'completion rate' indicates nothing about the security value of the UAT program.
• Currently these measures are primarily cost metrics reflecting the scale of resource (end user time) consumption
Meaningful UAT Metrics imply Strategy
• Security role of the end user is specific to the company and its security strategy. Is the user:
– A threat to be engineered around?
– An actor whose behavior needs the constraint of policy?
– A source of detective or corrective control?
– All of the above? Some of the above?
• The view of the user determines the objectives of UAT (improve user performance in security role)
– What and how much risk awareness
– Will vary with industry and company culture
Standardization of Security Awareness
• Effectiveness of UAT has to be measured against the expectations of the security strategy
– Some user roles are expected to have more, rather than less, general security / risk awareness
– Some user roles are expected to take specific action in response to events
• Standardization of user expectations facilitates development of appropriate metrics and establishment of meaningful industry baselines
Maturity Model for User Security Awareness
• Blissfully unaware
– Little recognition or acceptance of most information security threats
– At this level, the prevalent view is that information security is a property of IT systems and largely a matter of architecture and configuration; security is largely independent of user behavior.
• Consciously incompetent
– Some recognition that there is an information security threat, but:
• Poor risk assessment skill and intuition
• Uncertain of action needed to protect company information assets. Will do nothing rather than create further harm
• Compliant
– Aware of risks identified in company policy
• Will take action identified in company security policy
• Risk aware
– Considers information security risk in performance of company duties, but
• Unsure of appropriate action; sometimes will report incidents
• Competent & Practiced
UAT Goals re Targeted Maturity Level
• Blissfully unaware
– User: 'heads down' routine data processing roles
– No UAT?
• Compliant
– User: Industry w/ little discretionary access control (e.g. banking). Users 'locked down' by restrictive policy.
– UAT: Policy training
– Metric: Correlate policy violations with training latency
• Risk aware
– User: Industry w/ significant discretionary access (e.g. health)
– UAT: Policy training + Threat identification
– Metric: # of anomalies reported by users
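One way the maturity-level mapping above could be encoded for reporting purposes; the dataclass and field names are illustrative only, not part of the study:

# Sketch: encode each targeted maturity level with its UAT approach and metric
from dataclasses import dataclass

@dataclass
class UATProfile:
    level: str         # targeted user security-awareness maturity level
    typical_user: str  # role / industry where this level is appropriate
    uat_approach: str  # what training (if any) the level calls for
    metric: str        # how effectiveness against that expectation is measured

PROFILES = [
    UATProfile("Blissfully unaware",
               "'heads down' routine data processing roles",
               "No UAT?",
               "n/a"),
    UATProfile("Compliant",
               "little discretionary access; users 'locked down' by restrictive policy (e.g. banking)",
               "Policy training",
               "correlate policy violations with training latency"),
    UATProfile("Risk aware",
               "significant discretionary access (e.g. health)",
               "Policy training + threat identification",
               "# of anomalies reported by users"),
]

for p in PROFILES:
    print(f"{p.level}: train via '{p.uat_approach}', measure via '{p.metric}'")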
Example
• Scenario: End user sees what could be a company-owned laptop in an unlocked car in the facility parking lot. What is the end user expected to do?
• Blissfully unaware: Where company security tolerates the 'unaware user', e.g. where whole disk encryption has been implemented for all company laptops ~ nothing
• Compliant: Where company policy prescribes all security obligations ~ only what is described in policy
• Risk aware: Where the company security model depends upon users recognizing and reporting risk ~ report the observation
Conclusions
• Little accountability for UAT beyond compliance with regulatory mandates
– Few, if any, performance metrics
– Focus on cost
• Where there is no accountability, the optimal strategy is to reduce the absolute UAT expenditure
• UAT management requires new UAT performance metrics
– Correlate UAT with specific security benefits
• Lack of industry UAT performance benchmarks