Cyber Resiliency
===Benchmarking===
Benchmarking refers to gathering data from similar organizations for comparison with one's own organization's cybersecurity measures.<ref>[https://www.logsign.com/blog/what-are-cyber-security-measures-of-effectiveness/ Cybersecurity effectiveness measures, Logsign]</ref> Organizations compare these metrics with those of their peers, focusing in particular on the following dimensions (a minimal comparison along these lines is sketched after the list):<ref>[https://www.darkreading.com/attacks-breaches/cyber-resilience-benchmarks-2020 Cyber Resilience Benchmarks 2020, Dark Reading]</ref>
# Speed: how quickly the organization can detect a security breach, mobilize a response, and return to business as usual
# Resiliency: how many systems were compromised or taken offline, and for how long
# Accuracy: how precisely cyber incidents are identified
# Impact: how long attacks last, how much disruption they cause, and how high the resulting costs are for the organization
# Automation: how much of attack detection and containment relies on people rather than automated tooling
# Data privacy regulation: how many regulatory violations occurred and how many fines were incurred
# Collaboration: how often and how effectively the organization works with law enforcement and other parts of the security sector
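Each of these dimensions ultimately reduces to one or more numeric indicators (for example, mean time to detect, number of compromised systems, or the share of alerts handled automatically) that can be checked against peer-group values. The Python sketch below is purely illustrative: the metric names, peer medians, and the "lower is better" convention are assumptions for the example, not figures taken from any cited benchmark.

<syntaxhighlight lang="python">
# Illustrative sketch: comparing an organization's cyber-resiliency metrics
# against hypothetical peer-group medians. All names and values are assumed
# for illustration; real benchmarks would come from an actual peer data set.

from dataclasses import dataclass


@dataclass
class Metric:
    name: str               # e.g. "mean_time_to_detect_hours"
    value: float            # this organization's measurement
    peer_median: float      # median value across the peer group
    lower_is_better: bool   # True for times, costs, violation counts

    def beats_peers(self) -> bool:
        """Return True if this organization outperforms the peer median."""
        if self.lower_is_better:
            return self.value < self.peer_median
        return self.value > self.peer_median


def benchmark_report(metrics: list[Metric]) -> None:
    """Print a one-line verdict per metric: ahead of or behind peers."""
    for m in metrics:
        verdict = "ahead of peers" if m.beats_peers() else "behind peers"
        print(f"{m.name}: {m.value} vs peer median {m.peer_median} -> {verdict}")


if __name__ == "__main__":
    # Hypothetical figures covering the speed, impact and automation dimensions.
    benchmark_report([
        Metric("mean_time_to_detect_hours", 18.0, 24.0, lower_is_better=True),
        Metric("mean_time_to_recover_hours", 40.0, 36.0, lower_is_better=True),
        Metric("share_of_alerts_auto_triaged", 0.55, 0.40, lower_is_better=False),
    ])
</syntaxhighlight>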
==Challenges==