
Steps to Improving Security Efficacy

May 28, 2014

Security executives are charged with improving security monitoring, analytics, operations, and business enablement across the enterprise. One of the biggest challenges they face is improving security efficacy, especially in light of the recent uptick in data breaches and attackers' ability to circumvent existing security controls. So, what can be done?

Typically, security effectiveness is measured by an organization's ability to prevent known and unknown threats. To prevent a threat from exploiting a vulnerability and generating a security incident, organizations most commonly rely on security controls in combination with security tools. However, as the recent Target data breach illustrated, this is no longer an effective or scalable approach on its own. According to media reports, Target not only had common security controls in place, but also sophisticated security tools that detected the initial intrusion in near real time. The problem was that the data from these best-of-breed tools was not processed and correlated in time to prevent one of the largest third-party-originated data breaches in cyber history.

Unfortunately, the Target scenario is not uncommon. Many organizations possess best-of-breed tools, but continue to rely on manual processes to comb through mountains of logs. It is no wonder that critical issues are not addressed in a timely fashion. According to the Verizon 2013 Data Breach Investigations Report, 69% of breaches were discovered by a third party rather than through internal resources. The volume of security data that needs analysis has simply become too big and too complex to manage manually. It can take months, even years, to piece together an actionable picture.

In order to find the needle in the haystack, it is imperative to have all necessary data available to diagnose the patterns that point to an advanced persistent threat or sophisticated cyber-attack. Big data sets can assist in putting specific behavior into context, but there are some real technological challenges to overcome.

According to Gartner (see "Information Security Is Becoming a Big Data Analytics Problem," written by Neil MacDonald), "the amount of data required for information security to effectively detect advanced attacks and, at the same time, support new business initiatives will grow rapidly over the next five years." Gartner adds, "the amount of data analyzed by enterprise information security organizations will double every year through 2016. By 2016, 40% of enterprises will actively analyze at least 10 terabytes of data for information security intelligence, up from less than 3% in 2011."

Thus, despite the challenges big data poses for security, the first step in improving security efficacy is coverage. In the past, many organizations relied on sampling and point-in-time assessments to validate the existence and effectiveness of controls. Given the velocity and complexity of today's security threats, this style of control assurance is outdated. Improving security processes requires continuous collection and diagnostics of relevant data to test the efficacy of controls. This is why we are seeing enhanced standards and guidelines (e.g., PCI DSS 3.0, NIST SP 800-137) that prescribe continuous diagnostics of security controls.
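
To make the contrast with point-in-time sampling concrete, here is a minimal Python sketch of a continuous control-diagnostics loop. The control identifier, check function, and interval are illustrative assumptions, not prescriptions from PCI DSS 3.0 or NIST SP 800-137:

```python
import time
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ControlCheck:
    control_id: str           # e.g., an internal or PCI-style requirement ID
    description: str
    test: Callable[[], bool]  # returns True if the control passes

def password_policy_enforced() -> bool:
    # Placeholder: in practice this would query a directory service or
    # configuration store; hard-coded here to keep the sketch self-contained.
    return True

CHECKS: List[ControlCheck] = [
    ControlCheck("PCI-8.2", "Strong authentication policy enforced",
                 password_policy_enforced),
]

def run_diagnostics(checks: List[ControlCheck]) -> None:
    # One diagnostic pass: every control is tested, not a sample.
    for check in checks:
        status = "PASS" if check.test() else "FAIL"
        print(f"[{status}] {check.control_id}: {check.description}")

if __name__ == "__main__":
    # Continuous diagnostics: re-test on a fixed interval instead of
    # relying on an annual point-in-time audit.
    while True:
        run_diagnostics(CHECKS)
        time.sleep(3600)  # hourly
```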

In addition to increasing the frequency of control assessments, organizations need to extend their coverage beyond stopping threats from getting in to include preventing data from going out. At the end of the day, the real harm of a cyber-attack lies in the data exfiltration or leakage, not in the exploitation of a vulnerability itself.
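
As a rough illustration of monitoring the outbound direction, the Python sketch below flags hosts whose outbound traffic volume exceeds an assumed baseline; the hostnames, addresses, and thresholds are invented for the example:

```python
from collections import defaultdict

# Hypothetical outbound flow records: (source_host, destination_ip, bytes_out).
# In practice these would come from NetFlow, proxy, or firewall logs.
flows = [
    ("pos-terminal-01", "203.0.113.50", 1_200),
    ("pos-terminal-01", "203.0.113.50", 950_000_000),  # anomalous bulk transfer
    ("hr-workstation-07", "198.51.100.20", 4_800),
]

# Assumed per-host baseline of acceptable outbound bytes per interval.
baseline = defaultdict(lambda: 10_000_000)

# Aggregate outbound volume per host, then flag hosts exceeding baseline.
totals = defaultdict(int)
for host, dest, nbytes in flows:
    totals[host] += nbytes

for host, total in totals.items():
    if total > baseline[host]:
        print(f"ALERT: {host} sent {total:,} bytes outbound "
              f"(baseline {baseline[host]:,}) - possible exfiltration")
```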

Last, but not least, coverage needs to extend beyond the internal network and endpoints to include dispersed web properties, social media channels, mobile platforms, and third-party infrastructures. While this expansion compounds the big data challenge, cybercrime is not single-dimensional; countering it requires an integrated cyber security architecture that covers networks, endpoints, and security analytics on an enterprise scale.

This preventive, proactive model is based on interconnecting otherwise siloed security and IT tools and continuously monitoring and assessing the data they generate. The objective is a closed-loop, risk-based, automated remediation process.
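
One way to picture risk-based remediation is the short Python sketch below, which prioritizes findings by technical severity weighted by business impact. The scoring formula, assets, and weights are assumptions for illustration, not a standard risk model:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    asset: str
    issue: str
    severity: float            # technical severity, e.g., a CVSS base score (0-10)
    asset_criticality: float   # assumed business-impact weight (0-1)

def risk_score(f: Finding) -> float:
    # Illustrative model only: technical severity weighted by business impact.
    return f.severity * f.asset_criticality

findings = [
    Finding("payment-gateway", "Unpatched TLS library", 9.8, 1.0),
    Finding("test-server", "Unpatched TLS library", 9.8, 0.2),
    Finding("hr-database", "Weak admin password", 7.5, 0.8),
]

# Closed loop: remediate the highest-risk findings first, then re-assess.
for f in sorted(findings, key=risk_score, reverse=True):
    print(f"{risk_score(f):5.2f}  {f.asset}: {f.issue} -> open remediation ticket")
```

Note how the same vulnerability scores differently on the payment gateway and the test server: remediation effort flows to where the business risk actually is.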

Continuous security monitoring includes the reconciliation of assets, automation of data classification, alignment of technical controls, automation of compliance testing, deployment of assessment surveys, and automation of data consolidation. By leveraging a common control framework, an enterprise can reduce overlap, increase accuracy in data collection and analysis, and reduce redundant, labor-intensive efforts by up to 75%.
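
The Python sketch below suggests how a common control framework eliminates that overlap: each internal control maps to the external requirements it satisfies, so one test result is reused across regimes. The control IDs and requirement mappings are illustrative, not exact regulatory citations:

```python
# A common control framework maps each internal control to the external
# requirements it satisfies; all identifiers below are assumptions.
common_controls = {
    "CC-ACCESS-01": {
        "description": "Enforce unique IDs and strong authentication",
        "satisfies": ["PCI DSS 8.x", "NIST SP 800-53 IA-2", "ISO 27001 A.9"],
    },
    "CC-LOG-01": {
        "description": "Centralize and review audit logs",
        "satisfies": ["PCI DSS 10.x", "NIST SP 800-53 AU-6", "ISO 27001 A.12"],
    },
}

# Results from one round of automated compliance testing.
test_results = {"CC-ACCESS-01": "PASS", "CC-LOG-01": "FAIL"}

# Each control is assessed once and reported against every mapped
# requirement, instead of being re-assessed separately per regulation.
for control_id, control in common_controls.items():
    result = test_results[control_id]
    for requirement in control["satisfies"]:
        print(f"{requirement}: {result} (via {control_id})")
```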

This approach allows for more frequent data assessments (e.g., on a weekly basis) and requires security data automation: aggregating and normalizing data from a variety of sources such as security information and event management (SIEM) systems, asset management, threat feeds, and vulnerability scanners. The benefits are situational awareness that can expose exploits and threats in a timely manner (preventing a Target scenario) and historical trend data to assist in predictive security. Ultimately, this model can significantly improve an organization's security efficacy.
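
To hint at what that normalization step involves, the Python sketch below maps records from different hypothetical sources onto one common event schema; every field name is an assumption for illustration, not any vendor's actual API:

```python
def normalize(source: str, record: dict) -> dict:
    """Map source-specific fields onto one common event schema. All field
    names here are assumptions for illustration."""
    if source == "siem":
        return {"host": record["hostname"], "signal": record["rule_name"],
                "severity": record["priority"], "source": source}
    if source == "vuln_scanner":
        return {"host": record["ip"], "signal": record["cve"],
                "severity": record["cvss"], "source": source}
    if source == "threat_feed":
        return {"host": record.get("target", "unknown"),
                "signal": record["indicator"],
                "severity": record["confidence"], "source": source}
    raise ValueError(f"unknown source: {source}")

raw_events = [
    ("siem", {"hostname": "pos-terminal-01",
              "rule_name": "malware beacon", "priority": 9}),
    ("vuln_scanner", {"ip": "10.0.4.17", "cve": "CVE-2014-0160", "cvss": 7.5}),
]

# Once normalized, events from different tools can be correlated on shared
# keys (here, "host") rather than inspected in separate consoles.
for src, rec in raw_events:
    print(normalize(src, rec))
```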


Torsten George is Vice President of Worldwide Marketing and Products at integrated risk management software vendor Agiliance. With over 20 years of global information security experience, Torsten frequently presents and provides commentary on compliance and security risk management strategies, data breaches, cyber security, and incident response best practices. Torsten has held executive roles with ActivIdentity (now part of HID Global), Digital Link, and Everdream Corporation (now part of Dell). He holds a Ph.D. in Economics and an M.B.A.


Tags: Industry Insights, Network Security, Incident Management, Security Infrastructure, Management Strategy
