Every business wants more data. Data on their customers, competition, operations, processes, employees, inventory and more. Data can be used to make better-informed business decisions and provide strategic insights that give your company a competitive advantage, whether by improving efficiency, enhancing the customer experience, or refining market strategy. Its uses are limitless. Over the last decade, computing power has advanced to the point where generating and storing massive amounts of data has become highly cost-efficient.
Amassing business data is similar to a dog successfully chasing a car – now that we’ve caught it, what do we do with it? With all that data available, most businesses struggle to figure out how to take advantage of it. According to Forrester, up to 73% of data within an enterprise goes unused for analytics. We are so used to extracting targeted information from data that we simply ignore what we don’t understand and throw it away as noise. This problem is prevalent in every industry, but especially in the security world. Security teams are overwhelmed with the vast amounts of data generated from firewalls, intrusion detection systems, network appliances and other devices. It’s unrealistic to expect security teams to interpret all of this data. We unintentionally end up focusing on what we already know how to analyze and ignoring what we don’t.
Typical alerting systems are configured to raise alarms, but only when they encounter a defined binary event or a threshold is crossed. For example, if three or more failed authentication attempts are detected in succession, the system generates an alert. Yet successful authentication attempts are mostly categorized as business as usual and ignored, even when they occur at off times or from unexpected locations. The current mean time to detect a breach is over six months. Most organizations have all the data they need to identify a breach much faster than that, yet they are still unable to detect and react to a breach in even a semi-reasonable amount of time. This is due to:
- The volume and velocity of the data being generated
- Not looking for patterns in all of the data available – the unknown unknowns
- Not having the proper context for the data available
If your system is ever breached, you don’t need to look at the failed authentication events – you need to look for anomalies in the successful ones!
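To make the contrast concrete, here is a minimal Python sketch of both approaches side by side. The event records, field names, office hours and trusted network prefixes are all hypothetical placeholders, not a real log format:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical authentication events; field names and values are
# illustrative only, not a real log format.
events = [
    {"user": "ops1", "success": True,  "ts": "2023-04-02T03:14:00", "src": "203.0.113.7"},
    {"user": "ops1", "success": False, "ts": "2023-04-02T09:01:00", "src": "10.0.0.5"},
]

FAIL_THRESHOLD = 3  # the classic binary rule: N failures in succession

def threshold_alerts(events):
    """Traditional rule: alert on consecutive failed logins per user."""
    streak = defaultdict(int)
    for e in events:
        if e["success"]:
            streak[e["user"]] = 0          # a success resets the counter...
        else:
            streak[e["user"]] += 1
            if streak[e["user"]] >= FAIL_THRESHOLD:
                yield ("failed-auth-burst", e)

def anomaly_alerts(events, usual_hours=range(8, 19), known_nets=("10.",)):
    """...but the *successful* logins are where a breach shows up.

    Flags successes at odd hours or from unfamiliar networks; in
    practice these baselines would be learned from historical data.
    """
    for e in events:
        if not e["success"]:
            continue
        hour = datetime.fromisoformat(e["ts"]).hour
        if hour not in usual_hours or not e["src"].startswith(known_nets):
            yield ("suspicious-success", e)

for kind, e in anomaly_alerts(events):
    print(kind, e["user"], e["ts"], e["src"])
```

Run against the same stream, the threshold rule stays silent on the 3 a.m. login from an unfamiliar address, while the anomaly check surfaces it immediately.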
Most organizations are well down the path of capturing and storing all of their data for future analytics in data lakes: large repositories of raw data in any format. Capturing, storing and securing that data is key. Once the data is available, it can be analyzed and its value maximized using a variety of methods. This is where the fun (and benefit) starts!
On HPE NonStop servers, XYGATE Merged Audit (XMA) gathers, normalizes and stores security audit data from both the system and its applications. Merged Audit is your central repository for all NonStop security data. This is your NonStop Security Data Lake. In some environments, the data XMA gathers can amount to tens of millions of records per system, per day. With that kind of volume, you might think it’s nearly impossible to draw all of the value out of this massive amount of data. The data can be fed to an external Security Information and Event Management (SIEM) system or to your Security Orchestration, Automation and Response (SOAR) solution for alerting, but most of it likely falls into that 73% that is treated as noise and does nothing but occupy disk space.
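As one illustration of how value can be mined from that “noise,” consider a simple frequency baseline over normalized audit records. The record layout below is invented for the example and is not the actual XMA schema; the point is that rare combinations of subsystem, user and action only stand out when all of the data is counted:

```python
from collections import Counter

# Hypothetical normalized audit records. The real XMA schema differs;
# these fields exist only to illustrate the idea.
records = [
    {"subsystem": "SAFEGUARD", "user": "BATCH.JOB1", "action": "LOGON"},
    {"subsystem": "SAFEGUARD", "user": "SUPER.OPER", "action": "ALTER-USER"},
    # ... in practice, tens of millions of records per system, per day
]

def rare_combinations(records, max_seen=5):
    """Count (subsystem, user, action) tuples and return the rarest ones.

    Combinations that appear only a handful of times in a full day of
    traffic are exactly the events a fixed threshold rule never examines.
    """
    counts = Counter((r["subsystem"], r["user"], r["action"]) for r in records)
    return [combo for combo, n in counts.items() if n <= max_seen]

for combo in rare_combinations(records):
    print("review:", combo)
```

A single counting pass like this stays linear in the number of records, so it remains cheap even at high volume, and it needs no predefined rule: the rarity itself is the signal.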