The scope of IT security analytics is broad. In an ideal world, threat intelligence, provided in advance, would prevent IT security incidents from occurring in the first place. However, complete mitigation will never be possible and incidents, often with associated data breaches, are inevitable. Post-event clear-up requires intelligence gathering too, and the quicker it can be done, the better the chance of finding the smoking gun. The net result of trying to speed up incident response is an increasing capability to use intelligence as an event is occurring. As one vendor, Cisco’s Sourcefire, puts it: the need for security intelligence is “before, during and after” an incident.
In the past, there have been distinct products in each area, but the boundaries between them are blurring as the vendors involved extend their reach; in some cases they now compete where they did not before, but they also often co-operate to share intelligence. The more quickly intelligence can be gathered, the more likely it is to be put to use for pro-active defence rather than post-event clear-up; this is the area of real-time security analytics.
First, let’s look at the before. Threat intelligence is still the lifeblood of the IT security industry. It includes black lists of common spam emails, malware signatures and dodgy URLs, as well as white lists of known good stuff (applications you want your users to be running or websites you are happy for them to visit). All this remains a key part of protecting IT users and relies on the vast threat intelligence gathering networks that sit at the core of most IT security companies. Examples include Cisco’s Advanced Malware Protection (from its Sourcefire acquisition, now integrated across the Cisco security portfolio); the Symantec Protection Network; McAfee’s Global Threat Intelligence; and Trend Micro’s Smart Protection Network. All IT security vendors have access to such resources at some level; part of the power of these networks is that they are kept up to date by gathering intelligence from, and sharing it with, huge customer bases. However, many now accept that intelligence gathered in advance is never going to stop the most insidious threats; however good such networks are, unwanted security breaches will still occur.
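In essence, such feeds drive simple lookups at enforcement points. The sketch below is a minimal illustration of the idea, assuming hypothetical feed names and structures; it is not any vendor’s actual API.

```python
# A minimal sketch of black/white list lookups against threat intelligence
# feeds. All names and feed contents here are illustrative assumptions.
import hashlib

KNOWN_BAD_HASHES = {"<sha256 of a known malware sample>"}   # black list: malware signatures
BLOCKED_URLS = {"malware.example.com"}                      # black list: dodgy URLs
KNOWN_GOOD_APPS = {"winword.exe", "excel.exe"}              # white list: approved applications

def file_verdict(path: str) -> str:
    """Hash a file and look it up in the black list."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return "block" if digest in KNOWN_BAD_HASHES else "unknown"

def url_verdict(host: str) -> str:
    return "block" if host in BLOCKED_URLS else "allow"

def app_verdict(executable: str) -> str:
    # White listing inverts the default: anything not known good is blocked.
    return "allow" if executable.lower() in KNOWN_GOOD_APPS else "block"
```

Note the asymmetry in the last function: black lists block the known bad and allow everything else, while white lists allow the known good and block everything else, which is why white listing is the stricter posture.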
So let’s now look at what may need to be done after: the worst-case scenario, when an event has occurred and systems and/or data have been compromised. The requirement now is to understand the extent of the damage. This is the world of IT forensics: preparing reports for internal investigations, responding to regulators and, in some cases, communicating with crime investigators. Examples of relevant incidents include the discovery of unknown malware (which may, or may not, have been egressing data), evidence of hacking and, in some cases, the suspicious behaviour of employees.
Well-established vendors in forensics include Guidance Software, Access Data, Stroz Friedberg and Dell Forensics. In 2013, Guidance released a new version of its EnCase product called EnCase Analytics. Ultimately, many of the clues to what has happened lie on servers, storage systems and end-user devices, so whilst EnCase Analytics is a network-based tool, these end points are its focus. The volumes of data involved can be huge; as Guidance puts it, this is where “big data meets digital investigations”.
To complete its reports, EnCase Analytics needs kernel-level access across multiple operating systems to inspect registries, system data, memory, hidden data and so on. Network and security appliance log files are also of use; to access information from these, Guidance can take feeds from SIEM (security information and event management) tools, which are discussed below. Guidance has hundreds of enterprise customers that use its tools; one benefit is being able to offer ready-customised reports for specific regulatory regimes such as PCI DSS, the UK Data Protection Act and the mooted EU Data Protection Law.
Access Data’s Cyber Intelligence and Response Technology (CIRT) provides host and network forensics as well as the trickier-to-address volatile memory, processing data collected from all these areas to provide comprehensive insight into incidents. With some new capabilities, Access Data is re-packaging this as a platform it calls Insight, to provide continuous automated incident resolution (CAIR). The new capabilities include improved malware analysis (what might this software have done already; what could it do in the future?), more automated responses (freeing up staff to focus on exceptions) and real-time alerts. This is all well beyond historical forensics, moving Access Data from after to during and even some before capability. Like Guidance and other forensics vendors, Access Data relies on SIEM vendors for some of its intelligence.
In the past, SIEM has typically been an after technology too. Most SIEM vendors come from a log management background: the collection and storage of data from network and security system log files for later analysis. Many of the major IT security vendors have entered the SIEM market via acquisitions: HP of ArcSight in 2010, IBM of Q1 Labs in 2011, McAfee of Nitro Security in 2011, EMC’s RSA of NetWitness in 2011 and KEYW’s Hexis of Sensage in 2012. Other vendors include LogRhythm, Red Lambda and Trustwave. Splunk is often included in lists of SIEM vendors, but its focus is even broader, using IT operational intelligence to provide commercial as well as security insight. This is the subject of a new Quocirca report, Masters of machines, sponsored by Splunk.
As with forensics, the volumes of data are so big that SIEM is increasingly referred to as a ‘big data problem’. It fits the definition well, if you go by the five Vs of big data: volume, variety, velocity, value and veracity. There is certainly lots of data involved (volume) and it comes from a range of sources (variety), often being enriched with data from still other sources (for example, user and device information, content classification data and, indeed, threat intelligence networks). However, it is the increasing capability to use SIEM data in real time that ticks the velocity box, and this is turning SIEM into a during technology. Quocirca covered this in its 2012 report Advanced cyber-security intelligence, sponsored by LogRhythm. Anything that minimises the impact of security incidents clearly has value, and veracity comes from the truth exposed through deep insight.
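To illustrate the velocity point, the sketch below shows the kind of enrichment a SIEM might apply to a single log event as it streams in, attaching user context and threat intelligence before scoring it; the field names, feeds and threshold are illustrative assumptions, not any product’s implementation.

```python
# A hedged sketch of real-time SIEM-style enrichment of one log event.
import json

THREAT_FEED_BAD_IPS = {"203.0.113.50"}          # known-bad addresses from an intelligence feed
DEVICE_OWNERS = {"10.0.0.12": "jsmith"}         # asset inventory: device -> user

def enrich(raw_line: str) -> dict:
    """Parse one log event and attach user and threat-intelligence context."""
    event = json.loads(raw_line)
    event["user"] = DEVICE_OWNERS.get(event["src"], "unknown")
    event["dst_known_bad"] = event["dst"] in THREAT_FEED_BAD_IPS
    return event

def verdict(event: dict) -> str:
    # A large transfer to a known-bad address is flagged as it happens,
    # not reconstructed weeks later from stored logs.
    if event["dst_known_bad"] and event.get("bytes", 0) > 10_000:
        return "alert"
    return "ok"

line = '{"src": "10.0.0.12", "dst": "203.0.113.50", "bytes": 48210}'
print(verdict(enrich(line)))   # -> alert
```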
To use intelligence from a range of sources in real time to identify and mitigate threats as they are occurring is the Holy Grail of IT security. Of course, there are plenty of measures that can be taken: running suspicious files in sandboxes (witness the rapid growth of FireEye, a leading vendor in this space); only allowing known good files to run (for example with white listing technology from Bit9, another vendor that has upped its ante for the during with its recent merger with Carbon Black); blocking access to dangerous areas of the web, a constantly moving target (URL filtering from Websense, Proofpoint and others); or judicious checking of content in use (content inspection and redaction from Clearswift and other vendors in the data loss prevention/DLP space).
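To make the last of these concrete, here is a minimal sketch of the content inspection and redaction idea behind DLP tools, using a deliberately naive pattern; real products use far richer detection than this.

```python
# A hedged sketch of DLP-style content inspection and redaction.
import re

# Crude pattern for something that looks like a 13-16 digit payment card
# number, with optional spaces or dashes between digits (illustrative only).
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def redact(text: str) -> str:
    """Replace anything resembling a card number before the content leaves."""
    return CARD_PATTERN.sub("[REDACTED]", text)

print(redact("Order ref 4111 1111 1111 1111 confirmed"))
# -> Order ref [REDACTED] confirmed
```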
Sandboxing, white listing, URL filtering and content inspection are all point products that help towards the broader aspiration of real-time mitigation. Supplementing them with analytics across a wide range of sources during an attack provides more extensive protection (a simple sketch of the first of these follows the list). Examples include:
- Identifying unusual traffic between servers, which can be a characteristic of undetected malware searching data stores.
- Matching data egress from a device with access records from a suspicious IP address, user or location.
- Preventing the non-compliant movement of data (which may simply be down to an employee being ignorant of the rules).
- Linking IT security events with physical security systems (for example, maintenance of plant infrastructure may be restricted to certain employees who are known to be on the premises).
- Identifying unusual access routes; for example, some databases are normally accessed only via certain applications and not directly by users.
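As promised above, here is a minimal sketch of the first example: flagging unusual server-to-server traffic by comparing each connection against a learned baseline of normal peers. The structures and threshold-free rule here are illustrative assumptions only, not any product’s method.

```python
# A minimal sketch of baselining server-to-server traffic to spot anomalies.
from collections import defaultdict

baseline = defaultdict(set)   # server -> peers seen during a learning period

def learn(src: str, dst: str) -> None:
    baseline[src].add(dst)

def is_unusual(src: str, dst: str) -> bool:
    """True if src has never been observed talking to dst before."""
    return dst not in baseline[src]

def watch(flows):
    # flows: iterable of (src, dst) connection records drawn from network logs
    for src, dst in flows:
        if is_unusual(src, dst):
            yield (src, dst)   # candidate for investigation

learn("app01", "db01")   # normal: the app server talks to its database
print(list(watch([("app01", "db01"), ("app01", "fileserver03")])))
# -> [('app01', 'fileserver03')]: a peer never seen before, as undetected
#    malware searching data stores might produce
```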
So, in general terms the news is good: the vendors that aim to protect IT infrastructure are upping the ante in the arms race with attackers (and it always will be an arms race). More and more are making use of their ability to process and analyse large volumes of data in real time to better protect IT systems. The bad news is that there is no silver bullet and there never will be; a range of security technologies will be required to provide state-of-the-art defences and there will be no standing still. Those who would steal your data are moving the goalposts all the time, and they will be doing so before, during and after their attacks.