What’s logs got to do with IT?

Chima Njaka is based in Palo Alto, California and has broad experience in developing, selling and marketing technical solutions that include scalable log and security intelligence platforms for the enterprise and cloud. Chima takes pride in helping customers work with terabytes and more of big data coming from their own IT infrastructure. The following is based on a presentation he gave at TUCON2012.

Many of the conversations about big data focus on information flowing into the organization from somewhere else. Less widely discussed is the enormous amount of information coming out of every enterprise’s own IT infrastructure, information that is just as critical, just as time sensitive, and that offers enormous insight. Big data isn’t complete without log data.

What is log data?

Log data is, in effect, a non-stop stream of tweets coming from IT assets, generated by almost every element of an enterprise’s infrastructure. By managing this data proactively instead of only when something goes wrong, organizations mitigate risk, ensure service availability and improve operational efficiency.

This data provides an immutable fingerprint of user and system activity: at the lowest level, a failed logon; at higher levels, a significant deviation from baselines, a runaway application or an actual security breach. Logs leave behind a trail that can be followed to answer questions like, “Who did what, and when?”, “Are we following regulations?”, “Is our network performing optimally?” and “Is our data safe and secure?” These questions are all critical to business operations, and an organization that isn’t paying attention to them can be brought down.
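To make the “Who did what and when?” idea concrete, here is a minimal Python sketch that scans syslog-style authentication lines for failed logons. The sample lines, hostnames and IP addresses are invented for illustration; real log formats vary by system and vendor.

```python
import re

# Hypothetical syslog-style auth lines; real formats vary by system.
LOG_LINES = [
    "Nov  3 11:32:01 web01 sshd[4211]: Failed password for admin from 203.0.113.9",
    "Nov  3 11:32:04 web01 sshd[4211]: Failed password for admin from 203.0.113.9",
    "Nov  3 11:32:09 web01 sshd[4215]: Accepted password for chima from 198.51.100.7",
]

FAILED = re.compile(r"Failed password for (\S+) from (\S+)")

def failed_logons(lines):
    """Yield (user, source_ip) for every failed logon found."""
    for line in lines:
        m = FAILED.search(line)
        if m:
            yield m.group(1), m.group(2)

for user, ip in failed_logons(LOG_LINES):
    print(f"failed logon: user={user} from {ip}")
```

The same pattern — match, extract, aggregate — scales up to spotting repeated failures from a single source, which is often the first visible sign of a brute-force attempt.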

Getting specific, log data gives us a view into:

  • Threat management – Logs contain the evidence of a security event, but they also provide information before and after an attack begins that can be used to head off the problem as it happens
  • Regulatory compliance controls – Log data contains the evidence that supports PCI DSS, HIPAA, SOX, ISO and other audits by demonstrating internal and external policy adherence. Dashboards show where and when compliance is being met, allowing the organization to put effort where it is needed to shore up requirements.
  • Cloud auditing – Cloud computing is getting more complex and finding more uses. This kind of growth needs to be monitored and managed to ensure everything works as advertised.
  • Technology utilization and performance – Operational performance monitoring is key to getting the most out of enterprise assets.

Truly big data

Log data is generated physically, virtually and in the cloud, and it is enormous. According to Gartner, a medium-sized enterprise creates 20,000 messages per second of operational data in activity logs. In a single 8-hour day that comes to more than 575 million messages, adding up to more than 150 GB of operational data. Without automation technology, collecting, moving and analyzing that data is impossible. There has to be a big filter for this big data, one that can sort through it and pass key events on to other systems, where they can be used to manage opportunities, threats and efficiency in the best ways possible.
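The back-of-the-envelope arithmetic behind those figures is easy to check. The average message size below (~280 bytes) is our own assumption for illustration, not part of the Gartner figure.

```python
# Back-of-the-envelope check of the Gartner figure cited above.
MSGS_PER_SEC = 20_000
SECONDS_PER_DAY = 8 * 60 * 60          # one 8-hour working day
AVG_MSG_BYTES = 280                    # assumed average size, for illustration

messages_per_day = MSGS_PER_SEC * SECONDS_PER_DAY
gb_per_day = messages_per_day * AVG_MSG_BYTES / 1e9

print(f"{messages_per_day:,} messages/day")   # 576,000,000 messages/day
print(f"~{gb_per_day:.0f} GB/day")            # ~161 GB/day
```

At that rate a single enterprise accumulates tens of terabytes of log data per year, which is why manual collection and review is a non-starter.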

The equation becomes 1 + 1 = 3 when log data can be blended in real time with loyalty, supply chain, marketing, ERP, social and clickstream information. If you’re not managing your log data, what’s hiding in your logs?



Categories: Data Analytics / Big Data

Author: Jeanne Roué-Taylor

I'm fascinated by disruptive technology and its impact on our world. I manage sales operations for an excellent startup with a unique team of highly experienced data scientists.


3 Comments on “What’s logs got to do with IT?”

  1. November 3, 2012 at 11:32 am #

    Hi Jeanne and Chima,

    Thanks for pointing out the power of log data!

    Perhaps it is interesting for you to know that there is a field called ‘Process mining’ (see http://www.processmining.org/ and http://fluxicon.com/) that takes precisely such log data as the starting point to find out how processes are actually executed, whether regulations are followed, where the performance can be improved, etc.

  2. Jeanne Roué-Taylor
    November 3, 2012 at 11:38 am #

    Process mining is a fascinating idea but I don’t have examples of where it’s being done or the value it’s providing. Could you cite some of this information here? Thanks for your comment.

  3. November 3, 2012 at 11:51 am #

    Thank you for your reply. Sure, for example, you can take a look at this case study from analyzing the service process of an electronics manufacturer http://fluxicon.com/s/casestudy3/. To get a better idea of which kind of data is needed to do process mining, you can read our blog post on data requirements for process mining here: http://fluxicon.com/blog/2012/02/data-requirements-for-process-mining/
