Answering critical questions about what's happening in the data center

In large data centers, operations analytics tools can process tens of millions of IT events every day. These events stream in from security and performance monitoring tools, event logs, network devices, compute platforms, storage systems, and countless other sources. That's the world we are in today, and it may soon seem like the good old days. We are heading into a time when the IT shop will be hit with hundreds of billions of events every day: an unfathomable amount of data to process.

In an earlier post, I explored the notion that this new scale of compute will demand a new scale of analytics. We are also going to need a new class of IT analytics tools to glean insights from all of that data in real time. These tools will be critically important for maintaining the security, performance, and availability of IT services. In the years to come, we will look to them to answer critical questions about what's happening in the data center. Analytics is going to be the IT manager's main view into the end-to-end environment, including things happening in distant clouds. Let's look at a few examples of the questions that analytics will help answer.

Are there new security threats that I am blind to?

We're in an age of ever-more-sophisticated threats against data centers, threats that can have catastrophic consequences. With today's advanced persistent threats (APTs), malicious software may sit in the environment for months without being detected. All companies are now caught up in a constant fight to defend the data center. As the scale of compute grows, analytics tools will be even more present on the front lines of this battle. In particular, operations teams are going to need tools that identify security anomalies in real time and recommend defensive steps, before customers or law enforcement detect a breach.
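To make the idea of real-time anomaly flagging concrete, here is a minimal sketch, not any real product's logic: a streaming detector that keeps a rolling baseline of per-interval event counts for each source and flags an interval whose count deviates sharply from that baseline. All names, window sizes, and thresholds here are illustrative assumptions.

```python
from collections import deque

class RateAnomalyDetector:
    """Flags sources whose event rate deviates sharply from a rolling baseline.

    Purely illustrative: real operations analytics would use far richer
    features than a single per-interval event count.
    """

    def __init__(self, window=30, threshold=3.0):
        self.window = window        # number of recent intervals to remember
        self.threshold = threshold  # std-devs of deviation that count as anomalous
        self.history = {}           # source -> deque of recent counts

    def observe(self, source, count):
        """Record one interval's event count; return True if it looks anomalous."""
        hist = self.history.setdefault(source, deque(maxlen=self.window))
        anomalous = False
        if len(hist) >= 10:  # require a baseline before judging
            mean = sum(hist) / len(hist)
            var = sum((x - mean) ** 2 for x in hist) / len(hist)
            std = var ** 0.5
            if std > 0 and abs(count - mean) > self.threshold * std:
                anomalous = True
        hist.append(count)
        return anomalous

detector = RateAnomalyDetector()
for minute in range(20):
    detector.observe("auth-server", 100 + minute % 3)  # steady baseline traffic
print(detector.observe("auth-server", 100))   # normal rate -> False
print(detector.observe("auth-server", 5000))  # sudden spike -> True
```

Scaling this from a dictionary in one process to hundreds of billions of events per day is exactly the hard part the post is describing; the sketch only shows the shape of the decision, not the infrastructure.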
Existing threat intelligence tools can help, but analyzing data at such a large scale to get actionable insights is both expensive and difficult. And this challenge is going to get much harder as we move from analyzing tens of millions of IT events per day to hundreds of billions. Our only hope here is the arrival of a new generation of IT analytics technologies.

How well are my cloud-based services performing?

A big barrier to cloud adoption is the lack of insight and visibility into the cloud environment. With many cloud services, it's hard to see what's really going on out there, and that is simply unacceptable from a management standpoint. To move confidently into a cloud environment, IT managers need cross-environment monitoring that provides a clear view into the performance of the services running in the cloud. Analytics that operate at the scale of a cloud environment are going to be an essential delivery vehicle for this visibility into cloud-based services.

Is something in my environment on the verge of failing?

Given the rising costs of downtime, data center operators are going to need to make greater use of analytics tools that identify emerging issues with systems and facilities, so they can address problems proactively and avoid outages. For example, analytics tools might detect a high rate of memory errors on a system or a high rate of write errors on a disk, either of which could indicate that a component is on the verge of failure. At the facilities level, your analytics tools might alert you to a temperature spike in a particular part of your data center, which could suggest that a chiller or air handler is having problems. The goal is to get in front of problems like these, and that's going to be possible only if you have analytics that operate at the scale of the data center of the future.
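At their simplest, the proactive checks described above boil down to comparing telemetry against healthy baselines. A minimal sketch follows; the metric names and limits are hypothetical assumptions, not values from any real monitoring product:

```python
# Hypothetical healthy limits; a real system would pull these from hardware
# counters, drive SMART data, and a facilities monitoring feed.
HEALTHY_LIMITS = {
    "memory_ecc_errors_per_hour": 5,   # corrected-error rate on a DIMM
    "disk_write_errors_per_hour": 1,   # write failures on a drive
    "rack_inlet_temp_celsius": 27,     # assumed inlet-temperature ceiling
}

def emerging_issues(telemetry):
    """Return (metric, value, limit) triples that exceed their healthy limit."""
    return [
        (metric, value, HEALTHY_LIMITS[metric])
        for metric, value in telemetry.items()
        if metric in HEALTHY_LIMITS and value > HEALTHY_LIMITS[metric]
    ]

snapshot = {
    "memory_ecc_errors_per_hour": 40,  # climbing: the DIMM may be failing
    "disk_write_errors_per_hour": 0,
    "rack_inlet_temp_celsius": 31,     # hot spot: check the chiller
}
for metric, value, limit in emerging_issues(snapshot):
    print(f"ALERT: {metric} = {value} (limit {limit})")
```

The interesting analytics problem is not the comparison itself but learning what "healthy" looks like for each of millions of components and evaluating that continuously, which is where the new scale of tooling comes in.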
From threat detection and performance monitoring to proactive system and facilities management, we're going to need the help of powerful analytics software running on processors that have yet to be invented. There are, of course, many good IT analytics tools on the market today. The problem is that these tools, by and large, can't deal with the scale of the data deluge that is about to hit IT shops, or with the growing complexity of our compute models, which will become an increasingly complicated mix of on-premises, off-premises, and cloud-based systems.

I wish I could tell you that Intel has an off-the-shelf solution for the IT analytics challenges that are heading our way. The reality is, we don't have any such solution, but I can assure you that this is a problem we are studying very hard and are extremely passionate about. And I think it's also safe to say that in the months and years to come, we will work closely with our ecosystem partners to help enterprises solve this problem in a proactive manner. This is a huge challenge for all of us in enterprise environments, but it's a challenge that we can and will overcome. To see some examples of the power of analytics, visit The New Center of Possibility.