Data-driven decision making (DDDM) is just what it sounds like: using facts, metrics, and data to guide strategic business decisions that align with goals, objectives, and priorities. When organizations realize the full value of their data, everyone is empowered to make better decisions.

IT analytics and DDDM exist because executives often make decisions based on a hunch. Sometimes, maybe often, depending on the executive and the context, those hunches are correct.

For example, Fred Smith has an insight into the transport business and, despite widespread skepticism, creates Federal Express. Michael Eisner hears a pitch for an offbeat game show and, based on his gut, commits millions to developing Who Wants to Be a Millionaire?

But gut instinct is not how we want to run an enterprise day to day. Data is a far more reliable foundation for decision making.

Accurate data is fresh, timely data.

In IT, if data is even a week old, let alone three months old, you'd be better off licking your finger and holding it up to the wind than deciding based on it. Data from ninety days ago doesn't tell you where your applications are, where your workloads are, or where your customers are. And it tells you nothing about potential risks from cyberattacks.

An awful lot of IT analytics is garbage in, garbage out, because old data, by the time it's used, has become false data. You can run intelligent analytics on it, but the conclusion will be no better than the false data you started with.

For example, imagine a hospital that hasn't updated its configuration management database (CMDB) in 90 days. That's like flying an airplane on 90-day-old instrument data. And even that understates the problem.

Pilots don't have to worry about a new mountain or a new skyscraper popping up every couple of weeks. But in IT, the equivalent of a new mountain can emerge in hours or days.

What types of decisions are informed by accurate endpoint data?

There's a hierarchy of data-driven decisions in operations, security, and compliance, but let's start with an operational one. In most organizations, IT refreshes software based on a vendor alert or a spike in help desk complaints.

The vendor usually alerts its customers that it's time to update because of a recently discovered vulnerability. It's often a "panic" alert. But it's the rare IT team that has enough personnel to update every piece of software that needs it, and blanket updates wouldn't be wise anyway: any change risks downstream instability. So the decision to update or not becomes subjective, not data-driven.

But with the right tool, you can know, second by second, about every application crash across all your enterprise applications. Having that data in real time means IT can say, "The vulnerability hasn't made the top ten complaints this week, but we know this application is crashing, and we'll fix it centrally." Knowing second by second what's crashing, what's degrading CPU performance, and/or what's blue-screening lets IT make a decision that's also a business decision.

In some situations, seconds-old data matters, especially with a distributed workforce. You want to be able to see instantly the vulnerabilities of every endpoint. There may be too many to fix, but when you know where they are and how critical they are, you can make informed decisions about which ones to address first. For example, there may be mitigations in place on the corporate network, but users at home are in the "Wild West," potentially exposed to every attack. A simple triage sketch along those lines follows below.
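To make that triage concrete, here's a minimal sketch, assuming a hypothetical inventory record with a severity score, an on-or-off-corporate-network flag, and a last-seen timestamp. The field names, the exposure weighting, and the one-hour freshness cutoff are all illustrative assumptions, not any particular product's schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class EndpointVuln:
    endpoint: str          # hostname or asset ID
    cve: str               # vulnerability identifier
    cvss: float            # severity score, 0.0-10.0
    on_corp_network: bool  # corporate mitigations may apply
    last_seen: datetime    # when this record was last refreshed

def triage(vulns: list[EndpointVuln],
           max_age: timedelta = timedelta(hours=1)) -> list[EndpointVuln]:
    """Rank vulnerabilities for remediation, discarding stale records."""
    now = datetime.now(timezone.utc)
    # Stale records are worse than no records: drop anything older
    # than the freshness cutoff rather than acting on false data.
    fresh = [v for v in vulns if now - v.last_seen <= max_age]

    def risk(v: EndpointVuln) -> float:
        # Home endpoints lack corporate network mitigations (the
        # "Wild West"), so weight them more heavily.
        exposure = 1.0 if v.on_corp_network else 2.0
        return v.cvss * exposure

    return sorted(fresh, key=risk, reverse=True)
```

The weighting itself would come from your own risk model; the point is that the ranking is only trustworthy if last_seen is minutes or hours old, not months.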
Just what is "fresh data" from an IT perspective?

The importance of data freshness is not uniform across IT operations. For example, if hardware is on a long refresh cycle, such as two or three years, it doesn't matter much if CPU or hard drive model data is a month old. But if you're making decisions about retiring servers or migrating workloads from a physical to a virtual environment, data that is even days old will very likely cause problems. You could be retiring a server a business unit depends on, or moving workloads that support a critical service.

With up-to-the-second data, or at least up-to-the-hour data, you're in a much better position to act.

IT analytics and digital transformation

Digital transformation is like Zero Trust: it means different things to different people. Ask 10 engineers and you'll get 12 different answers. One aspect of digital transformation is the mobility and centralization of data. Decoupling the data from the service allows organizations to switch application and service providers.

But if you look at where digital transformation efforts have "gone south," the failure is often one of process: not knowing which servers and endpoints are communicating with one another to deliver a business service. This is where the timeliness of data and digital transformation intersect.

For example, if you can crawl through every .txt file, PDF, Word doc, and Excel spreadsheet on a laptop to find data that shouldn't be there and should instead be stored centrally, it's much easier to switch your central storage provider. (A sketch of that kind of crawl appears at the end of this piece.)

Accurate data removes the risk from that decision. That's how fresh data increases agility. If it takes months of effort to move from on-premises to a hosted system, or from one host to another, the friction and cost of transfer are so high that you won't do it. With more agility and less friction, digital transformation becomes a buyer's market.

Learn how to make better business decisions with accurate, complete, and up-to-date data about all endpoints, wherever they are.
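As promised above, here's a minimal sketch of that kind of crawl, assuming a US Social Security number pattern as a stand-in for "data that belongs in central storage." The file extensions, the pattern, and the starting directory are illustrative assumptions; real content inspection of PDF, Word, and Excel files would need format-aware parsers, so they're flagged here by extension only.

```python
import re
from pathlib import Path

# Plain-text files we can scan directly; binary formats are flagged
# by extension as candidates for deeper, format-aware inspection.
TEXT_EXTS = {".txt"}
BINARY_EXTS = {".pdf", ".doc", ".docx", ".xls", ".xlsx"}

# Illustrative pattern: US Social Security numbers as a stand-in for
# data that should live in central storage, not on a laptop.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def crawl(root: Path) -> list[Path]:
    """Return files that appear to hold data that belongs centrally."""
    flagged = []
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        suffix = path.suffix.lower()
        if suffix in BINARY_EXTS:
            flagged.append(path)  # needs format-aware inspection
        elif suffix in TEXT_EXTS:
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue  # unreadable file; skip it
            if SSN.search(text):
                flagged.append(path)
    return flagged

if __name__ == "__main__":
    for hit in crawl(Path.home()):
        print(hit)
```

A fleet-wide version of this is what makes the storage-provider switch a low-risk decision: you know, before you migrate, exactly what is sitting outside central storage.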