Understanding the impact of the data center autopilot

Current state of the art, and my disappointment with traditional databases, aside: I mentioned in my comments last week that the data center autopilot will have big consequences. It seems to me that there is not enough recognition of its likely impact. The tactical observations are that automation will reduce people costs, at least on a per-workload basis, and that automation will:

Minimize over-provisioning,
Help reduce downtime,
Help manage SLAs, and
Improve transparency, governance, auditing, and accounting.

That is all true, but it's not the big story: the overall strategic impact is to significantly accelerate organizational velocity. The acceleration comes partly from the above efficiencies, but far more from automated decisions being made and implemented orders of magnitude faster than manual decisions can be. Aviation autopilots do things that human pilots are not fast enough to do: they stabilize deliberately unstable aircraft such as the Lockheed F-117 Nighthawk at millisecond timescales, and they deliver shorter flight times by constantly monitoring hundreds of sensors in real time and optimally exploiting jetstreams.

Data center automation and time to value

One reason we miss this enormous shift in our general IT discussions is that, at least since the industrial revolution and the Luddite resistance, automation has carried the primary association of excluding people in order to reduce costs and errors. Those instincts may be quite correct in this case, but they tend to obscure the bigger observation: people slow things down.
We learned the lesson from automated trading systems, if we did not already know it: speed matters, and automation enables speed.

Software systems now drive our primary value-creation and value-maximization activities in areas as diverse as core business transactions, product design, business process optimization, market analysis, relationship management, professional collaboration, communications, and R&D. Collapsing the time to value of enhancements to these systems not only has an immediate impact on effectiveness, with magical compounding effects over multiple iterations; as the cost of experimentation plummets, it also enables A/B testing and fail-fast strategies. Organizations will get better at rapidly iterating their cycles of software development and IT operations (devops), and consequently they will become more creative, more risk-tolerant, more responsive to market events, and more data-driven, and they will evolve more rapidly. If every company is a software company, then the winners will be those that are best at rapidly maximizing value from software innovation.

So I find it hard to avoid the conclusion that every company should be moving toward autopilot-style data centers as fast as possible, in whatever mix of on-prem and public cloud makes sense for you. And those of us building enabling technologies, databases in our case, need to prioritize the programmable manageability of our systems. Getting this right will hugely accelerate organizational velocity.

And maybe our system administration professionals can become meta-pilots, as the aviation guys have, while the data center autopilots do the humdrum work.
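To make the "autopilot" idea concrete, here is a minimal sketch of one iteration of an autopilot-style control loop: observe a utilization signal, compare it against targets, and decide a new capacity level, on a timescale no human operator could match. The thresholds, the function name, and the proportional-scaling rule are all my own illustrative assumptions, not anything prescribed in the text; a real system would plug into its platform's metrics and orchestration APIs.

```python
import statistics

# Hypothetical thresholds -- illustrative assumptions, not from any real system.
TARGET_UTILIZATION = 0.60
SCALE_UP_AT = 0.80
SCALE_DOWN_AT = 0.40

def desired_replicas(current_replicas: int, cpu_samples: list[float]) -> int:
    """One iteration of an autopilot-style control loop:
    observe recent utilization, decide, and return the new replica count."""
    load = statistics.mean(cpu_samples)
    if load > SCALE_UP_AT:
        # Scale proportionally toward the target utilization,
        # adding at least one replica.
        return max(current_replicas + 1,
                   round(current_replicas * load / TARGET_UTILIZATION))
    if load < SCALE_DOWN_AT and current_replicas > 1:
        # Shed capacity gradually to avoid oscillation.
        return current_replicas - 1
    return current_replicas
```

Run in a tight loop against live metrics, a rule like this reacts in seconds rather than the hours a ticket-driven provisioning process takes, which is the whole point of the argument above.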