Resource-Intensive Analysis at the Push of a Button

Big Data and the cloud can enable computing-intensive analyses without companies needing to allocate their own IT resources.

Deploying Big Data and cloud solutions gives organizations an unprecedented ability to conduct analysis flexibly and quickly, and to identify and respond to business trends. Organizations can also minimize investments in data center hardware by conducting cloud-based analysis of Big Data.

The financial and technical research firm Technology Business Research recently reported in Network World that Big Data analytics are driving rapid growth for public cloud computing vendors, with revenues for the top 50 public cloud providers shooting up 47% in the fourth quarter of 2013. Companies are increasingly turning to public, private and hybrid public/private cloud infrastructure to reduce the cost of computing while simultaneously increasing agility.

Large-scale deployments of Big Data for resource-intensive analytics are delivering significant returns on investment. For example, the Wall Street Journal reports that United Parcel Service (UPS) will soon move to dynamic package routing based on the packages to be delivered, customer preferences, and real-time traffic information. Routes planned with this new system will be shorter, and UPS estimates that cutting just one mile per driver per day saves the company $50 million over the course of a year.
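The article does not break that figure down, but a rough back-of-the-envelope sketch shows how it can add up; the fleet size, number of delivery days, and cost per mile below are assumptions chosen purely for illustration, not figures from UPS or the Wall Street Journal.

```python
# Back-of-the-envelope sketch of how one mile saved per driver per day
# can add up to roughly $50 million a year.
# All inputs are assumptions for illustration, not figures from UPS.
drivers = 55_000               # assumed number of daily delivery routes
delivery_days_per_year = 260   # assumed delivery days per year
cost_per_mile = 3.50           # assumed fully loaded cost per mile (fuel, maintenance, labor), USD

miles_saved_per_year = 1 * drivers * delivery_days_per_year
annual_savings = miles_saved_per_year * cost_per_mile

print(f"Miles saved per year: {miles_saved_per_year:,}")
print(f"Estimated annual savings: ${annual_savings:,.0f}")
```

Under these assumed values, one mile per driver per day works out to roughly 14 million miles and about $50 million per year, which is why small per-route efficiencies matter at this scale.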

To avoid the cost of building out data center infrastructure to support Big Data analytics, organizations can take advantage of the efficiencies of integrating Big Data and cloud resources. For example, one IT conglomerate relied on outdated procurement processes, which resulted in an 18-hour lead time for reports, suppliers dissatisfied by overdue payments, and decreased free cash flow.

The breakthrough arrived with Big Data processing in the cloud based on Dynamic Services for SAP HANA from T-Systems. The company was able to reduce its report lead time to two hours, gain more than $2 million in savings thanks to increased transparency, and increase free cash flow by 10%. Relying on Big Data analytics in the cloud allowed the IT conglomerate to increase agility and automate resource-intensive analysis.
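The article does not describe how such procurement reports are built, so the following is only a minimal sketch of the general kind of analysis involved: joining hypothetical invoice and payment records to flag overdue supplier payments. The data and column names are invented for illustration and do not reflect the actual SAP HANA or T-Systems implementation.

```python
import pandas as pd

# Hypothetical invoice data such as an ERP system might export;
# not the actual SAP HANA / T-Systems data model.
invoices = pd.DataFrame({
    "supplier": ["Acme", "Globex", "Acme", "Initech"],
    "amount": [12_000, 8_500, 4_200, 19_900],
    "due_date": pd.to_datetime(["2014-03-01", "2014-03-10", "2014-03-15", "2014-04-01"]),
    "paid_date": pd.to_datetime(["2014-03-20", None, "2014-03-14", None]),
})

as_of = pd.Timestamp("2014-03-25")

# An invoice counts as overdue if it is unpaid past its due date, or was paid late.
effective_paid = invoices["paid_date"].fillna(as_of)
invoices["days_overdue"] = (effective_paid - invoices["due_date"]).dt.days.clip(lower=0)

# Summarize overdue exposure per supplier -- the kind of report that
# took 18 hours under the old batch process.
report = (invoices[invoices["days_overdue"] > 0]
          .groupby("supplier")
          .agg(overdue_amount=("amount", "sum"),
               max_days_overdue=("days_overdue", "max")))
print(report)
```

The point of moving this kind of workload to the cloud is not the logic itself, which is simple, but running it over millions of records on demand instead of waiting for an overnight batch window.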

The cloud is becoming essential for meeting Big Data scalability requirements. The number of data sources that companies need to analyze is continually growing: ERP applications, CRM applications, machine data, production data, and data from social networks are just some of the many sources mined for insights that can improve business operations. Various departments throughout the organization, such as Marketing, Sales, Finance and Manufacturing, all want analysis performed so they can do their jobs better.
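As a simple, hypothetical illustration of such multi-source analysis, the sketch below joins an assumed ERP order extract with an assumed CRM account extract so that Sales and Manufacturing can each slice the same combined view; the systems, fields, and figures are placeholders, not part of any specific product.

```python
import pandas as pd

# Hypothetical extracts from two of the source systems named above;
# the systems, fields, and figures are placeholders for illustration.
erp_orders = pd.DataFrame({
    "customer_id": [101, 101, 202, 303],
    "order_value": [5_000, 1_200, 7_800, 450],
    "plant": ["Hamburg", "Lyon", "Lyon", "Porto"],
})
crm_accounts = pd.DataFrame({
    "customer_id": [101, 202, 303],
    "segment": ["Enterprise", "SMB", "SMB"],
    "region": ["DACH", "France", "Iberia"],
})

# Join the sources once in a shared, cloud-hosted view, then let each
# department slice it: Sales by customer segment, Manufacturing by plant.
combined = erp_orders.merge(crm_accounts, on="customer_id")
print(combined.groupby("segment")["order_value"].sum())   # Sales view
print(combined.groupby("plant")["order_value"].sum())     # Manufacturing view
```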

Unfortunately, the technical resources needed to conduct the analysis that departmental users request are often lacking; the existing IT infrastructure may not yet be ready for the optimized use of Big Data. But by integrating cloud resources and Big Data, the enterprise can conduct resource-intensive analysis more swiftly while increasing business agility. For additional examples of using Big Data and the cloud for resource-intensive analysis initiatives, read more about T-Systems solutions for Big Data in Logistics, Efficient Fleet Management and Smarter Procurement.
