Salesforce.com President and COO Bret Taylor believes the SaaS vendor has made “probably the most significant technological shift” since its CRM platform first launched 21 years ago.
But Taylor wasn’t talking about the company’s agreement to acquire enterprise collaboration hub Slack for $27.7 billion: that came the day before.
Rather, he was talking about Hyperforce, the fruit of a two-year project to re-architect the Salesforce platform from the ground up to work on public cloud infrastructure all around the world, enabling enterprises to choose where their data is hosted.
“There’s a ton of really cool technical details that I’d love to get to right now, but I want to focus on how it’s going to help you, our developers, our admins, and our customers,” Taylor told viewers of the company’s Dreamforce online customer event on Dec. 2.
“Whether you’re a small business or you’re a multinational in a heavily regulated industry like financial services or the public sector, with Hyperforce you can make Salesforce your engine for growth,” he said.
Daniel Newman, principal analyst at Futurum Research, put it more succinctly: “Salesforce is finally saying, ‘We know the world is hybrid, and we are going to enable our users to embrace that.’”
Salesforce, as a CRM software-as-a-service provider, has hosted its customers’ data about their customers in its own cloud for years.
Over the years that has spared enterprises a lot of worry about managing data centers and network infrastructure. But as governments around the world tighten up data protection legislation and make data sovereignty an issue of national security, knowing their data is in the cloud is no longer a reassurance for many enterprises. They want more control over which portions are hosted on premises, which in the cloud, and which across both, according to Newman.
First mover disadvantaged
Being one of the first companies into the cloud left Salesforce with some disadvantages, according to Holger Mueller, principal analyst at Constellation Research. Its choices of Oracle's database and its own Apex programming framework made sense at the time but lack flexibility today, he said.
Salesforce had to build its own cloud infrastructure because infrastructure-as-a-service players such as Amazon Web Services, Microsoft Azure or Google Cloud Platform weren’t around back then.
With Hyperforce set to enable the move from its own infrastructure to IaaS, Salesforce can switch much of its cost base from capital expenditure to operational expenditure, leaving it better able to respond to the ups and downs of its customers’ businesses, Mueller said.
“Moreover, modern application architectures need the cheap compute and storage in the cloud to run AI/ML and big data processes,” something not possible on Salesforce’s first-generation infrastructure, he said.
The devil in the details
Mueller and Newman are still waiting for Salesforce’s Taylor to get to those cool technical details he hinted at.
Newman noted, “They haven’t announced which clouds they’ll run it in yet.”
For Mueller, “A lot more information is needed, like what products run in what country on what IaaS, so Salesforce has to communicate a little more.”
Salesforce’s Taylor did offer a few snippets, including that Hyperforce will enable customers to choose where they store their data and to access compute capacity flexibly, according to their needs. “It’s live in India, it’s live in Germany, and we’re rolling out in 10 countries next year,” he said. He also referred to partnerships with “all of the amazing public cloud companies around the world” that would allow the company to deliver service in every region.
In addition, every Salesforce app, customization, and integration will run on Hyperforce, regardless of cloud, he said: “It’s 100% backwards compatible. Your apps will work with no changes. You can benefit from all of this automatically.”
The security of apps running in the public cloud can be a big source of worry for CIOs, but Taylor said that with Hyperforce, “We built trust right into the platform.” In practice, that means that there will be limits to the data users can access, and that data will be encrypted in transit and at rest.