Intelligent automation technologies, including robotic process automation (RPA) and artificial intelligence (AI), offer transformative opportunities for companies to reshape how organizations do everything from running operations and managing the supply chain to serving customers.
But decisions about digital labor — that is, an automated workforce capable of completing work that largely mirrors our own abilities — cannot be taken lightly. These efforts can have enormous and lasting effects on your workforce, on communities and on the entire world, so they require significant thought and preparation, including digging into a company’s deepest core values.
“Companies need to monitor the evolution of digital labor in order to guide their decision-making,” says Todd Lohr, principal, U.S. Intelligent Automation Leader at KPMG. “They have to think about the profound impact of technology on their business — from how the work is done and who is doing it, to job replacement considerations, the evolution of jobs and how it affects the work environment.” He believes these efforts are the next wave of corporate sustainability, as more and more research shows that buying preferences are based on the perceived ethics of organizations.
The IT organization has a pivotal role in this, adds Cliff Justice, principal, Innovation & Enterprise Solutions at KPMG. The CIO is often charged with ensuring new technology is in line with company values — ensuring the data that systems are fed is not compromised and would not lead those systems to learn the wrong things. “The biggest mistake companies can make is to not include IT and CIOs in these decisions from the outset,” he says.
In a new paper authored by Lohr and Justice, “An Ethical Compass in the Automation Age: Decisions Require Deep Dive into Company Core Values,” they home in on a variety of ways organizations can tap into existing core values and processes — or create new ones — as the ethical compass to guide automation decisions, including:
- Start the ethical discussion within the company. Have frank conversations about the potential impact of each decision. The recent Corporate Responsibility Survey sponsored by Aflac found that 83% of professional investors are more likely to buy stock in companies well known for social responsibility, believing these companies are lower-risk investments.
- Define or update the company’s core values. Some companies talk about taking care of their employees and providing value for customers as a part of core values, or focus on safety, environmental and community impact. Making these tough automation decisions pushes existing values and processes to the limit, so explore expanding on these to encompass major business model changes with digital labor.
- Consider how your core values extend into your technologies. It is critical to encode company values in the technology, because artificial intelligence and data can carry biases that contradict your core values and beliefs. “It’s important that the core basis of ethical decision-making be built into a company’s algorithms — so it is not based purely on mathematical logic or statistical data, but instead on what is deemed appropriate in terms of societal norms,” adds Justice.
- Follow through by establishing metrics to track the residual effects of automation. Many organizations create a Center of Excellence to manage governance. Just as important is establishing organizational change management programs that help employees learn how to work with new technologies.
Overall, companies need to determine their overarching strategy for dealing with the ethics of automation: “They need to weigh what they’re hoping to get out of it,” says Lohr. “There are short-term and long-term impacts on the operating model, and a lot of decisions will affect corporate sustainability standards, enterprise strategy and overall corporate policies.”
Paving New Ground with Automation: The Early Tip of Change
“Digital labor and automation efforts are new and quickly evolving, so companies are seeing only the early tip of the disruptive changes that will unfold over the next generation,” says Justice. As a result, discussions about the ethics surrounding these new technologies are just beginning. “We’re paving new ground, and there isn’t a playbook or a lot of case studies out there,” he says. “We have never had a wave of artificial intelligence sweep through mainstream business in the past — that’s the reason we’ve decided to share what we’re learning as we go through this in our own enterprise.”
All companies, however, need to start addressing these dilemmas, as automation-centered industry disruptions happen faster and more often: “More and more CIOs feel the urgency to be the disruptor as opposed to the disrupted,” he adds. As companies adopt a new class of technologies on a new class of platforms — offering services and products traditionally delivered through different business models — organizations need to consider how they want to work. Issues related to automation and digitization will be some of the biggest decisions CIOs have to face.
“With automation, organizations need to start thinking further ahead about technology, people and culture than in the past, because the disruptive impacts are so significant,” says Justice. No industry is immune to new operators coming in on cloud platforms and moving into traditional businesses. “They have to determine the extent to which they will transform and protect their business growth while maintaining their culture and taking care of employees and customers, as well as maintaining their brand and values in the market,” he adds.
In the paper, Justice and Lohr agree it’s up to companies to have these discussions and make tough decisions regarding the ethics of automation: “We believe in the power of corporate leaders to make the right choices. With the right tools, knowledge and attention, technology can be the great enabler. But company and personal ethics must serve as the compass. You are the steward of powerful technology. It’s up to you to use it right.”