When it comes to the digital workplace, the popular opinion, and fear, is that machines are encroaching upon human work activities and taking an ever larger percentage of this work away for good — from the dirty and dangerous, to the dull, to decisions. Fortunately, this doesn’t take into account the realm of possibilities created when work processes are reimagined in the context of mutual human-machine collaboration.
By instrumenting the human and socializing the machine, we can redesign business processes to optimize the blend of human-machine participation and interaction — and complete tasks far more efficiently than either could individually. Machines are stepping out from behind the cage, and humans are stepping into their worlds.
As Julia Kirby and Thomas H. Davenport have pointed out in “Beyond Automation,” robotic automation can be thought of not as a zero-sum game but as augmentation, where humans and machines collaborate to get work done. This is akin to a relay where the baton is passed between human and machine working toward a common goal, rather than a race pitting one against the other.
For business managers, reaping the benefits of this collaboration also means ensuring a clean hand-off of the baton every time, to optimize the process even further.
Since humans are driving the innovation around automation and robotics, we’re consciously (and perhaps unconsciously) carving out our future roles in the workplace side by side with machines. Rather than drawing harsh boundaries between humans and machines, we’re creating a converged future in which work processes are optimized along two complementary directions: the instrumentation of human processes and the socialization of machine processes, so the two can work together in greater harmony.
Mapping the division of labor — human-machine collaboration
If we analyze this collaboration, we can see several distinct classes of work activity where either machines augment human processes, humans augment machine processes, or both.
To illustrate the types of collaboration that can occur, it’s useful to consider who performs the work, human or machine, and whether that work is delivered physically or virtually. The human-machine scenarios range from “physical-physical,” such as caregivers working with smart mobile robots to deliver medicines and supplies in hospitals; to “physical-virtual,” such as warehouse employees using smart glasses for navigation and picking instructions to boost productivity; to “virtual-physical,” such as doctors performing telepresence surgery; to “virtual-virtual,” such as call centers where human agents work in tandem with virtual cognitive agents.
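The four combinations above can be summarized as a simple lookup. This is purely an illustrative sketch: the labels and example scenarios are taken from the text, but the data structure itself is an assumption about how one might model the taxonomy.

```python
# Illustrative model of the four human-machine collaboration scenarios
# described above, keyed by (how the human works, how the machine works).
# The examples are the article's own; the structure is an assumption.

SCENARIOS = {
    ("physical", "physical"): "caregivers working with smart mobile robots delivering hospital supplies",
    ("physical", "virtual"): "warehouse employees using smart glasses for navigation and picking",
    ("virtual", "physical"): "doctors performing telepresence surgery",
    ("virtual", "virtual"): "human call-center agents working in tandem with virtual cognitive agents",
}

def describe(human_mode: str, machine_mode: str) -> str:
    """Return the example scenario for a given pair of delivery modes."""
    return SCENARIOS[(human_mode, machine_mode)]
```

Framing the scenarios this way makes clear that the two axes vary independently, which is why all four quadrants appear in practice.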
Interestingly, whether humanoid or non-humanoid, and whether working physically or virtually, the various robots concerned are all being socialized to perform their unique tasks most effectively. Physical robots are being socialized to operate seamlessly within human spaces and adhere to human behavioral norms, and virtual robots are being socialized via their appearance and natural language capabilities.
The key point is that it’s not just machines that are getting social; it’s that humans are getting instrumented as well, all of which amplifies the possibilities to optimize work activities. Let’s look at some examples of both of these areas:
Instrumenting the human
As consumers, we’re all becoming instrumented and taking advantage of the wealth of wearables and sensors now on the market. This “quantified self” concept helps us monitor our health and fitness and take advantage of the masses of data produced as we go about our daily lives. The pace of instrumentation is picking up in the workplace as well, as employers seek to track employee behavior and optimize work activities.
Steve Cousins, CEO of Savioke, a company that manufactures autonomous robot helpers for the services industry, sees smartwatches as well as smartphones as a powerful way for his team to connect into their robots currently deployed in trials at the Aloft hotel chain. This modern-day “telepathy,” as Steve sees it, helps employees monitor the robots at scale and intervene in the rare cases where the robots need a helping hand.
At the higher end of the spectrum in terms of wearables, initiatives such as the Q-Warrior helmet and the TALOS “Iron Man” suit in the military are instrumenting soldiers to radically improve their situational awareness, giving them super-human capabilities.
In virtual work scenarios, the “Double” telepresence robot, “da Vinci” surgical system, and manually piloted drones and UAVs are all examples of how we’re instrumenting humans to be able to extend our reach and conduct work in remote locations.
Socializing the machine
In healthcare, Aethon’s Tug robot is a smart autonomous robot that delivers medicines and supplies in hospitals. Having logged over 1 million miles in hospitals to date, it has been socialized to safely navigate around people and obstacles and can even take the elevator. Part of its socialization is that it uses existing hospital facilities and infrastructure, requiring no dedicated hallways or large docking areas.
Jim Lawton is chief product and marketing officer at Rethink Robotics, the company behind the Baxter industrial robot, well recognized by its iconic “face screen.” One of the many ways Rethink has socialized its robots is by implementing anticipatory intelligence so the robot physically communicates where it’s about to move. It does this by moving its head, its eyes and then its arm when reaching for objects, to instill a comfort level in workers standing shoulder-to-shoulder with the machine.
Rethink is bringing instrumented workers into the equation as well. Its new Sawyer robot has a second screen capability where supervisors can use a tablet to gain access to robot performance across the fleet. In addition, Lawton and the team are exploring how operational data gathered from “smart robots and smart workers” can be analyzed to further optimize work cells in manufacturing.
IPsoft’s Amelia is a cognitive virtual employee that speaks, reads, writes and learns on the job, just like a human employee. According to Jonathan Crane, the company’s chief commercial officer, she has emotions and intelligence, so she understands what people ask, even what they feel, and can empathize in both her facial and spoken responses. If Amelia doesn’t understand a request, she transfers the caller to an agent, then observes and learns so she can handle it herself the next time. Amelia is being used in call centers to transform the labor mix with “virtual engineers.” This helps “automate the tactical, and populate the strategic,” according to Jonathan, letting staff focus on higher-value-added projects and typically reducing labor costs by 30% to 35%.
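The escalate-and-learn loop described above is a common pattern for virtual agents, and can be sketched in a few lines. To be clear, this is a hypothetical illustration of the pattern; the class, names and confidence threshold are assumptions, not IPsoft’s actual API.

```python
# Hypothetical sketch of the escalate-and-learn pattern: handle the
# request if confident, otherwise transfer to a human agent, observe
# the resolution, and learn it for next time. All names and the
# threshold are illustrative assumptions, not IPsoft's implementation.

class VirtualAgent:
    def __init__(self, confidence_threshold: float = 0.8):
        self.confidence_threshold = confidence_threshold
        self.learned_responses = {}  # request -> resolution observed from humans

    def handle(self, request: str, confidence: float, human_agent) -> str:
        # A request seen and resolved before can now be handled directly.
        if request in self.learned_responses:
            return self.learned_responses[request]
        if confidence >= self.confidence_threshold:
            return f"resolved: {request}"
        # Escalate: transfer to the human agent, observe, and learn.
        resolution = human_agent(request)
        self.learned_responses[request] = resolution
        return resolution
```

On a second occurrence of the same request, the agent answers directly instead of escalating, which is how the labor mix shifts toward the virtual side over time.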
Even when we look at fully autonomous vehicles such as self-driving cars, the cars are being socialized to be overly cautious when maneuvering to help avoid surprises for passengers and pedestrians alike. Soft robotics is another area of innovation where robots are being designed with soft and deformable structures to work with unknown objects, in rough terrains, or with direct human contact.
Implications for managers
So, given these two converging themes of instrumenting the human and socializing the machine, what are the implications for managers?
Over time we’ll start to see more scenarios toward the center — more highly instrumented humans and more highly socialized machines working together. The business enablers won’t be just one technology. The instrumented human will combine sensors, wearables and AR with powerful analytics delivered via the cloud, while the socialized machine will combine sensors, geolocation, expressive behaviors, empathy, knowledge, memory and speech.
We can also expect human-machine collaboration to operate at scale. We’ll see one-to-one interactions, as well as one-to-many and many-to-many interactions as benefits are scaled up across the enterprise. Google’s recent patent envisioning cloud control of an army of robots is just one example.
The specific human-machine scenarios will change over the duration of a particular customer’s journey or a worker’s business process. For example, within an airport setting, an instrumented passenger will use a smartwatch for notifications and way-finding, and will interact with an array of robots including self-service bag drop machines, robotic butlers to handle luggage, and even robotic valets to park his or her car.
Adriaan den Heijer, senior vice president of hub operations at KLM, sees numerous applications where robotics can play a role: on the passenger side, where answers come from combining robotics with the human touch, and in baggage handling, where robots free up employees to become process operators.
The key for managers in the years ahead will be flexibility and finding the sweet spot for human-machine collaboration based on the nature of the work. Anywhere on the continuum of human instrumentation and machine socialization can work; the precise location is task-dependent.
Workers will need to be adaptable for these different scenarios with machines, just as adaptability is paramount when working with other humans. Machines will need to be highly adaptable as well, as evidenced by the recent winner of the DARPA Robotics Challenge. For managers envisioning the future digital workplace, consider the full range of options available: instrument and socialize whenever possible to create smooth, optimized interfaces where you can hand off the baton in record time.
Nicholas D. Evans is the Chief Innovation Officer at WGI, a national design and professional services firm. He is the founder of Thinkers360, the world’s premier B2B thought leader and influencer marketplace as well as Innovators360.