by Nicholas D. Evans

Be careful what you transform: The unintended consequences of digital innovation

Dec 11, 2014 | 4 mins
Digital Transformation, IT Leadership, Regulation

There have been a number of cases in the press recently related to intentional crimes and unintended abuses of, or issues surrounding, digital business models and processes. These cases range from illegal business models, such as underground marketplaces and content-sharing sites, to those competing with and creating controversy for their physical counterparts, to those prone to user abuse that exposes other users to potential fraud or harm.

In a recent blog, The hidden disruption of digital business models, I mentioned that by digitizing a traditionally analog business model or process, we’re effectively turning atoms into bits and enabling an infinite variety of possibilities. The rules can be whatever you want them to be, with the market serving as the petri dish that determines whether the new rules are viable and can lead to adoption and growth.

The caveat is that there are no shortcuts. You can design a new business model or process, but once implemented it will be subject to the usual good and bad actors. These actors may be internal to the “system,” so to speak, such as authorized employees and end users, or they may be external, such as cybercriminals.

While shortcuts are often the goal in rethinking and redesigning business processes, in order to create process efficiencies, cost savings, and an innovative and highly convenient new digital customer experience, every business model and process innovation must still carefully navigate the waters of legality, privacy, security and safety. The bar is typically that the digital version must match what’s done in the physical world (e.g., background checks), or even exceed it, given the anonymity of the Internet.

Even if your design is totally sound, you’ll want to consider the full range of use cases and how your system might be intentionally or unintentionally misused. As digital transformation becomes more widespread across a broad spectrum of industry business models and processes, we can expect to see more cases of unintended consequences or where the rules are deliberately manipulated or broken – either by the operators or end users.

Today’s disruptive technologies such as IoT, wearables and big data simply amplify the risk levels. With IoT, for example, the threat level increases as IoT devices become more controllable and more autonomous. In these latter cases, cybercriminals can exploit vulnerabilities to remotely control IoT devices to change sensor or device behavior, to sabotage these devices, or even inflict physical damage on the surrounding environment.

While much-needed attention is paid to addressing cybersecurity vulnerabilities, in 2015 organizations should dedicate additional energy to the broader risk management picture. Ask what safeguards and countermeasures you can build into your business model and processes to reduce your risk exposure. This may involve physical as well as digital techniques: stronger background checks, stronger forms of authentication (perhaps even via biometrics), or fraud detection algorithms that detect when systems or data are being purposefully manipulated.

An interesting example where fraud detection algorithms may find additional value is related to the data collected from wearables and IoT. A recent Computerworld article explored how data from wearables is being used to support a personal injury claim. Imagine if a criminal deliberately altered or manipulated their activity data to provide themselves with an electronic alibi, or if a motorist attempted to manipulate the data from the device measuring their driving habits to reduce their monthly premium.
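To make the wearables scenario concrete, here is a minimal, illustrative sketch of the kind of check an insurer or investigator might run against activity data. It is an assumption for illustration only (not any vendor's actual algorithm): a simple z-score test that flags days whose step counts deviate sharply from the user's own baseline, which is one basic way manipulated or fabricated readings can stand out.

```python
# Illustrative sketch: flag days whose step counts are statistical
# outliers relative to the user's own history (simple z-score test).
# This is a hypothetical example, not a real insurer's fraud model.
from statistics import mean, stdev

def flag_anomalies(daily_steps, threshold=3.0):
    """Return indices of days whose step count deviates from the
    user's historical mean by more than `threshold` standard deviations."""
    if len(daily_steps) < 2:
        return []
    mu = mean(daily_steps)
    sigma = stdev(daily_steps)
    if sigma == 0:
        return []  # perfectly uniform history: nothing to flag
    return [i for i, steps in enumerate(daily_steps)
            if abs(steps - mu) / sigma > threshold]

# Two weeks of typical activity, with one suspiciously inflated day:
history = [8200, 7900, 8500, 8100, 7700, 8300, 8000,
           7900, 8400, 8200, 40000, 8100, 7800, 8000]
print(flag_anomalies(history))  # → [10], the inflated day
```

A production system would of course use far richer signals (heart rate, GPS consistency, device telemetry), but the principle is the same: data that is purposefully manipulated tends to break the statistical patterns of genuine behavior.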

Digital business models have the potential to create powerful “win-wins” for businesses and consumers alike, but all parties need to be aware of the real-world risks and act accordingly – either to ensure legal and ethical service delivery or to protect themselves from potential harm.