by C.G. Lynch

R&D and Web 2.0

Opinion
Mar 16, 2007

I had an interesting day at IDC’s Directions Conference in Boston this week, at which Don Tapscott, author of Wikinomics, delivered the keynote speech. The conference’s common theme was disruptive innovation (which I still like to call disruptive technology, since “disruptive innovation” feels a little too euphemistic to me, like when automotive CEOs call massive layoffs a “restructuring”).

In his address, he highlighted the social and economic revolution that has occurred during the Web 2.0 movement and, more importantly, what effects it will have on IT moving forward as we enter an era when firms collaborate both within and outside their boundaries to foster better innovations. (I talked about these themes with Tapscott when he stopped by for a Q&A recently to discuss Wikinomics.)

But one thing he talked about in his speech made me do an about-face (even for the firm believer in Web 2.0 that I profess to be). He noted how, over the next several years, we’ll gradually move to a world where almost all work can be done in a web browser, where we’ll have all the tools we need at our fingertips without the need for, say, a Windows operating system.

Sitting in the audience, I imagined what it would be like to walk into Target, plop down $400 for a cheap laptop with Linux and a web browser, and be on my way. It got me thinking: Has the collaborative movement, where we constantly make incremental improvements to existing technologies, caused us to level off in terms of how we research and develop new inventions?

Recently, I’ve been doing some research about IT R&D. The Economist had a great report in which it highlighted how the big research labs that came to typify technology innovation (like PARC, for instance) have either been marginalized or shut down altogether, with firms choosing to focus on software, services, and existing technologies rather than on hardware.

The paragraph that stuck out the most:

“On the surface, American innovation has never been stronger. American firms spend around $200 billion on R&D annually, much of it on computing and communications. Microsoft, for example, spent around $6.6 billion last year; IBM and Intel about $6 billion each; and Cisco Systems and Hewlett-Packard (HP) around $4 billion each. Most of this money went into making small incremental improvements and getting new ideas to market fast.”

So I thought: What does the rise of Web 2.0 and increased collaboration mean for research and development? As I type this, I look at my computer and wonder: Is this what a computer will always look like? Will engineers just keep tinkering with iPods or BlackBerrys until we get them just right? Will every function we want performed be done through Firefox? Is this it?

Then I took my acid reflux meds and realized this isn’t the case.

What we see happening is that the research and development of technologies is becoming decentralized as a result of the collaborative movement, facilitated by open source development and web-based applications. Best of all, it’s creating real wealth at companies that aren’t only “internet companies.”

What does this shift in R&D mean to you? Does it anger you that gone are the days when we threw a bunch of smart people into a laboratory and hoped for the best? Or is this great?

For now, I’d argue the latter. But I’d love to hear your thoughts.