by CIO Staff

Grid Computing…Defined?

News | Apr 01, 2002 | 2 mins | Data Center

Defining grid computing is like throwing darts at a dragonfly. That’s because there are as many definitions for grid computing as there are people to ask about it.

For example, some observers say grid computing is just a fancy new name for the well-established concept of distributed computing. “I think they’re pretty much synonymous: the ability to distribute a computational task over some network of connected computers,” says Robert Batchelder, a research director at Gartner in Stamford, Conn. But when pressed, Batchelder draws a subtle distinction: “With grid computing, the computational modules are very similar; that is, a computational task is broken into uniform subdivisions. With a distributed model, they’re irregularly shaped.”
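
To make Batchelder's point concrete, here is a minimal sketch of uniform subdivision: one computation split into equal-sized pieces that could, in principle, be farmed out to connected machines. The toy task (summing squares), the chunk count, and the use of Python's standard multiprocessing pool as a stand-in for networked nodes are illustrative assumptions, not details from the article.

    from multiprocessing import Pool

    def sum_squares(bounds):
        # Compute one uniform subdivision of the overall task.
        start, end = bounds
        return sum(n * n for n in range(start, end))

    def split_uniformly(total, chunks):
        # Break the range [0, total) into equal-sized pieces.
        step = total // chunks
        return [(i * step, (i + 1) * step) for i in range(chunks)]

    if __name__ == "__main__":
        pieces = split_uniformly(1_000_000, 8)   # uniform subdivisions of one big job
        with Pool(processes=4) as pool:          # stand-in for a network of machines
            partials = pool.map(sum_squares, pieces)
        print(sum(partials))                     # recombine the partial results

In a real grid, each piece would be dispatched to a separate machine rather than a local worker process, but the shape of the problem, identical chunks recombined at the end, is the same.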

But Ian Foster of the Argonne National Laboratory in Argonne, Ill., co-leader of the Globus Project, an open-source grid computing initiative, says distributed computing is a scaled-down subset of grid computing. He defines distributed computing as harnessing desktop PCs within an enterprise to act as a single resource, while grid computing involves diverse hardware that can extend beyond the enterprise.

Observers do agree on one thing: Grid computing involves disparate resources working on parts of one big computing problem. And if you’re a CIO with that big problem, that should be enough to at least get you looking at grids as a solution.