An intriguing report on German news site Heise.de on Monday unveiled a cheap monitor that includes built-in PC functionality. The Acer DX241H is an otherwise standard 1920x1080 pixel HDMI monitor, but it also features an operating system running on top of an ARM Cortex-A8 chip--the same processor commonly found in cell phones and tablets. All this comes in at around just $400, although that price would likely be significantly lower if the product reaches the United States.
Initial reports said that the "monitor" ran Google's forthcoming Chrome OS, which would have made it the first commercial product to do so. Acer is one of Google's partners, so the claim wasn't implausible.
However, it transpired that this was a spec-list mistake. Instead, the DX241H will run the Chrome browser, almost certainly on top of an unspecified Linux distro. The only other software, it appears, will be a multimedia player.
Acer is touting the DX241H as a monitor that happens to have browsing built in, much as some monitors include simple computing functionality that lets them display photos from memory cards, for example.
However, the product proposes an intriguing concept: Could it be the first of many inexpensive all-in-one monitor-and-PC combos featuring low-power ARM processors that allow them to run Chrome OS? Could Acer have accidentally created the first true cloud terminal?
The all-in-one PC field has boomed in recent years, pushing aside standard "box-based" desktop computers, despite the fact that more of us are switching to mobile computers and leaving desktop machines behind.
However, the prices of all-in-ones tend to be on the high side; you'll struggle to find one for less than $600, for example. More often than not they're designed to be stylish and compact additions to homes, taking their lead from the Apple iMac. The business market is largely untapped as yet, although I've seen a handful on desks here and there.
IT bosses are perhaps loath to use all-in-ones because, price notwithstanding, a monitor bonded to a PC means the whole unit is written off if any single component dies. Additionally, upgrading is much trickier.
However, I'm not sure these complaints would be realistic with a cloud terminal. A cloud desktop computer could be an entirely solid-state machine, and we know from past experience that solid-state technology rarely fails.
Hyper-efficient ARM chips usually don't require fans for cooling, and moving parts are usually the first things to die inside a computer. There'd be no need for hard disks either; a few gigabytes of flash memory on the motherboard would be more than enough.
The monitor would last forever--unlike cathode-ray tube (CRT) monitors of old, the TFT panels these PCs use won't degrade over time. They won't go blurry, for example. Their brightness might dim over the space of a few years if they use cold-cathode backlighting, but the trend nowadays is for LED backlighting, which should retain its luminosity for the life of the machine.
As for upgrade possibilities, the reality is that it's rare for businesses to change hardware within computers. It's just not an efficient use of staff time, and PCs are treated as commodity items, to be used until they die and then disposed of. Any upgrades are limited to RAM or, even less frequently, hard disks. In many cases all-in-one PC manufacturers make the RAM easy to upgrade, and if the computer is used as a cloud terminal then running out of storage will not be an issue. Nothing other than the OS is stored locally.
Cloud terminals really would be commodity units that could be swapped in and out as needed, just like mainframe terminals of old. If one died, the tech support operative could get another out of storage and just plug in the same keyboard and mouse. There would be no need to take data off the machines when they reach the end of their lives because the data would be stored in the cloud. Users wouldn't even be aware they were using a new computer because their desktop environment would be stored in the cloud.
Of course, all of this is pure speculation--fantasy, even. Businesses are still wary about working within the cloud, and there's no sign of an end to the security deadlock that's stopping many of them from moving over.
But most of all, the history of corporate IT is one of buying PC boxes, and that's not going to change overnight. The IBM PC set a pattern that we've yet to break, despite 30 years having passed.
It's fun to see how lots of different computing technologies are converging at the moment to make cloud computing a possibility, if not a reality. As they said in "The Six Million Dollar Man," we have the technology. However, how we use it is down to how quickly we can abandon our prejudices.
Keir Thomas has been making known his opinion about computing matters since the last century, and more recently has written several best-selling books. You can learn more about him at http://keirthomas.com. His Twitter feed is @keirthomas.
This story, "Did Acer Accidentally Invent the Cloud Workstation?" was originally published by PCWorld.