Over the holiday break, I scored a 1 TB Fantom external drive for the post-rebate price of $89! After I attached it—replacing a 160 GB external drive, thus illustrating the truism that more capacity begets more need—I began thinking about the role of storage in the home.
It’s long past obvious that homes today commonly contain a lot of computing horsepower, accompanied by significant storage requirements and significant storage management issues. With the rise of digital devices like cameras and portable media players, lots of personally important data is strewn across different machines. In fact, it wouldn’t be far off the mark to say that many homes now resemble mini-data centers, posing the same types of problems that corporate data centers face, albeit at smaller volumes. On the other hand, much of the data in these home data centers is far more emotionally important to its owners. Careful handling of your company’s data may matter to you professionally, but it is nowhere near as personally important as the footage of your child taking his or her first steps.
This presents you with a dilemma regarding your storage practices. Of course, each machine comes with its own disk—and probably a pretty high-capacity one at that. Today’s machines typically ship with 300 to 500 GB of disk storage. Every machine in the house having a bunch of storage isn’t ideal, however (we have around six or seven spread through different rooms; I’m never quite sure, as machines seem to come and go). And in any case, I am leery of putting important data on the default disk layout of a Windows box, since when you get around to (the seemingly inevitable task of) reinstalling and reformatting the machine, your data can be wiped out in an instant.
You can repartition, of course, and place important data on a separate partition of the internal disk, avoiding this issue—but then you’ve got data on a drive secreted away inside the machine. Given how tightly packed many boxes are these days, if something goes wrong with the machine, you may never be able to get at that disk again. Consequently, I don’t like relying on internal drives, for reasons of data robustness. And even if you are willing to live with internal drives, you still face the problem of important data dispersed across different machines, which makes it hard to keep track of and back up. Hmmm. This is reminiscent of direct-attached storage within the data center, so my home data center analogy is, perhaps, not as far-fetched as it might seem.
That brings us to the next alternative: an externally attached disk. This definitely gets around the problem of data being stuck on a disk inside the machine. However, the external disk is still tethered to a single machine. It is possible to share folders on the disk across the network, allowing other machines to use it as additional off-machine storage, but I find Microsoft networking maddeningly inconsistent. Despite following all the recommended practices, remote folders come and go in what seems a rather random fashion, making this solution unreliable. Furthermore, if you want to use this external storage arrangement to back up machines, you have to set up the backup processes yourself. Some external drives do come with backup software and an easy-to-administer mechanism, but they carry a price premium over bare external drives—more of a premium, to my mind, than a modestly featured backup product warrants.
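To make concrete what "set up the backup processes yourself" means, here is a minimal sketch of a DIY incremental copy to a shared external drive. It assumes the drive's shared folder is already mounted as a local path; the paths and function name are hypothetical, and a real setup would also need scheduling and error handling:

```python
import shutil
from pathlib import Path

def backup_tree(source: Path, dest: Path) -> int:
    """Copy files from source to dest, skipping files whose copy on
    dest is already at least as recent. Returns the number copied."""
    copied = 0
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        target = dest / src_file.relative_to(source)
        # shutil.copy2 preserves modification times, so an unchanged
        # file is skipped on the next run.
        if target.exists() and target.stat().st_mtime >= src_file.stat().st_mtime:
            continue
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, target)
        copied += 1
    return copied
```

Even this toy version shows why bundled backup software has value: you would still need to handle the share disappearing mid-copy, deletions, and versioning yourself.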
So, here’s a common situation: multiple machines, islands of data, no backup. The backup problem can be addressed by one of the online backup services, like Mozy or countless other offerings. One problem with applying that solution here: the backup services charge a monthly fee per machine—even at $5/month, six or seven machines add up to a decent-sized tab. And then you’re faced with keeping all of the separate backup accounts straight, not to mention tracking which data lives in which account. You’ve taken one problem and turned it into N of them. Not very attractive.
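The back-of-the-envelope math is simple but worth spelling out; this sketch uses the $5/month and seven-machine figures from the paragraph above:

```python
def yearly_backup_cost(machines: int, monthly_fee: float) -> float:
    """Total yearly cost of per-machine online backup subscriptions."""
    return machines * monthly_fee * 12

# Seven machines at $5/month each comes to $420 a year, before
# counting the overhead of juggling seven separate accounts.
cost = yearly_backup_cost(7, 5.0)
```

That yearly figure is what makes a single shared device start to look economically sensible.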
The answer to your home data center problem is obvious: you need a single storage device that each machine, whether Windows, Mac, or Linux, can use. In short, you need network-attached storage (NAS). This solves the islands-of-storage problem. It also solves the not-every-platform problem. But what should you use for your home data center NAS?
You can build your own. There are a bunch of Linux-based open source offerings around. But that makes you a Linux system administrator, probably not what you wanted. You want something simple.
Major vendors provide NAS devices, often under other names. HP, for example, just announced the MediaSmart Server, a product that provides centralized storage and helps with another aspect of today’s home data center: sharing media among computers and other media devices (e.g., your TV). However, this product is based on Windows Home Server, giving you yet another Windows machine to manage, and one holding all of your important data, to boot. I’m not sure that’s where I want to go.
Buffalo, Netgear, and others offer Linux-based NAS devices that insulate you from the down-and-dirty system administration tasks of the Linux roll-your-own solutions. This gets around the administration issue. But you’re still faced with two problems: (1) you’ve got centralized storage, but it’s still inside your home data center, so you have no disaster recovery protection; and (2) home NAS devices are significantly more expensive than typical external storage. I’m not sure I understand why: after all, consumer wireless routers are also special-purpose Linux devices, yet they’re dirt cheap. Why should marrying an external drive with a custom Linux distro cost five times what a wireless router does?
So here’s my modest proposal for an offering to solve all of these problems: a NAS device delivered at a low price point, with remote (e.g., Mozy-like) backup bundled in for a monthly fee. The vendor would buy down the initial hardware price against the monthly fees the contract guarantees. In essence, this product would replicate the typical mobile phone arrangement—and we’ve seen how that model has powered mobile phone adoption.
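To see why the buy-down could work for the vendor, consider the phone-style economics. The numbers below are purely illustrative assumptions, not figures from this article: a hypothetical $150 hardware subsidy recouped through a $10/month backup fee on a two-year contract:

```python
def breakeven_months(subsidy: float, monthly_fee: float) -> float:
    """Months of fees needed to recoup the up-front hardware subsidy."""
    return subsidy / monthly_fee

def contract_margin(subsidy: float, monthly_fee: float, months: int) -> float:
    """Vendor's net take over a contract: fees collected minus subsidy."""
    return monthly_fee * months - subsidy

# With these assumed numbers, the subsidy is recouped in 15 months,
# and everything after that is margin for the vendor.
months_to_recoup = breakeven_months(150.0, 10.0)
net = contract_margin(150.0, 10.0, 24)
```

The real payoff, as with phones, comes after the first contract: renewal fees arrive with no further subsidy to recoup.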
This product would solve the isolated pools of storage issue: each home computer could use the NAS device as primary storage or back up local storage to it. It would solve the issue of central storage providing no offsite protection. And it would carry a low enough initial price to be attractive to potential users, leveraging price elasticity to increase volume; this would let the vendor grow market share, perhaps enough to dominate the segment. Moreover, once someone is tied into a particular system, they’re likely to stay loyal. Once you’ve got a secure place for your vital personal data, you’re not likely to move it somewhere else: the switching cost would be prohibitive. From the vendor’s point of view, that long-term commitment would be a bonanza. In effect, the device would be an alluring entrance into an extended annuity stream.
So there you have my modest proposal to solve the home data center problem—a problem that is only going to become more acute. It’s a problem waiting for the right creative solution.
Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of “Virtualization for Dummies,” the best-selling book on virtualization to date.