Get those gigs. The State University of New York at Buffalo's Mechanical and Aerospace Engineering Department has developed sensors that could boost hard drive capacity by a factor of 1,000 without driving up the price.
As manufacturers develop hard drives designed to cram ever smaller data bits onto disk platters, the bits’ magnetic fields get weaker. That makes it increasingly difficult to detect and reliably read data. More sensitive sensors would allow manufacturers to raise hard drive density without worrying about pushing magnetic fields below readable levels. That’s where SUNY Buffalo’s technology steps in.
Two of the school's researchers have created a nanoscale magnetic sensor that produces, at room temperature, unusually large electrical resistance changes in the presence of tiny magnetic fields. (Resistance change is the phenomenon that allows a hard drive sensor to distinguish the "on" and "off" bits stored on magnetic media.) Associate professor Harsh Deep Chopra, who created the sensor with professor Susan Hua, predicts that the new technology could allow disk capacities on the order of terabits (trillions of bits) per square inch, instead of today's limit of 10 to 30 gigabits in the same space.
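As a back-of-the-envelope check on those figures (a sketch only; the density numbers come from the article, the arithmetic is just a ratio):

```python
# How much denser is a terabit-per-square-inch drive than today's
# 10-to-30-gigabit-per-square-inch platters? Figures from the article.

terabit = 1e12                        # bits per square inch (projected)
today_low, today_high = 10e9, 30e9    # bits per square inch (current range)

gain_vs_high = terabit / today_high   # gain over today's densest drives
gain_vs_low = terabit / today_low     # gain over the low end of the range

print(f"Roughly {gain_vs_high:.0f}x to {gain_vs_low:.0f}x denser")
```

So the projected capacities work out to roughly a 33- to 100-fold density gain over the stated 10-to-30-gigabit range.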
Chopra and Hua created the sensor by forming microscopic nickel "whiskers" between two larger electrodes. The sensors exploit an effect known as ballistic magnetoresistance, or BMR, which occurs when a conducting path is so narrow that electrons shoot straight through it without scattering. "Using our manufacturing method, we are able to get a change in resistance of up to 100,000 percent," compared with today's technology, says Chopra.
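To put that "100,000 percent" figure in perspective, magnetoresistance is conventionally quoted as the ratio of the resistance swing to the baseline resistance. A minimal sketch (the resistance values below are hypothetical; only the 100,000 percent figure comes from the article):

```python
# The standard magnetoresistance ratio: MR = (R_max - R_min) / R_min * 100%.
# A 100,000% ratio means the resistance swings by a factor of about 1,000.

def mr_ratio_percent(r_min: float, r_max: float) -> float:
    """Magnetoresistance ratio, expressed as a percentage."""
    return (r_max - r_min) / r_min * 100

r_min = 1.0                  # ohms, hypothetical baseline resistance
r_max = r_min * 1001         # resistance after the field flips the bit

print(mr_ratio_percent(r_min, r_max))  # 100000.0
```

A thousandfold resistance swing is what lets the sensor pick out the far weaker magnetic fields of much smaller bits.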
Chopra notes that the sensor's ability to work at room temperature makes it relatively easy to adapt the technology to existing hard drive designs without significantly adding to their cost. "We could see supercomputer-size capacity in the size of a wristwatch," he says.