John Brandon reveals how nanotechnology will change the technology and computing landscape over the next few years.

The classic definition of nanotechnology covers any product that uses materials measuring 100 nanometres or less. In fact, modern-day innovations typically rely on materials that are much smaller, and will require a paradigm shift in fabrication, manufacturing, and engineering processes.

Computing giants such as Intel, IBM and Motorola are already facing the challenges of continually shrinking semiconductors and transistors. With the decreased size come new problems: current leaks through the ever-thinner silicon, and processor performance degrades.

The current engineering methods used to reduce leakage also make the chipset drain too much power, too fast. Thankfully, every problem eventually finds a solution, and with it come new developments that set the stage for a future where the smallest chips get the biggest results.

Microprocessors are the integral component in the computers and devices we buy, but they have reached a point where the prospects for further innovation look rather bleak.

While no one person invented the idea of using nanotechnology to spur further innovation, it has become a primary technology in computing and in many other fields, such as automotive engineering and medicine.

In computing, it is used for new kinds of mass storage devices, batteries, microprocessors, computer interconnects, and displays. It's also prompting further innovation with wireless power, eye implants, and even cloaking devices.

Probe memory

One exciting innovation in nanotechnology is probe memory, a new method of storing data that does not use the traditional magnetic process found in current hard disk drives. Like microprocessor makers, hard disk makers face a challenging proposition: how to raise capacities, minimise power draw, pack data more densely, and still keep speeds fast enough for rich media such as video and music.

With a traditional drive, data bits are magnetised horizontally (or, with newer perpendicular-recording drives, vertically) on metal platters. (An easy way to envision this is with two fingers pointing sideways or up - the space in between forms the magnetic region that holds the data.)

The problem with this method, even though it has lasted for decades as the preferred technology, is that it generates an immense amount of heat. Adding more capacity, now that most of us are using computers for storing rich media, generates even more heat.

Companies such as Seagate and Western Digital will likely never make magnetic hard disk drives thinner than one inch because of the heat and power requirements, and speeds will likely not go beyond the current 10,000RPM, commonplace in servers.

Probe memory uses nanoscale tips, arranged in a grid that can move in parallel along the X and Y axes. The tips are positioned in 32 x 32 arrays, with one row for writing data and one for reading.

This new memory storage technique uses MEMS (Micro-Electro-Mechanical Systems), a method of fabricating silicon structures at micro sizes. The main benefits are speed, high capacity, and low power draw. Of course, probe memory will have competition from flash-magnetic hybrid drives.
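To make the parallel read/write idea concrete, here is a minimal Python sketch of a 32 x 32 tip array distributing a byte stream across its tips. Only the array geometry comes from the description above; the ProbeArray class, its round-robin bit layout, and the notion that each tip owns one patch of the medium are illustrative assumptions, not a real device interface.

```python
# Toy model of a probe-memory tip array (illustrative only; not a real
# device driver). Geometry follows the 32 x 32 arrangement described
# above; everything else is an assumption made for the sketch.

ARRAY_SIZE = 32  # tips per side, per the 32 x 32 arrays described above

class ProbeArray:
    def __init__(self, size=ARRAY_SIZE):
        self.size = size
        # Each tip stores the bits for its own patch of the medium.
        self.medium = [[[] for _ in range(size)] for _ in range(size)]

    def write(self, data: bytes):
        """Spread a byte string across all tips so they work in parallel.

        A real controller would move the MEMS scanner once and let every
        tip write simultaneously; here we simply interleave the bits.
        """
        bits = [(byte >> i) & 1 for byte in data for i in range(8)]
        for n, bit in enumerate(bits):
            tip = n % (self.size * self.size)   # round-robin over tips
            row, col = divmod(tip, self.size)
            self.medium[row][col].append(bit)

    def read(self) -> bytes:
        """Reassemble the bit stream by visiting tips in the same order."""
        total_bits = sum(len(cell) for row in self.medium for cell in row)
        bits = []
        for n in range(total_bits):
            tip = n % (self.size * self.size)
            row, col = divmod(tip, self.size)
            bits.append(self.medium[row][col][n // (self.size * self.size)])
        # Pack the bits (least-significant first) back into bytes.
        data = bytearray()
        for i in range(0, len(bits), 8):
            data.append(sum(b << j for j, b in enumerate(bits[i:i + 8])))
        return bytes(data)

array = ProbeArray()
array.write(b"nano")
assert array.read() == b"nano"
```

The round-robin layout is just one way to show the point: because every tip holds part of the stream, a single mechanical sweep services all 1,024 tips at once, which is where the speed and power advantages come from.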

'There are issues with flash memory for data retention and write-ability, so the industry is looking for an eventual replacement,' says David Szabados, product manager at Seagate.

Of course, other memory technologies are on the horizon, such as holographic storage - currently a write-once option for long-term archiving - and phase-change memory, which switches a material between a crystalline and an amorphous state and offers many of the same benefits as probe memory.
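As a loose illustration of how phase-change memory stores bits, the toy model below treats each cell as switchable between a crystalline (low-resistance) state and an amorphous (high-resistance) state, read back by measuring resistance. The class, the resistance figures, and the read threshold are invented for the sketch; only the two-state switching principle comes from the text.

```python
# Toy phase-change memory cell (illustrative only). A real cell heats a
# chalcogenide glass with electrical pulses to switch states; the
# resistance values below are invented for the sketch.

LOW_RES, HIGH_RES = 1_000, 1_000_000   # ohms: crystalline vs amorphous

class PhaseChangeCell:
    def __init__(self):
        self.state = "amorphous"        # high resistance, read as 0

    def set_bit(self, bit: int):
        # A long, moderate pulse crystallises the material (write 1);
        # a short, intense pulse melts and quenches it amorphous (write 0).
        self.state = "crystalline" if bit else "amorphous"

    def read_bit(self) -> int:
        # Reading measures resistance without disturbing the state.
        resistance = LOW_RES if self.state == "crystalline" else HIGH_RES
        return 1 if resistance < 100_000 else 0

cell = PhaseChangeCell()
cell.set_bit(1)
print(cell.read_bit())   # -> 1
cell.set_bit(0)
print(cell.read_bit())   # -> 0
```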

Eye implants

It was once a common view in medicine that there was no therapy for photoreceptor degeneration, a common cause of blindness. At the University of Missouri-Columbia in the US, researchers have developed silicon microphotodiode arrays - currently being tested in house cats - to show that a technological solution could restore at least partial vision in humans.

Researchers are primarily using Abyssinian and Persian breeds because those animals' eyes are remarkably similar in shape to the human eye, and because photoreceptor degeneration is a relatively common ailment that starts to impair vision soon after the cat's birth.

Researchers insert the microchip into the sclera, the tough outer layer of the cat's eyeball, after making two microscopic incisions near the retina. The microchip stimulates vision pathways to the brain by directly interacting with neuron receptors in the eye, and by upregulating the growth factors that protect the dying photoreceptors.

Researchers test retinal function, fundus appearance, and retinal morphology to find out whether the cat's vision has improved. So far, the results have been promising: researchers have shown that the microphotodiodes are stimulating electrical activity in the visual pathways, and retinal degeneration has slowed.

'Each of the microphotodiodes act independently like a solar cell - when light hits the diode it produces an electrical current,' says Machelle T. Pardue, from the Atlanta Veterans Administration Centre in Georgia, a co-researcher on the project. 'The diodes react in a gradient fashion, such that more light creates greater levels of current. This electrical activity from the retinal prosthetic may have the added benefit of slowing retinal degeneration while also restoring vision.'

Each implant contains hundreds of thousands of microphotodiodes, microscopic light-sensitive diodes that convert incoming light into an electrical impulse.
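As a rough picture of the gradient behaviour Pardue describes - more light producing more current, diode by diode - the sketch below maps a grid of light intensities onto a grid of photocurrents. The linear response, the responsivity figure, and the saturation point are assumptions made for the example, not measured characteristics of the implant.

```python
# Rough sketch of the gradient response described above: each diode
# converts incident light into current, with brighter light producing
# more current. The linear response and saturation values are
# illustrative assumptions, not measured device characteristics.

def diode_current(intensity: float,
                  responsivity: float = 0.5,   # assumed current per unit light
                  saturation: float = 1.0) -> float:
    """Return the photocurrent for one microphotodiode.

    Current rises linearly with light intensity until the diode
    saturates, mimicking a solar cell's gradient behaviour.
    """
    return min(intensity * responsivity, saturation)

def array_response(image: list[list[float]]) -> list[list[float]]:
    """Map a grid of light intensities to a grid of currents.

    Each diode acts independently, so the 'electrical image' passed to
    the visual pathways mirrors the pattern of incident light.
    """
    return [[diode_current(px) for px in row] for row in image]

# A bright spot on a dim background produces a matching current pattern.
scene = [[0.1, 0.1, 0.1],
         [0.1, 1.8, 0.1],
         [0.1, 0.1, 0.1]]
print(array_response(scene))
```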

It's interesting to note that, while researchers understand how the microchip affects vision, and the implant could be used with humans at some point, the complex process of how the brain transmits and reads images is still a mystery.

Because of that, Pardue explained, it seems unlikely a microchip could ever be inserted into a human eye so that the viewed image could be displayed on a computer monitor for everyone to see.

Cloaking devices

While it might seem an unlikely practical computing endeavour, cloaking is a concept that scientists believe is not only possible, but useful in modern society.

In science fiction, ships in Star Trek can attack another vessel without notice, or a man can wear a cloak and sneak into a party without paying the entry fee. Yet "cloaking" is more than just obscuring a physical object. It can relate to how electromagnetic waves move around minuscule objects, especially in lasers and medical equipment.

At the Ames Laboratory in Ames, Iowa (US), researchers have already demonstrated how a 2D object can appear invisible and deflect the microwave emissions around it. The concept works the same as the optical illusion that occurs in a river when a rock seems to disappear from a certain vantage point, except that the Ames experiments are at nanoscale sizes.

Metamaterials - engineered structures patterned at scales smaller than the wavelengths they manipulate, giving them electromagnetic properties no natural material has - channel microwaves around an object at GHz frequencies.
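One standard way to quantify how such structures redirect waves is the effective refractive index n = sqrt(eps * mu): when a metamaterial's permittivity (eps) and permeability (mu) are both negative, the index itself turns negative and waves bend the 'wrong' way around the normal, which is the effect cloaking designs exploit. The snippet below computes this textbook relation; the sample values are hypothetical, not figures from the Ames experiments.

```python
# Effective refractive index from relative permittivity (eps) and
# permeability (mu). When both are negative, the index is negative and
# light refracts on the same side of the normal -- the behaviour that
# metamaterial cloaking exploits. Sample values are hypothetical.
import cmath

def refractive_index(eps: complex, mu: complex) -> complex:
    """Return n = sqrt(eps * mu), taking the negative branch when both
    eps and mu are negative (a double-negative metamaterial)."""
    n = cmath.sqrt(eps * mu)
    if eps.real < 0 and mu.real < 0:
        n = -n
    return n

print(refractive_index(2.25, 1.0))    # glass-like medium: n = 1.5
print(refractive_index(-2.0, -1.0))   # double-negative metamaterial: n ~ -1.41
```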

In the future, it may be possible to use the same technique for visible objects, and could aid in medical diagnosis using more powerful metamaterial lasers that can find molecules previously undetectable in humans.

'Metamaterials can be used to make miniaturised devices,' says Costas Soukoulis, a researcher on the project. 'In principle, they could be manufactured into antennas and waveguides that are 100 times smaller and much lighter than those of today, transforming hardware design in mobile communications, aeronautical systems and other strategic sectors. Metamaterials can also produce materials that are totally non-reflecting over certain frequency ranges, regardless of the angle of incident radiation.'

Soukoulis explained that conventional lenses have one major hindrance: they cannot resolve detail finer than the wavelength of light itself, because they capture only the far-field components of the radiation and miss the near-field components, which decay quickly.
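The sketch below illustrates why that near-field information disappears: detail finer than the wavelength travels as evanescent waves whose amplitude falls off exponentially with distance from the object. The decay formula is the standard one for evanescent fields; the wavelength and feature sizes are example values chosen for the demonstration.

```python
# Why sub-wavelength detail is lost: spatial features finer than the
# wavelength propagate as evanescent (near-field) waves whose amplitude
# decays exponentially with distance from the object. The wavelength
# and feature sizes below are example values, not from the article.
import math

def near_field_amplitude(feature_size: float, wavelength: float,
                         distance: float) -> float:
    """Relative amplitude of an evanescent wave at a given distance.

    For detail finer than the wavelength (feature_size < wavelength),
    the transverse wavenumber exceeds the free-space wavenumber and the
    wave decays as exp(-kappa * z) instead of propagating.
    """
    k0 = 2 * math.pi / wavelength           # free-space wavenumber
    kx = 2 * math.pi / feature_size         # wavenumber of the detail
    if kx <= k0:
        return 1.0                          # propagating wave: no decay
    kappa = math.sqrt(kx**2 - k0**2)        # evanescent decay constant
    return math.exp(-kappa * distance)

wavelength = 500e-9  # green light, 500 nm
for z in (50e-9, 200e-9, 500e-9):
    # 100 nm detail (finer than the wavelength) fades within a fraction
    # of a wavelength, which is why a conventional lens never sees it.
    amp = near_field_amplitude(100e-9, wavelength, z)
    print(f"{z * 1e9:4.0f} nm away: relative amplitude {amp:.2e}")
```

A metamaterial 'superlens' works by amplifying these decaying components before they vanish, which is why it can, in principle, image far below the conventional diffraction limit.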

Lenses that use metamaterials could produce much finer resolution. Already, Imperial College London is experimenting with how metamaterial lenses could be used to vastly improve the scanning ability of Magnetic Resonance Imaging (MRI) machines.

With the smaller size, new microscopic chips could communicate using nanometre-scale antennas, while using less energy than conventional wireless transmissions. 'Metamaterials could have applications in almost any field that exploits electromagnetic radiation,' says Soukoulis.
