Low power and communications were dominant themes at the fourth annual Intel Research Day, held at Intel Corp’s Santa Clara, California headquarters.

Chief technology officer Justin Rattner said Intel’s goal over the next four to five years was to create silicon technology that enables 10 times the energy efficiency and 10 times the performance of existing chips.

“Energy efficiency underlies just about everything we do these days,” Rattner said. “Getting that illusion of continuous operation while saving energy is the big challenge here.”

Intel, which has been late to the energy-efficiency silicon party, spent the past four years developing its new low-power architecture, called Core, Rattner noted.

“There’s been some sense that this has been a very reactive effort,” he said. “What you see today is not something Intel hashed out in the last eighteen months or two years.”

Specifically, Intel is beginning to rally industry support for its prototype self-refresh display technology, which enables a desktop or laptop computer to power down yet still retain its desktop display.

The technology tricks a monitor’s LCD panel into retaining an active picture. A machine’s LCD controller normally requires a new graphics frame 60 times per second. Intel has found a way to capture just one of those frames and copy it to memory.

When the machine powers down, that single frame is then continually sent back to the LCD controller to update it and keep the display image active. This promises to make the computer’s power-down state transparent to users, said Intel researcher James Song.
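To make the mechanism concrete, here is a minimal Python sketch of how such a self-refresh loop could behave; the framebuffer and LCD-controller interfaces are hypothetical placeholders, and this is only an illustration of the idea described above, not Intel’s implementation.

    import time

    REFRESH_HZ = 60  # LCD controllers typically expect a new frame 60 times per second

    def enter_self_refresh(framebuffer, lcd_controller, powered_down):
        # Capture a single frame from the normal stream before the host sleeps.
        saved_frame = framebuffer.read_current_frame()   # hypothetical interface
        # Replay that same frame to the controller while the host is powered down,
        # so the panel keeps showing the last desktop image instead of garbage.
        while powered_down():
            lcd_controller.write_frame(saved_frame)       # hypothetical interface
            time.sleep(1.0 / REFRESH_HZ)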

Without the technology, the machine’s display would look corrupted when in power-down mode, which may, understandably, alarm users.

During the past few months, Intel implemented this LCD memory function in a field-programmable gate array, or FPGA, but this is still just a prototype.

“Ideally, this would be adopted into display silicon,” said Intel senior electrical engineer Jeremy Lees. Intel hopes to license the technology to display makers, he said, but how fast it would reach market depends on timing and industry cooperation.

Intel also is working to replace wired enterprise infrastructure with distributed virtualization over mesh networks, in a project it calls OverMesh.

Of course, enterprises could use OverMesh in conjunction with wired infrastructure too, but the goal is to be able to scale, tweak and manage infrastructure ad hoc, said Intel platform architect Dilip Krishnaswamy.

Roughly, it works like this: each enterprise desktop or laptop computer is connected via Virtual Machine Monitor software and can communicate both peer to peer and through the distributed network.

Various lightweight overlay networks could then be used for dedicated applications, such as security, VoIP or storage, Krishnaswamy said.

Machines in relatively close proximity would form Group Forming Networks, or small sub-networks, to help balance the load of the 802.11s mesh network.

The system also differentiates between intra-mesh and extra-mesh traffic in order to manage traffic loads and priorities, he said.
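As a rough illustration of the traffic handling Krishnaswamy described, the Python sketch below shows one way a node might separate intra-mesh from extra-mesh packets and order them by overlay; the overlay names and priorities here are assumptions for the example, not details of OverMesh itself.

    from dataclasses import dataclass

    # Illustrative overlay priorities (lower value is served first); assumed values.
    OVERLAY_PRIORITY = {"voip": 0, "security": 1, "storage": 2}

    @dataclass
    class Packet:
        src: str       # mesh node id or external address
        dst: str
        overlay: str   # "voip", "security", "storage", ...

    def is_intra_mesh(packet, mesh_nodes):
        # Traffic counts as intra-mesh only when both endpoints are mesh nodes.
        return packet.src in mesh_nodes and packet.dst in mesh_nodes

    def schedule(packets, mesh_nodes):
        # Serve intra-mesh traffic before extra-mesh traffic, then by overlay priority.
        return sorted(
            packets,
            key=lambda p: (not is_intra_mesh(p, mesh_nodes),
                           OVERLAY_PRIORITY.get(p.overlay, 99)),
        )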

Within three to four years, Intel plans to have some of this “networking at the edge” technology ready, Krishnaswamy said.

By then, the company plans to integrate multiple reconfigurable radio technologies into silicon, including WiFi, Ultra-Wideband (UWB) and MIMO (multiple-input, multiple-output) antennas, as well as cellular and Bluetooth. WiMAX also is an option, he added.

CTO Rattner said that by 2009, notebooks might also have 3G, direct video broadcast and GPS radios, for a total of as many as nine different radios in a single mobile platform.

There also is the physical challenge of packing so many radios and antennas into the ever-shrinking form factor of handheld computers. “We need to arrange these radios to work simultaneously, or apparently simultaneously … that requires work,” Rattner said.

“What we’d like to have in four to five years is a mostly digital, a digitally enhanced, radio that supports all or nearly all these [wireless] protocols in a highly efficient fashion and in a small form factor,” he said.

Another notable project is Intel’s so-called Corroboration Stealth Worm silicon. Fundamentally, Intel has found a way to embed in silicon the ability to find and stop worms that try to cloak themselves in background network traffic.

The company has developed a chipset that detects weak anomalies in network traffic to essentially act as damage control in the case of an attack.

To limit false positives, anomalies that appear to be a stealth worm but turn out to be nothing nasty at all, the machines in the network would corroborate with one another to determine true worm-like behavior.

The chipset would not replace existing security software, but rather enable infected machines to be automatically quarantined to stop the worm from wiggling throughout the network.
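The following Python sketch illustrates the corroboration idea in miniature, assuming made-up anomaly scores, a threshold and a quorum; it is not Intel’s algorithm, just a sketch of how peer agreement could gate a quarantine decision.

    ANOMALY_THRESHOLD = 0.8   # local score above this looks worm-like (assumed value)
    QUORUM = 3                # peers that must agree before acting (assumed value)

    def corroborate(reports):
        # reports maps peer node ids to the anomaly score each observed for a suspect.
        agreeing = sum(1 for score in reports.values() if score >= ANOMALY_THRESHOLD)
        return agreeing >= QUORUM

    def maybe_quarantine(suspect, reports, quarantine):
        # Isolate the suspect host only when enough independent observers concur,
        # which limits false alarms from any single noisy detector.
        if corroborate(reports):
            quarantine(suspect)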

Intel has come up with this nifty technology just in the past year or so, said senior research scientist Eve Schooler. And the algorithms are ready to go.

But just when it will make it to market is uncertain, mostly because how it would best be implemented in the enterprise is not yet known, she said.

For example, to ensure the hardware keeps up with the oft-changing lives of stealth worms, adaptive detectors, perhaps in the form of firmware, would be required. The research team said some of these algorithms might lie in partitions.

And it’s not yet known how an IT administrator would actually implement the technology in the enterprise.

Collaboration is another key research focus for the chipmaker. Gene Meieran, director of manufacturing strategic support in Intel’s tech and manufacturing group, reckons collaboration is one of the most important things Intel could work on.

“In my view … collaboration is what this whole platform business is about,” Meieran said.

The company’s integrated 3D collaboration desktop, called Miramar, is slated for a small internal trial at Intel later this year, said John David Miller, a key contributor on the project.

Intel began work on Miramar in 1997, but it was shelved in 2000, only to be picked up again by the company last year.