
Intel ramps up environmental efforts

But involvement in sustainable energy projects hard to balance against increasing data centre server and supercomputer power consumption

Chip-maker Intel, whose components power so many of the PCs, servers and other electronic devices blamed for the global increase in electricity consumption over the last decade, showcased a series of environmentally friendly initiatives at its Dublin manufacturing plant last week.

It also announced it would shortly be opening a new Energy and Sustainability Lab within the facility.

Current research and development projects sponsored by the company include power management and monitoring applications for the home and office, and plug-in wireless sensors that map specific appliances to energy usage patterns and transmit data to in-home displays.
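As a rough illustration of what those plug-in sensors do, the sketch below aggregates per-appliance power readings into the kind of usage summary an in-home display might show. All names and figures here are hypothetical, for illustration only — the article does not describe Intel's actual API or data format.

```python
from collections import defaultdict

# Hypothetical (appliance, watts) samples, as a plug-in sensor
# might report them once a minute over a short window.
readings = [
    ("kettle", 2000), ("fridge", 150), ("kettle", 0),
    ("fridge", 150), ("tv", 120), ("fridge", 140),
]

def usage_summary(samples):
    """Average power draw per appliance over the sample window."""
    totals, counts = defaultdict(float), defaultdict(int)
    for appliance, watts in samples:
        totals[appliance] += watts
        counts[appliance] += 1
    return {a: totals[a] / counts[a] for a in totals}

summary = usage_summary(readings)
print(summary)
```

A real deployment would of course stream readings wirelessly and timestamp them; the point is simply that mapping appliances to usage patterns is, at its core, lightweight aggregation of labelled power samples.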

Intel is also working with various sustainable energy suppliers to build IT-centric management, control and monitoring systems for environmentally focused applications, including electric vehicle charging and high-density networks of low-cost sensors able to monitor the world's oceans and predict the weather.

Dr Eddie O'Connor is CEO of Mainstream Renewable Power, an Irish company that builds offshore wind farms in various parts of the world. In 2001 the firm helped formulate the initial proposals for the European Offshore Supergrid, a high-voltage direct current (HVDC) electricity distribution network to which Intel is affiliated along with 32 other companies, including engineering giant Siemens.

"The new grid is festooned with IT. The power is made at sea, but it has to flow to where people are, so we have to have lots of data collected in different places to help us identify where the wind is blowing and route the power to where demand is highest, which will effect electricity price," he said. "Supercomputers are needed, and Intel has the type of industry foresight, chip design capabilities and architecture, and all the electronics needed to service this industry."

To their credit, Intel and other CPU manufacturers, including AMD and IBM, have made significant advances in reducing processor power consumption in recent years, primarily to improve the battery life in portable devices like laptops and smartphones, but also to cut the power footprint of data centre servers.

The motivation is not entirely altruistic: data centre expansion has accelerated at such a rate that many facilities are now sited according to the availability of local electricity supplies. Lower-power chips are needed to increase rack density and available processing power while reducing, or at least not increasing, heat output, which in turn cuts cooling costs.

Ongoing efforts by chip makers and electronics manufacturers to make computers and other devices progressively smaller and less power hungry may alleviate pressure on the world's available electricity suppliers to some extent, though even that seems unlikely if people simply end up using more devices. A bigger threat to sustainability could come from the fact that the more devices there are, the more back-end data centre infrastructure is required to store and process the information they generate.

Steve Pawlowski, chief technology officer (CTO) of Intel's datacentre and connected systems group and a senior fellow, said computers could shrink to the size of a keyring within the next ten years, while systems will become more context aware: instead of storing every bit of data generated, they will be intelligent enough to "pull out only the stuff we want and throw the rest away", which may at least mean data centres need less storage capacity.

But the biggest drain on power is likely to come from exascale supercomputers which, said Pawlowski, will be needed to prevent some areas of scientific research from stalling completely.

Exascale systems, expected by 2018, will deliver 10^18 floating point operations per second (one exaflop), around 1,000 times more processing power than the petascale supercomputers being built today.

"We need Exascale computing to solve crucial problems. Scientists are saying there is no way we can model a neuron cell, or a human brain or a full aircraft simulation model [using current systems]," said Pawlowski. "The first exaflop machine will be in the order of 52 megawatts [of power], then to get the thousand times performance increase it will be a 120-megawatt machine, which is much higher than anything available in the US today. We have to build a machine that is no more than 20 megawatt."

To put that into perspective, Global Switch's London 2 data centre built to a Tier III specification currently provides 45 megawatts of power and cooling for its entire infrastructure.

Whether scientific and research communities see all the computing power as essential or merely nice to have is a moot question.

Pawlowski highlighted specific applications that would benefit from exascale systems, including genomics research and weather forecasting. He also predicted that high-performance computing (HPC) will improve the efficacy of cancer research and treatment – via DNA mapping and therapy sequencing – tenfold within the next decade.

Professor Henry Markram, leader of the EU-funded Blue Brain project, also recognises the potential benefits of these vast systems. Markram is currently building a system that aims to simulate the human brain in order to better understand how it works and what goes wrong.

Today's supercomputers are only capable of building simulations of around 10,000 neurons – building a full model of all 100 billion neurons in the human brain, and mapping their connections, will require much more power.
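The gap those figures describe is easy to quantify: going from 10,000 simulated neurons to 100 billion is a ten-million-fold scale-up, before even counting the connections between them.

```python
# Scale gap between current neuron simulations and a full
# human brain model, using the figures quoted in the article.
simulated_now = 10_000            # neurons in today's simulations
human_brain = 100_000_000_000     # neurons in the human brain

scale_factor = human_brain // simulated_now
print(scale_factor)  # 10000000 — a ten-million-fold increase
```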

"That is an exascale problem that will push supercomputing to its extremes," he said. "The brain today uses 30 watts of power; we do not know how this works, but it could give us important insights into the efficient use of technology."
