
DELL’S GREEN IT VISION & ITS IMPACT ON DATA CENTRES & HPC

JIM HEARNDEN
ENTERPRISE TECHNOLOGIST


AGENDA

• Dell’s Green Direction
• The Issues
• Cooling Solutions
• Free Air Cooling
• Monitoring


DELL’S COMMITMENT TO ENVIRONMENTAL RESPONSIBILITY

“Dell intends to be the greenest IT company on the planet by the end of 2008.”

Michael Dell
DELL CEO


DELL’S COMMITMENT TO ENVIRONMENTAL RESPONSIBILITY

• In November, Dell announced that its corporate headquarters campus, with 17,000 employees, is powered with 100 percent ‘green’ energy; in fact, it’s methane from a waste dump.
• We also achieved the target of being entirely carbon neutral across the globe five months early, in August 2008.


DELL’S COMMITMENT TO ENVIRONMENTAL RESPONSIBILITY

• The company’s carbon intensity (CO2 emissions/revenue) is among the lowest of the Fortune 50 and less than half that of its closest competitor.


THE REGENERATION

www.regeneration.org



HOW TO GO GREEN IN AN HPC ENVIRONMENT

And SAVE MONEY!


THE ISSUES


DATA CENTER ENERGY CONSUMPTION IS ACCELERATING

[Chart: Projected Data Center Energy Use Under Five Scenarios, in billion kWh/year, 2000-2011. Call-outs: 0.8% of total U.S. electricity use, 1.2% of total U.S. electricity use, and 2.9% of projected total U.S. electricity use. Scenarios: Historical Trends, Current Efficiency Trends, Improved Operation, Best Practice, State-of-the-Art.]

Source: EPA Report to Congress on Server and Data Center Energy Efficiency; August 2, 2007


IT IS RESPONSIBLE FOR 2% OF THE WORLD’S CO2 EMISSIONS

Source: Gartner, Green IT – A New Industry Shockwave, December 2007


GLOBAL ELECTRICITY PRICES HAVE INCREASED 56% SINCE 2002

Source: Energy Information Administration, http://www.eia.doe.gov/emeu/international/elecprii.html


DATA CENTRES; THE COSTS

• Cost of running the Data Centre:
• An average-build PE1950 1U dual-core server in a large server farm costs €347 per annum, just to run the server!
• This equates to 1,525 kg CO2 per annum, based on 0.527 kg CO2 per kWh (UK Government figures).
• The Energy Smart PE1950 saves between 12 and 16% of this cost, or around €48 per annum.
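The slide’s figures are mutually consistent; a minimal back-of-the-envelope check in Python, assuming the €347 cost and 1,525 kg CO2 refer to the same annual energy for one server (the implied electricity price is derived here, not quoted in the deck):

```python
# Back-of-the-envelope check of the PE1950 running-cost figures on this slide.
ANNUAL_COST_EUR = 347.0     # stated cost to run one PE1950 per annum
ANNUAL_CO2_KG = 1525.0      # stated CO2 per annum
CO2_PER_KWH_KG = 0.527      # stated UK Government emission factor

annual_kwh = ANNUAL_CO2_KG / CO2_PER_KWH_KG       # ~2,894 kWh/year
average_draw_w = annual_kwh * 1000 / 8760         # ~330 W continuous draw
implied_price = ANNUAL_COST_EUR / annual_kwh      # ~0.12 EUR/kWh (implied, not quoted)
saving_low = ANNUAL_COST_EUR * 0.12               # stated 12-16% Energy Smart saving
saving_high = ANNUAL_COST_EUR * 0.16

print(f"Annual energy:  {annual_kwh:,.0f} kWh")
print(f"Average draw:   {average_draw_w:.0f} W")
print(f"Implied price:  {implied_price:.3f} EUR/kWh")
print(f"Energy Smart saving: EUR {saving_low:.0f}-{saving_high:.0f} per year")  # brackets the slide's ~EUR 48
```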


POWER AND COOLING ARE THE TOP LIMITATIONS TO ADDING DATA CENTER COMPUTE CAPACITY

Source: IDC, Datacenter Power and Cooling Trends, June 2007


POWER AND COOLING ACCOUNT FOR APPROXIMATELY 60% OF DATA CENTER ENERGY CONSUMPTION

[Chart: Distribution of Power Consumption in a Typical Data Center]

Source: Dell PS3 Data Center Study, Fall 2006


SERVER ENERGY EFFICIENCY IS ABOUT ALL OF THE COMPONENTS

[Chart: Component-Level Power Consumption in a Typical Server]

Source: Dell PS3 Data Center Study, Fall 2006


THE SOLUTIONS


DIFFERENCES BETWEEN THE HPC & THE TYPICAL CORPORATE “MIXED” DATA CENTRES

HPC
• Higher power density, circa 10-25 kW per rack
• Mainly high-density racks in the DC, few less than 7-8 kW
• Much higher utilisation rates, 80-100%
• Higher rack fill level, 80-100%
• Not usually possible to virtualise
• Jobs typically run for many hours, possibly days.

MIXED DC
• Low power density, typically 3-7 kW per rack
• Mix of rack density, some running at 1-2 kW
• Typical utilisation rates of 1-20%
• Rack fill levels typically 40-65%
• Most corporates have embraced virtualisation to some extent
• Overnight utilisation is rare.


DATA CENTER INFRASTRUCTURE
DATA CENTER ENERGY EFFICIENCY - OUTSIDE THE BOX

Consumption Technologies
• Server Power Management
• Data Center Automation

Cooling
• Airflow Mgmt
• Liquid Cooling

Server
• PowerEdge
• Blades

Storage
• PowerVault
• PowerConnect
• EQL

Power Delivery Subsystems
• UPS
• PDU
• AC/DC – DC/DC


MULTIPLIER EFFECT OF TECHNOLOGY
INSIDE-OUT ENERGY EFFICIENCY

Typical split of data centre power: Cooling Load 30%, Power Load 25%, IT Load 45%.

OUTSIDE-IN energy savings: reducing 10 kW of cooling load yields total DC savings of 10 kW.

INSIDE-OUT energy savings: reducing 10 kW of IT load yields:
• IT Load: 10 kW
• Power Distribution: 3 kW
• Cooling: 7 kW
• TOTAL DC SAVINGS: 20 kW (2x)

THE INSIDE-OUT APPROACH (REDUCE IT LOAD) DOUBLES ENERGY SAVINGS
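A minimal sketch of the inside-out multiplier, using the per-kW overhead factors read off this slide (0.3 kW of distribution loss and 0.7 kW of cooling per kW of IT load removed); the function itself is illustrative:

```python
def total_dc_savings(it_load_reduction_kw: float,
                     distribution_factor: float = 0.3,  # kW of distribution loss per kW of IT load (from slide)
                     cooling_factor: float = 0.7) -> float:  # kW of cooling per kW of IT load (from slide)
    """Total data-centre saving from removing IT load 'inside-out'.

    Removing IT load also removes the power-distribution losses and the
    cooling energy that load would otherwise have required.
    """
    return it_load_reduction_kw * (1.0 + distribution_factor + cooling_factor)

# Example from the slide: shedding 10 kW of IT load saves 20 kW overall (2x),
# whereas shedding 10 kW of cooling load 'outside-in' saves only 10 kW.
print(total_dc_savings(10.0))  # 20.0
```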


DATA CENTER PRODUCTIVITY RESULTS FROM IT PRODUCTIVITY & INFRASTRUCTURE EFFICIENCY

• IT Productivity…
  – Increases with each generation
  – Increases with better utilization
  – Grows exponentially as long as Moore’s law holds
• Best case for Infrastructure Efficiency…
  – No power required for cooling
  – No losses in power distribution
  – Industry will succeed in reaching point of diminishing returns quickly
• Data Center Productivity…
  – Combination of Infrastructure Efficiency and IT Productivity (written as a simple identity below)
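The “combination” in the last bullet can be written as a simple identity (a restatement for clarity, not a formula taken from the deck): productivity factors into an IT-productivity term times an infrastructure-efficiency term.

```latex
\text{Data Center Productivity}
  = \frac{\text{Useful Work}}{\text{Total Facility Energy}}
  = \underbrace{\frac{\text{Useful Work}}{\text{IT Energy}}}_{\text{IT Productivity}}
    \times
    \underbrace{\frac{\text{IT Energy}}{\text{Total Facility Energy}}}_{\text{Infrastructure Efficiency}}
```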


DATA CENTRES; THE COST!

• Chilled water is approx. 15% more efficient than DX (gas).
• Closed cooling (cooling the rack & its contents) is 30% more efficient than open cooling (room cooling) (ASHRAE).
• Free air cooling can save up to 40% of the overall cooling costs, or 10-12% of the overall DC power draw (with cooling at roughly 30% of total DC load, 40% of that is about 12%).
• Gartner estimates that if power prices continue to rise, then by 2010 the cost of the power for the DC will EXCEED the cost of the hardware.


DATA CENTRES; THE COST!

• The optimum intake temperature for the hardware is 25 °C, the intake target used in the slides that follow.


SERVER POWER CONSUMPTION RISES AS INTAKE TEMPERATURE INCREASES

[Chart: Typical power requirements for fans in a 1U server – power required to run internal fans vs. intake temperature in °C]

Source: Dell Labs, May 2008
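The chart is not reproduced here, but the shape it reports is consistent with the fan affinity laws: fan speed ramps up with intake temperature, and fan power grows roughly with the cube of fan speed. A purely illustrative model in Python; the ramp thresholds, minimum speed, and 40 W full-speed figure are assumptions, not Dell Labs data:

```python
def fan_power_watts(intake_temp_c: float,
                    max_fan_power_w: float = 40.0,  # assumed full-speed fan power for a 1U server (illustrative)
                    ramp_start_c: float = 25.0,     # assumed temperature at which fans begin to ramp
                    ramp_end_c: float = 35.0,       # assumed temperature at which fans reach full speed
                    min_speed_fraction: float = 0.4) -> float:
    """Illustrative 1U-server fan power model based on the fan affinity laws.

    Fan speed ramps linearly with intake temperature between ramp_start_c and
    ramp_end_c; fan power scales with the cube of fan speed. Numbers are
    placeholders, not measurements.
    """
    if intake_temp_c <= ramp_start_c:
        speed = min_speed_fraction
    elif intake_temp_c >= ramp_end_c:
        speed = 1.0
    else:
        t = (intake_temp_c - ramp_start_c) / (ramp_end_c - ramp_start_c)
        speed = min_speed_fraction + t * (1.0 - min_speed_fraction)
    return max_fan_power_w * speed ** 3

for temp in (20, 25, 28, 31, 35):
    print(f"{temp} °C -> {fan_power_watts(temp):5.1f} W")
```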


OPTIMISE DATA CENTER TEMPERATURE FOR ENERGY SAVINGS

• 13% saving due to the facility temperature set point
• 2-5% efficiency increase by picking a different solution


BEST PRACTICE: RAISING DATA CENTER SET POINT LEADS TO GREATER EFFICIENCY

1. Chiller cooling efficiency improves with increased water temperature (a rough illustration follows this list)
2. Increased water temperature enables an increase in temperature of the air handler coil
3. Incorporating Variable Frequency Drive blowers into air handlers enables greater efficiency
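A rough illustration of point 1. The ~2% chiller-energy reduction per °C of chilled-water set-point increase is an assumed rule-of-thumb value chosen for the example only; it does not come from this deck, and real chillers vary:

```python
def chiller_energy_after_setpoint_raise(baseline_chiller_kw: float,
                                        setpoint_increase_c: float,
                                        saving_per_c: float = 0.02) -> float:
    """Estimated chiller power after raising the chilled-water set point.

    Assumes an illustrative ~2% reduction in chiller energy per 1 °C increase
    in chilled-water supply temperature (rule of thumb, not a Dell figure).
    """
    return baseline_chiller_kw * (1.0 - saving_per_c) ** setpoint_increase_c

# Example: raising the set point by 3 °C on an 85 kW chiller
print(f"{chiller_energy_after_setpoint_raise(85.0, 3.0):.1f} kW")  # ~80.0 kW
```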


AVERAGE 67% GREATER PERFORMANCE / WATT OVER SIX GENERATIONS

Source: Dell Labs, May 2008


EFFICIENCY DIFFERENCES BETWEEN AC AND DC TECHNOLOGIES AVERAGE +/- 3.5%

Source: Quantitative Analysis of Power Distribution Options, The Green Grid, November 2008


COOLING SOLUTIONS


INFRASTRUCTURE CONSULTING SERVICES
COMPUTATIONAL FLUID DYNAMIC (CFD) ANALYSIS

• Careful measurement of temperature and air flow as well as power consumption
• Analysis to determine the best placement of racks, cool or hot aisles, and vents
• Create a baseline model of your existing power and cooling use


DATA CENTRES; THE DETAILS


TRADITIONAL STYLE DATA CENTRE

Source: Dell Labs, May 2008


POWER AND COOLING ISSUES ARISE AS HOT AND COLD AIR MIX

[Diagram: racks at 3 kW, 6 kW, and 10 kW per rack]


REDUCE POWER CONSUMPTION UP TO 15% WITH APC InRow CLOSE-COUPLED COOLING

• Closely couples cooling with the heat load, preventing exhaust air recirculation
• Ensures equipment temperatures are constantly held to set point conditions
• Lowers total cost of ownership
• Principal or supplemental cooling
• Interoperable with Dell 42U racks


OVERVIEW - INROW COOLING AIRFLOW

• Heat captured and rejected to chilled water
• Hot aisle air enters from the rear, preventing mixing
• Cold air is supplied to the cold aisle
• APC InfraStruXure InRow RC can operate on a hard floor or raised floor.


REDUCE POWER CONSUMPTION UP TO 30% BY RIGHT-SIZING YOUR DATA CENTER WITH MODULAR RIGHT-SIZED COMPONENTS

– Modular and scalable power and cooling
– Pay only for what you need - when you need it
– Eliminate waste by avoiding over-sizing and under-utilizing your data center


DATA CENTRES; BETTER SOLUTIONS




PARTNER COOLING SYSTEMS IMPROVE DATA CENTER THERMALS


LIEBERT XD SOLUTIONS

• Up to 30% more efficient than traditional perimeter cooling
• Practical, scalable & affordable

[Images: Liebert XDV and Liebert XDO units]


DATA CENTRES; ALTERNATIVE METHODS

WHAT IS FREE AIR COOLING?


DATA CENTRES; BETTER SOLUTIONS

• In Northern Europe, for a greater part of the year the outside air temperature is lower than the temperature required for the Data Centre.
• Free air cooling utilises this lower outside air temperature rather than running chiller plant.


DATA CENTRES; BETTER SOLUTIONS

• Typically you require a delta of 7 °C between the outside temperature & the Data Centre temperature.
• Therefore, to achieve 25 °C at the racks requires an outside temperature of 18 °C or lower.
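A minimal check of this rule in Python; the 25 °C rack target and 7 °C delta come from the deck, the function itself is illustrative:

```python
def free_cooling_available(outside_temp_c: float,
                           rack_intake_target_c: float = 25.0,  # from the deck
                           approach_delta_c: float = 7.0) -> bool:  # from the deck
    """True when outside air is cold enough to meet the rack intake target
    without running the chiller plant (outside temp <= target - delta)."""
    return outside_temp_c <= rack_intake_target_c - approach_delta_c

print(free_cooling_available(16.0))  # True  (16 °C <= 18 °C)
print(free_cooling_available(21.0))  # False (chillers needed)
```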




DATA CENTRES; BETTER SOLUTIONS

• Most manufacturers of close-coupled cooling state that they achieve a typical ±1 °C temperature control across the full height of the rack.
• Therefore, being able to run closer to the 25 °C intake target & guaranteeing no hot spots means utilising more free air cooling.


MONITORING


MONITOR WHAT IS HAPPENING

Information is power; you cannot make changes blind (a minimal polling sketch follows the list below).

• Monitor the incoming power to the data centre
• Monitor the power to the individual racks
• Monitor the power to the individual servers
• Monitor the temperature at rack level
• Monitor the humidity at the individual racks
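A minimal polling sketch for the rack-level power point above, using pysnmp. The PDU host name, SNMP community, and OID are placeholders; the actual per-rack power OID depends on the PDU vendor's MIB, so substitute the one for your hardware:

```python
# Minimal rack-PDU power poll over SNMP (pysnmp). Placeholder host, community
# and OID -- substitute the power OID from your PDU vendor's MIB.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

PDU_HOST = "rack-pdu-01.example.com"        # placeholder host
POWER_OID = "1.3.6.1.4.1.99999.1.2.3.0"     # placeholder OID: total rack power in watts

error_indication, error_status, error_index, var_binds = next(
    getCmd(SnmpEngine(),
           CommunityData("public", mpModel=1),      # SNMP v2c, placeholder community
           UdpTransportTarget((PDU_HOST, 161)),
           ContextData(),
           ObjectType(ObjectIdentity(POWER_OID))))

if error_indication or error_status:
    print(f"SNMP poll failed: {error_indication or error_status.prettyPrint()}")
else:
    for oid, value in var_binds:
        print(f"{oid} = {value} W")
```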


SERVER POWER MONITORING: IT NECESSITIES AT NO CHARGE

OpenManage™ and IT Assistant alerting
• SNMP alerts sent upon user-defined power thresholds

IT Assistant reporting
• Amperage per power supply
• Energy consumed per server
• Peak power and amperage per server
• Aggregate energy consumed by server group

[Screenshot call-outs: average/max/min for the given system; chart view of the collected data; aggregate consumption with threshold alerting option; View Chart / Export data menu]


APC INFRASTRUXURE PROVIDES A CONSOLIDATED VIEW OF THE PHYSICAL INFRASTRUCTURE LAYER

APC InfraStruXure Central, Enterprise Edition
• Real-time status monitoring of energy consumption
• Instant critical event notification
• Improved response time
• Extensible platform grows with your business
• Advanced security cameras & monitoring devices
• Eliminates the guesswork & helps improve utilization


SO TO SUMMARISE:

• Plan thoroughly
• Energy-efficient servers
• Monitor & analyse
• Data center best practices
• Advanced cooling technologies


ANY QUESTIONS?


IN-ROW COOLING EFFICIENCY COMPARISON

Cooling Infrastructure Power Consumption

Cooling Component                 IRAH*     CRAH*     CAHU*     Units
AHU Fan Power                     20.4      55.0      52.0      kW
Chilled Water Pump Power          6.8       6.9       7.0       kW
Chiller Power                     81.9      87.3      86.8      kW
Condenser Pump Power              9.3       9.3       9.3       kW
Cooling Tower Power               11.0      11.7      11.6      kW
Total Cooling Power Consumed      129.5     170.2     166.7     kW
Efficiency Metric (equation 1)    0.26      0.34      0.33      kW/kW
Annual Cooling Operating Cost**   113,442   149,095   146,029   $ USD
COP                               5.79      4.41      4.50
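The derived rows of the table can be reproduced from the published totals. The ~500 kW IT load and $0.10/kWh electricity price over 8,760 hours are inferred from the published numbers, not stated on the slide; a sketch under those assumptions:

```python
# Check the derived rows of the cooling-efficiency table against the published
# totals. IT load and electricity price are inferred assumptions, not slide data.
published_total_kw = {"IRAH": 129.5, "CRAH": 170.2, "CAHU": 166.7}
IT_LOAD_KW = 500.0        # assumed (inferred from the 0.26 / 0.34 / 0.33 kW/kW row)
PRICE_PER_KWH = 0.10      # assumed, USD (inferred from the annual-cost row)
HOURS_PER_YEAR = 8760

for name, total_kw in published_total_kw.items():
    efficiency = total_kw / IT_LOAD_KW                       # kW of cooling per kW of IT load
    annual_cost = total_kw * HOURS_PER_YEAR * PRICE_PER_KWH  # annual cooling energy cost
    print(f"{name}: efficiency {efficiency:.2f} kW/kW, annual cost ${annual_cost:,.0f}")
# IRAH: efficiency 0.26 kW/kW, annual cost $113,442
# CRAH: efficiency 0.34 kW/kW, annual cost $149,095
# CAHU: efficiency 0.33 kW/kW, annual cost $146,029
```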
