2011 - Cooperative Institute for Research in Environmental Sciences

The SEAESRT (Space Environmental Expert System Real Time) algorithm to specify satellite anomaly hazards was developed and successfully tested before it was provided to the Space Weather Prediction Center (SWPC). It has not yet been put into operational test mode by SWPC because of other priorities. The National Geophysical Data Center (NGDC) may implement it into their system as a post-analysis tool.

Research, development and testing of five algorithms for Phase 2 of the GOES-R development project were completed and delivered to SWPC. These algorithms will be part of the suite of Level 2+ processing of GOES-R data, providing the forecast center with specification, analysis and forecast tools for the space environment. They include the magnetometer analysis tool, which indicates when a magnetopause crossing has been detected at geostationary altitude by the GOES-R satellite. Two more of the algorithms were developed to use the energetic particle data to determine temperature and density moments and to indicate spacecraft charging levels. The final two algorithms analyze the solar ultraviolet imaging data to produce thematic maps and coronal hole images of the Sun.
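The thematic-map step can be pictured as a per-pixel classification of the solar image. The sketch below is purely illustrative: the theme names, intensity thresholds, and toy "image" are invented for the example and are not values from the delivered SUVI algorithm.

```python
# Illustrative intensity bands (arbitrary units, NOT operational SUVI values).
# Coronal holes appear dark in EUV images, active regions bright.
THEMES = [
    ("off_disk",       0,    5),
    ("coronal_hole",   5,   40),
    ("quiet_sun",     40,  200),
    ("active_region", 200, float("inf")),
]

def classify_pixel(intensity):
    """Assign a theme name to one pixel based on its intensity band."""
    for name, lo, hi in THEMES:
        if lo <= intensity < hi:
            return name
    return "off_disk"

def thematic_map(image):
    """Label every pixel of a 2-D intensity array with a theme name."""
    return [[classify_pixel(px) for px in row] for row in image]

image = [[2, 30, 120],
         [15, 250, 80]]
labels = thematic_map(image)
# labels[1][1] -> "active_region"; labels[0][1] -> "coronal_hole"
```

A coronal hole image then falls out as the mask of pixels labeled `"coronal_hole"`; the real algorithm's classification criteria are those in the Rigler and Hill theoretical basis document, not these thresholds.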

Planning and requirements gathering for Phase 3 of the algorithms development project are underway.
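The density and temperature moments mentioned above can be illustrated with standard plasma relations: for an isotropic, non-relativistic electron population, number density is the flux integral n = 4π ∫ j(E)/v(E) dE, and temperature follows from the mean energy of an assumed Maxwellian, T = (2/3)⟨E⟩. The sketch below applies these textbook formulas with invented channel energies and fluxes; it is not the delivered SEISS algorithm, whose formulation is given in the Rodriguez theoretical basis document.

```python
import math

KEV = 1.602176634e-16   # joules per keV
M_E = 9.1093837015e-31  # electron mass, kg

def trapz(y, x):
    """Trapezoidal integration of samples y over abscissae x."""
    return sum(0.5 * (y[i] + y[i + 1]) * (x[i + 1] - x[i])
               for i in range(len(x) - 1))

def electron_moments(energies_kev, flux):
    """Density (m^-3) and temperature (keV) from isotropic differential
    electron flux j(E) given in cm^-2 s^-1 sr^-1 keV^-1.

        n   = 4*pi * integral j(E)/v(E) dE          (zeroth moment)
        <E> = integral E*j/v dE / integral j/v dE   (mean energy)
        T   = (2/3) <E>   for an assumed isotropic Maxwellian
    """
    e_j = [e * KEV for e in energies_kev]          # channel energies, joules
    j_si = [f * 1e4 / KEV for f in flux]           # to m^-2 s^-1 sr^-1 J^-1
    v = [math.sqrt(2.0 * e / M_E) for e in e_j]    # non-relativistic speed, m/s
    jv = [j / vv for j, vv in zip(j_si, v)]
    n = 4.0 * math.pi * trapz(jv, e_j)
    mean_e = trapz([e * y for e, y in zip(e_j, jv)], e_j) / trapz(jv, e_j)
    return n, (2.0 / 3.0) * mean_e / KEV           # density; temperature in keV

# Invented channel energies (keV) and fluxes, for illustration only.
energies = [1, 2, 5, 10, 20, 50]
flux = [3e6, 2e6, 8e5, 2e5, 3e4, 1e3]
n, t_kev = electron_moments(energies, flux)
print(f"n = {n:.3e} m^-3, T = {t_kev:.2f} keV")
```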

Product: Loto’aniu, TM, and HJ Singer (2011), The GOES-R magnetopause crossing algorithm theoretical basis document, NOAA/NESDIS Technical Publication.

Loto’aniu, TM, L Mayer, and M Berguson (2011), The GOES-R magnetopause crossing algorithm implementation and users’ guide, NOAA/NESDIS Technical Publication.

Loto’aniu, TM, L Mayer, and M Berguson (2011), The GOES-R delivery 2 test plan and results, NOAA/NESDIS Technical Publication.

Rodriguez, J (2011), GOES-R SEISS density and temperature moments and level of spacecraft charging algorithm theoretical basis document, NOAA/NESDIS Technical Document.

Rodriguez, J (2011), GOES-R SEISS density and temperature moments and level of spacecraft charging algorithm implementation and users’ guide, NOAA/NESDIS Technical Document.

Rodriguez, J (2011), GOES-R SEISS density and temperature moments and level of spacecraft charging algorithm test plan and results, NOAA/NESDIS Technical Document.

Rigler, EJ, and SM Hill (2011), SUVI thematic maps algorithm theoretical basis document, NOAA/NESDIS Technical Document.
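As a rough illustration of the magnetopause-crossing detection covered by the Loto’aniu et al. documents above: at quiet times the magnetic field at geostationary orbit points strongly northward, so a simple heuristic flags a crossing when the northward (Hp) component stays below a threshold for several consecutive samples. The threshold and persistence window below are illustrative assumptions, not the operational values.

```python
def detect_crossings(hp_nt, threshold_nt=0.0, persistence=3):
    """Flag candidate magnetopause crossings in a time series of the
    northward (Hp) magnetic-field component, in nT, measured at
    geostationary orbit.

    A crossing is flagged when Hp stays below `threshold_nt` for
    `persistence` consecutive samples; the persistence check guards
    against flagging single noisy samples.
    """
    flags = []
    run = 0
    for i, hp in enumerate(hp_nt):
        run = run + 1 if hp < threshold_nt else 0
        if run == persistence:
            flags.append(i - persistence + 1)  # index where the run began
    return flags

# Quiet field (~100 nT northward) with a simulated crossing interval.
hp = [105, 102, 98, -12, -30, -25, -18, 40, 95, 101]
print(detect_crossings(hp))  # -> [3]
```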

GSD-07 High Per<strong>for</strong>mance<br />

Comput<strong>in</strong>g Systems<br />

FEDERAL LEAD: SCOTT NAHMAN<br />

CIRES LEAD: CRAIG TIERNEY<br />

NOAA Goal 2: Climate

Project Goal: Provide systems research support for high-performance computing (HPC) efforts and assistance to the user community; provide HPC systems communications equipment and software research; and provide research support for high-performance file systems.

92 CIRES Annual Report 2011

Milestone 1. Conduct technical study of latest hardware architectures to support future NOAA procurements.

NOAA has increased investment in new high-performance computing (HPC) systems over the last year to support both weather and hurricane modeling. While investment has increased, research needs continue to outstrip resources. CIRES supported NOAA in the identification and acquisition of two major HPC platforms. Our goal in each case was to evaluate existing technologies and lend our own expertise to the architecture and implementation of these systems to maximize NOAA’s investment. The first system was delivered at the end of summer 2010. This system, tJet, delivered three times the performance of the previous system, making the Earth System Research Laboratory (ESRL) home to the largest NOAA-managed HPC resources.

The second system procured will support weather and other modeling efforts for the entire NOAA program. The system is to be located in Fairmont, W.Va. CIRES assisted in gathering requirements and made architectural recommendations to help NOAA maximize its investment. This system should be operational by the end of 2011.

Milestone 2. Investigate tools to automate the use of Graphics Processing Unit (GPU) co-processors within existing GSD codes.

While the use of HPC continues to grow in importance and expand into other disciplines, users of this technology are starting to run into a performance wall. The number of cores per central processing unit (CPU) socket continues to increase, but the performance of individual cores is not. In addition, the power these systems consume is becoming a significant portion of the total cost of ownership of HPC systems. NOAA’s Global Systems Division (GSD) is looking at new hardware technologies that can provide significant performance improvements while decreasing power consumption. Graphics Processing Units (GPUs) are one technology that appears able to meet this need, but effective use of these devices requires significant changes in programming models.

We have been supporting NOAA in the investigation of commercial and open-source tools that can help automate the conversion of existing code into efficient GPU code. Several vendors (Portland Group, Intel, Cray, CAPS) are taking different approaches to this problem. While most of the technologies are still immature, all are showing promise. Comparisons of code generated by the vendors’ tools against handwritten code show the tools achieving between 50 percent and 90 percent of handwritten efficiency. Much work is still needed to assist the vendors with their programming tools to bring relative performance as close to 100 percent as possible.

Milestone 3. Support investigations of large core-count model scalability in heterogeneous computing environments.

Due to time constraints and changes in NOAA priorities for the high-performance computing acquisition research area, we were not able to address this milestone.
