NOAA Goes Live With New Forecasting Supercomputers 53

dcblogs writes "The National Oceanic and Atmospheric Administration (NOAA) Thursday switched on two new supercomputers that are expected to improve weather forecasting. The supercomputers are each 213-teraflops systems, running a Linux operating system on Intel processors. The U.S. is paying about $20 million a year to operate the leased systems. The NWS has a new hurricane model, Hurricane Weather Research and Forecasting (HWRF), which is 15% more accurate at day five of a forecast, for both track and intensity. That model is now operational and running on the new systems. In nine months, NWS expects to improve the resolution of the system from 27 kilometers to 13 kilometers. The European system, credited with doing a better job at predicting Sandy's path, is at 16 kilometers resolution. In June, the European forecasting agency said it had a deal to buy Cray systems capable of petascale performance."
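
As a rough, back-of-the-envelope illustration of why a resolution upgrade needs this much hardware (my own numbers, not from the article): halving the horizontal grid spacing roughly quadruples the number of grid columns, and the time step usually has to shrink along with the spacing, so going from 27 km to 13 km implies close to an order of magnitude more work per forecast. A quick Python sketch, assuming a simple columns-times-timesteps cost model:

# Rough cost scaling for a grid-spacing upgrade (illustrative only).
# Assumes cost ~ (number of columns) x (number of time steps), with the
# time step shrinking in proportion to the grid spacing (a CFL-type limit).
old_dx_km = 27.0
new_dx_km = 13.0

column_factor = (old_dx_km / new_dx_km) ** 2   # more points in both x and y
timestep_factor = old_dx_km / new_dx_km        # proportionally shorter time step
cost_factor = column_factor * timestep_factor

print(f"~{column_factor:.1f}x more grid columns")
print(f"~{cost_factor:.1f}x more total work per forecast")
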
  • by Overzeetop ( 214511 ) on Friday July 26, 2013 @07:11AM (#44389807) Journal

    I suspect that if they removed the computers and installed windows in the offices of our local TV meteorologists, we would get better short-term forecasts. I've also decided that any precipitation forecast more than about 3-4 days out that doesn't involve a system as large as a hurricane is just a wild-ass guess*. Heck, even in real time they're often wrong; the local guys are fond of reporting sunny all day while I'm actually looking outside at the rain.

    *Well, unless you're in SoCal Mar-Dec, in which case "Sunny" is always the statistically correct answer, or Orlando/Daytona, where "It will rain at 3:45pm for 5 minutes" is always the statistically correct answer.

    • by Bud Light Lime ( 2796025 ) on Friday July 26, 2013 @07:23AM (#44389847)
      Larger atmospheric features such as air masses and mid-latitude cyclones are more predictable than smaller features. Thunderstorms are much smaller and less predictable. Also, thunderstorms are driven by instability in the atmosphere: if air is nudged upward, it will accelerate upward. This occurs when warm (or hot) moist air lies beneath cold air aloft. If there's a lot of cloud cover left over from thunderstorms the previous day, for example, that makes predicting thunderstorm chances the next day much more difficult. Predicting the behavior of large air masses is done with much more skill than predicting smaller features such as thunderstorms.
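
      As a minimal sketch of the "nudged upward, it will accelerate" point (my own illustration, not part of the comment above): the upward acceleration of an air parcel can be approximated from the temperature difference between the parcel and its surroundings, ignoring entrainment and water loading.

      # Toy parcel-buoyancy estimate: a parcel warmer than its environment
      # accelerates upward at roughly g * (T_parcel - T_env) / T_env.
      G = 9.81  # gravitational acceleration, m/s^2

      def buoyant_acceleration(t_parcel_k, t_env_k):
          """Approximate upward acceleration (m/s^2) of an air parcel."""
          return G * (t_parcel_k - t_env_k) / t_env_k

      # A parcel 2 K warmer than its surroundings at the same level:
      print(buoyant_acceleration(302.0, 300.0))  # ~0.065 m/s^2; sustained
      # over many minutes, that is enough to build a strong updraft.
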
      • by Anonymous Coward
        In other words, it seems straightforward to track a feature that already exists, and while tropical storms can last days to weeks, many thunderstorms appear and dissipate within hours. Figuring out whether the conditions are favorable for storm formation is a lot different from figuring out exactly where a storm will form.
      • Now they just need to lob a couple of satellites up in the sky to gather the data for this computer, and we'll be set.

    • by 93 Escort Wagon ( 326346 ) on Friday July 26, 2013 @08:35AM (#44390297)

      In the US anyway, TV weather persons don't do their own forecasts - they rely on their local National Weather Service reports. And those are based on actual meteorologists looking at what the US's GFS and the European ECMWF say, and combining that with their own experience regarding what the models tend to get right or wrong locally.

      UW professor Cliff Mass has written many times about the problems with US weather prediction [blogspot.com]. The computers they rely on are old and less powerful than the European ones, and the problem is exacerbated by the US NWS having a broader mandate: the computers they do have are also used to run several other types of simulations in addition to the standard weather models.

      This purchase is definitely good news.

    • Why not just hire Al Sleet?
      "Tonight's forecast: Dark. Continued dark throughout most of the evening, with some widely scattered light towards morning."

    • by PPH ( 736903 )
      Or Chrome. There's a good weather app for Chrome.
    • Do you know what the difference is between climate and weather? Climate is what you expect, weather is what you get.
    • The weather guys around here are 60% right 90% of the time
  • by Bud Light Lime ( 2796025 ) on Friday July 26, 2013 @07:16AM (#44389821)
    HWRF runs at a much finer grid spacing than 27 or 13 kilometers. As I recall, the grid spacing is around 3 km in the inner nest. This is done to explicitly simulate the convection at the inner core of a tropical cyclone. This nest moves with the storm, and is embedded within a much larger domain.

    The upgrade from 27 to 13 kilometers actually refers to the GFS model. It's a spectral model that has a global domain. Other models that are regional (including the outer domain of the HWRF) need to know the conditions at their lateral boundaries, so they know what's moving into the domain. In the US, they typically use the GFS for their boundary conditions.

    I'm actually very skeptical of the need for upgrading the resolution of the GFS. That may have a role in improving GFS forecasts, but there have been studies showing that the initial conditions of the GFS are the real problem. The atmosphere is a chaotic system; that is, two similar initial states will diverge over time to produce two very different outcomes. In a study where the GFS was initialized with ECMWF initial conditions, the performance of the GFS improved. Hurricanes are typically steered by large scale features, which aren't necessarily going to be simulated better by using a finer resolution. It also doesn't address the initial conditions problem. I'm in favor of throwing more computing power at meteorology, but I'm not convinced it will solve the problems with the GFS.
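
    The "two similar initial states will diverge" point is the classic sensitivity to initial conditions. A minimal sketch using the Lorenz-63 toy system (my own illustration; it has nothing to do with the actual GFS code):

    # Two nearly identical initial states in the Lorenz-63 system diverge,
    # illustrating why small initial-condition errors limit forecast skill.
    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """Advance the Lorenz-63 system one forward-Euler step."""
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dx * dt, y + dy * dt, z + dz * dt)

    a = (1.0, 1.0, 1.0)
    b = (1.0, 1.0, 1.000001)        # perturbed by one part in a million
    for step in range(2001):        # about 20 model time units
        if step % 500 == 0:
            err = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
            print(f"step {step:4d}  separation {err:.6f}")
        a, b = lorenz_step(a), lorenz_step(b)
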
    • by hwrfboy ( 2997951 ) on Friday July 26, 2013 @08:26AM (#44390203)

      I'm actually an HWRF developer and you are correct that the summary was wrong. Our innermost domain is 3 km, at a size of around 600x600 km, intended to resolve the storm's inner core region (the area with the dangerous winds and, typically, largest rainfall). It is within a larger 1100x1100 km, 9 km resolution domain, for resolving the nearby environment, and there is a gigantic 7500x7500 km, 27 km resolution domain to resolve large-scale systems that drive the track.

      Also, the 3 km resolution is not just needed to resolve convection: you need it to resolve some of the processes involved in intensity change and in concentration of the wind maximum, such as double eyewalls, mesovortices, hot towers, and vorticity sheets.

      The GFS is our boundary condition, and part of our initial condition. We tried using ECMWF instead as an experiment, but that gave mixed results on track, and worse intensity. The intensity issues are likely due to that model's lack of skill at intensity prediction and its primitive ocean model. (GFS has better hurricane intensity than ECMWF, despite having lower resolution!) ECMWF also has completely different physics and dynamics than ours, which results in larger shocks at the boundary.
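
      To put those nest sizes in perspective, a quick arithmetic sketch (using only the dimensions quoted above; the real operational grids are defined on a rotated projection with moving nests, so these are just round numbers):

      # Approximate point counts for the HWRF nests described above.
      domains = [
          ("outer",        7500, 27),   # km extent, km grid spacing
          ("intermediate", 1100,  9),
          ("inner core",    600,  3),
      ]
      for name, size_km, dx_km in domains:
          n = round(size_km / dx_km)
          print(f"{name:12s} ~{n} x {n} points at {dx_km} km spacing")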

      You can see a better description of our model on our website:

      http://www.emc.ncep.noaa.gov/index.php?branch=HWRF [noaa.gov]

      and if you're interested in running HWRF yourself, you can do that too, though it will be another week or two before the new 2013 version is publicly available. HWRF is an open-source model, put out by the NOAA Developmental Testbed Center (DTC), which handles the public distribution and community support. (Support of HWRF installations in other countries' forecast centers is generally handled through the NOAA Environmental Modeling Center (EMC).) Here is the webpage for user support and downloads:

      http://www.dtcenter.org/HurrWRF/users/overview/hwrf_overview.php [dtcenter.org]

      As for your point about improved resolution not helping the GFS, that's not true, especially in the case of hurricanes. The resolution of the GFS (~27km) is so low that it cannot even resolve the structure of most storms, let alone see the complex features involved in predicting intensity, rainfall or the finer points of track. When it can resolve the storm, such as with Superstorm Sandy, it has intensity skill competitive with regional models. The upcoming GFS upgrades to 13km and later 9km resolution (~2-4 years away) will allow the model to get a good idea of the basic structure of the storm, and start having real skill at predicting intensity, even for smaller storms. That, in turn, will help the HWRF and GFDL regional hurricane models improve their track and intensity prediction since they both rely on GFS for initial and boundary conditions.
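
      A rough way to see the "cannot even resolve the structure" point (my own arithmetic, not an official figure): count how many grid points span a storm's inner core at each grid spacing. A feature usually needs several grid points across it before a model can represent it at all.

      # Grid points spanning a ~100 km hurricane inner core at the grid
      # spacings discussed above (illustrative arithmetic only).
      core_width_km = 100
      for dx_km in (27, 13, 9, 3):
          points = core_width_km / dx_km
          print(f"{dx_km:2d} km spacing: ~{points:.1f} points across the core")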

      • Re: (Score:3, Interesting)

        Your point about the GFS is well-taken, but at present, I'd never use the GFS or ECMWF to forecast hurricane intensity. Yes, there's value in increasing the resolution of the model. But there's also a need to improve the data assimilation to produce better initial conditions (see this NOAA white paper [noaa.gov]). The conclusion was that there are gains to be made by bumping up the resolution, but that's only one of the recommended approaches to improving the GFS. Others included better data assimilation and improving parameterizations.

        • Your point about the GFS is well-taken, but at present, I'd never use the GFS or ECMWF to forecast hurricane intensity. Yes, there's value in increasing the resolution of the model. But there's also a need to improve the data assimilation to produce better initial conditions (see this NOAA white paper [noaa.gov]). The conclusion was that there are gains to be made by bumping up the resolution, but that's only one of the recommended approaches to improving the GFS. Others included better data assimilation and improving parameterizations. Much of what the public hears has been focused on the resolution of the model. Yes, it does matter, but there are other considerations that are at least equally important.

          Much of the criticism of the GFS with respect to Sandy has focused on the track forecast several days out. While increasing the resolution of the model could provide some improvement to the track forecasts, I would expect better initialization to have a larger role, especially at that forecast range. I'd believe bumping up the resolution would provide much better gains in the area of forecasting intensity.

          Actually, I agree that the main problem is the initial state, and the experiences with HWRF upgrades back that hypothesis. Contrary to what the (completely wrong) article summary states, there was absolutely no HWRF resolution upgrade this year. The upgrades were all data assimilation, model physics and model dynamics. (Everything except resolution and ocean.) Our upgrade to 3km resolution was last year, in May 2012. While last year's resolution upgrade did help, it had nowhere near the impact of this
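
          For readers wondering what "data assimilation" means concretely: in its simplest one-variable form (a toy sketch of my own, far simpler than the operational hybrid system), the analysis blends the model's prior forecast with an observation, weighting each by its error variance.

          # Toy scalar data assimilation: blend a model background with an
          # observation, each weighted by the inverse of its error variance.
          def analysis(background, obs, bg_var, obs_var):
              gain = bg_var / (bg_var + obs_var)   # how much to trust the obs
              return background + gain * (obs - background)

          # Background forecast says 1008 hPa; a dropsonde observes 1002 hPa:
          print(analysis(background=1008.0, obs=1002.0, bg_var=4.0, obs_var=1.0))
          # -> 1003.2 hPa, pulled toward the (more trusted) observation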

  • by Anonymous Coward

    When the machines take over, they'll be led by a core of weather-predicting super computers.

  • by account_deleted ( 4530225 ) on Friday July 26, 2013 @07:41AM (#44389905)
    Comment removed based on user account deletion
  • Is their raw data available? I'd like a crack at it with my desktop computer.

    • Re:Beaches (Score:5, Informative)

      by hwrfboy ( 2997951 ) on Friday July 26, 2013 @08:43AM (#44390355)

      The summary is confusing two different models: HWRF and GFS. The HWRF model is a public model you can download and run, as long as you have ~20 GB of RAM free on your computer:

      http://www.dtcenter.org/HurrWRF/users/overview/hwrf_overview.php [dtcenter.org]

      There is a public version of the GFS, but I'm not sure where. I'm mainly an HWRF developer.

      Also, you can download GFS and HWRF forecasts in real time (i.e., files available less than 10 minutes after they're created by the operational NCEP WCOSS supercomputer) here:

      GFS: ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.*/ [noaa.gov]

      You want the files named gfs.t??z.pgrb2f* - those are the forecast files every 1-6 hours at 0.5 degree resolution.
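
      If you want to script that, here is a minimal sketch using only the Python standard library (the host, directory layout and file names are taken from the comment above and may have changed since; everything else is my own assumption):

      # Grab one 0.5-degree GFS forecast GRIB2 file from the newest cycle
      # directory on the NCEP FTP server described above.
      import fnmatch
      import os
      from ftplib import FTP

      ftp = FTP("ftpprd.ncep.noaa.gov")
      ftp.login()                                    # anonymous login
      ftp.cwd("/pub/data/nccf/com/gfs/prod")
      cycles = sorted(d for d in ftp.nlst() if "gfs." in d)
      ftp.cwd(cycles[-1])                            # newest gfs.* cycle

      grib_files = fnmatch.filter(ftp.nlst(), "*gfs.t??z.pgrb2f*")
      local_name = os.path.basename(grib_files[0])
      with open(local_name, "wb") as out:
          ftp.retrbinary("RETR " + grib_files[0], out.write)
      ftp.quit()
      # The downloaded GRIB2 file can then be read with a tool such as
      # wgrib2 or a library such as pygrib.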

      The HWRF real-time data is here:

      HWRF: ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/hur/prod/hwrf.*/ [noaa.gov]

      The *.hwrfprs_* files contain model fields:
      - *prs_n*: the 3km domain
      - *prs_m*: combined 9 & 3km
      - *prs_p*: the 27km domain
      - *prs_i*: the 9km domain
      - *prs_c*: combined 27:9:3km
      The track files are *.atcfunix for six-hourly, *.3hourly for three-hourly, and *.htcf for experimental per-timestep (5 second) information.

      You can also get archived track files from a three season retrospective test of the GFS and various HWRF configurations here:

      http://www.emc.ncep.noaa.gov/HWRF/tracks/ [noaa.gov]

      Formats of the track files contained within are described well on JTWC's website (the equivalent of the NHC for everything not near mainland US):

      http://www.usno.navy.mil/NOOC/nmfc-ph/RSS/jtwc/best_tracks/

    • Yes, I believe you _can_ get a lot of the raw data via NOAA's site; I believe it's free for US citizens, since we already paid for it by funding NOAA. Maybe you can use Google's new algorithm for detecting 100,000 features in an image on a single computer to make the model run on your desktop in usable time - remember, as one of the NOAA sites quotes a scientist: if your model doesn't run faster than real time, you might as well just swivel your chair and look out the window (assuming I'm not being
  • by account_deleted ( 4530225 ) on Friday July 26, 2013 @07:52AM (#44389957)
    Comment removed based on user account deletion
    • "The system uses so much power that its emissions directly influence the weather on all continents and mars."

      The best way to predict the weather is to control it, obviously.

      • Re:power (Score:4, Informative)

        by hwrfboy ( 2997951 ) on Friday July 26, 2013 @09:56AM (#44391051)

        The best way to predict the weather is to control it, obviously.

        Actually, that was attempted, and abandoned for diplomatic reasons. NOAA tried cloud seeding experiments in the 1960s-1970s, attempting to weaken or destroy tropical cyclones while they were out to sea. Unfortunately, the experiments usually failed, and occasionally the surviving hurricanes made landfall and did significant damage. When that happened, some countries suspected that the US was doing this secretly to develop weather weapons, so the project was shut down in the early 1980s to avoid the resulting public outcry and diplomatic incidents. Why should Congress keep funding a failed experiment that causes diplomatic problems? You can read about this here:

        http://www.aoml.noaa.gov/hrd/hrd_sub/stormfury_era.html [noaa.gov]

        and Wikipedia has a good page with a lot more information:

        http://en.wikipedia.org/wiki/Project_Stormfury [wikipedia.org]

        On a positive note, the project contributed to the formation of the present-day AOML Hurricane Research Division, which now has the invaluable Hurricane Hunter aircraft, as well as some hurricane modeling experts. They have contributed a lot in the past few years to calibrating the HWRF model physics and dynamics against observations.

  • by Anonymous Coward on Friday July 26, 2013 @08:36AM (#44390307)

    Please use American measurements, like rods and furlongs for distance.

    • Please use American measurements, like rods and furlongs for distance.

      You will be happy to hear that we do. The official track forecast products for the US, including HWRF and GFS, put out intensity in knots and distances in nautical miles. The only SI unit is the pressure, for which we use millibars. This format predates most of our developers and originates back in the days of punch card machines. In fact, a set of hurricane track files is still known as a "deck file" (referring to decks of punch cards), with the numerical guidance being the "A Deck" and the best track (
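
      For anyone who wants to read those files programmatically, here is a minimal sketch of parsing one comma-separated "A Deck" line (field positions follow the ATCF layout documented on the JTWC page linked elsewhere in this discussion; the sample line is made up for illustration):

      # Parse one ATCF a-deck track line: comma-separated fields, latitude
      # and longitude in tenths of a degree with N/S/E/W suffixes, maximum
      # wind in knots, central pressure in millibars.
      def parse_adeck_line(line):
          f = [p.strip() for p in line.split(",")]
          lat = int(f[6][:-1]) / 10.0 * (1 if f[6].endswith("N") else -1)
          lon = int(f[7][:-1]) / 10.0 * (1 if f[7].endswith("E") else -1)
          return {
              "basin": f[0], "storm": f[1], "cycle": f[2], "model": f[4],
              "fcst_hour": int(f[5]), "lat": lat, "lon": lon,
              "vmax_kt": int(f[8]), "mslp_mb": int(f[9]),
          }

      # Hypothetical sample line (not real forecast data):
      sample = "AL, 09, 2013072600, 03, HWRF, 024, 266N, 0789W, 45, 1002, TS"
      print(parse_adeck_line(sample))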

  • Supercomputers need data. The forecast for US weather satellites is partly cloudy.

    Turbulence Ahead for Weather Satellites [nationalgeographic.com]

    JPSS [noaa.gov]

  • This won't last (Score:1, Offtopic)

    by ulatekh ( 775985 )

    These weather-predicting supercomputers will be shut down by the politicians as soon as they calculate that climate change has nothing to do with human activity, and everything to do with that massive hydrogen-fusion reactor only 93 million miles away.

  • Someone is making a killing, I think. The purchase cost of these computers should be under $30M total, and less than $3M/year to run.

    • Re: (Score:3, Informative)

      by hwrfboy ( 2997951 )

      Actually, the high cost per year is because there are several stages of planned upgrades, intended to support the steady increase in resolution and data assimilation capacity of the various models. (Including a massive GFS upgrade next year.) The project, from the NCEP side at least, was completed five weeks early and under budget. The estimated savings, from shutting down the old overpriced Power6/AIX CCS cluster early, is about $1 million, and the switch to Intel/Linux will save taxpayer dollars in the
