NASA/CR-2014-218474, Rev 1

Real-time Kennedy Space Center and Cape Canaveral Air Force Station High-resolution Model Implementation and Verification

Jaclyn A. Shafer
ENSCO, Inc., Cocoa Beach, Florida
NASA Applied Meteorology Unit, Kennedy Space Center, Florida

Leela R. Watson
ENSCO, Inc., Cocoa Beach, Florida
NASA Applied Meteorology Unit, Kennedy Space Center, Florida

January 2015

NASA STI Program ... in Profile

Since its founding, NASA has been dedicated to the advancement of aeronautics and space science. The NASA scientific and technical information (STI) program plays a key part in helping NASA maintain this important role.

The NASA STI program operates under the auspices of the Agency Chief Information Officer. It collects, organizes, provides for archiving, and disseminates NASA's STI. The NASA STI program provides access to the NTRS Registered and its public interface, the NASA Technical Reports Server, thus providing one of the largest collections of aeronautical and space science STI in the world. Results are published in both non-NASA channels and by NASA in the NASA STI Report Series, which includes the following report types:

• TECHNICAL PUBLICATION. Reports of completed research or a major significant phase of research that present the results of NASA Programs and include extensive data or theoretical analysis. Includes compilations of significant scientific and technical data and information deemed to be of continuing reference value. NASA counterpart of peer-reviewed formal professional papers but has less stringent limitations on manuscript length and extent of graphic presentations.

• TECHNICAL MEMORANDUM. Scientific and technical findings that are preliminary or of specialized interest, e.g., quick release reports, working papers, and bibliographies that contain minimal annotation. Does not contain extensive analysis.

• CONTRACTOR REPORT. Scientific and technical findings by NASA-sponsored contractors and grantees.

• CONFERENCE PUBLICATION. Collected papers from scientific and technical conferences, symposia, seminars, or other meetings sponsored or co-sponsored by NASA.

• SPECIAL PUBLICATION. Scientific, technical, or historical information from NASA programs, projects, and missions, often concerned with subjects having substantial public interest.

• TECHNICAL TRANSLATION. English-language translations of foreign scientific and technical material pertinent to NASA's mission.

Specialized services also include organizing and publishing research results, distributing specialized research announcements and feeds, providing information desk and personal search support, and enabling data exchange services.

For more information about the NASA STI program, see the following:

• Access the NASA STI program home page at http://www.sti.nasa.gov

• E-mail your question to [email protected]

• Phone the NASA STI Information Desk at 757-864-9658

• Write to:
  NASA STI Information Desk
  Mail Stop 148
  NASA Langley Research Center
  Hampton, VA 23681-2199

National Aeronautics and Space Administration
Kennedy Space Center
Kennedy Space Center, FL 32899-0001

Acknowledgments

The authors would like to thank Mr. Kevin McGrath and Dr. Geoffrey Stano of the Short-term Prediction Research and Transition Center and Mr. Erik Magnuson of ENSCO, Inc.
for lending their time and expertise with the Advanced Weather Interactive Processing System II (AWIPS II) to make the real-time model output available in AWIPS II. Dr. Bill Bauman and Ms. Winnie Crawford also provided valuable feedback and assistance with completing this task.

Available from:
NASA Center for AeroSpace Information
7115 Standard Drive
Hanover, MD 21076-1320
443-757-5802

This report is also available in electronic form at http://science.ksc.nasa.gov/amu

Executive Summary

Customer: NASA's Launch Services Program (LSP), Ground Systems Development and Operations (GSDO), and Space Launch System (SLS) programs

NASA's LSP, GSDO, SLS and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). For example, they use the forecasts to determine whether to limit activities such as vehicle transport to the launch pad, to protect people, structures, or exposed launch vehicles given a threat of severe weather, or to reschedule other critical operations. The 45 WS uses numerical weather prediction models as guidance for these forecasts, particularly the Air Force Weather Agency (AFWA) 1.67-km Weather Research and Forecasting (WRF) model. Given the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting that the AMU verify this model. Because archived model data are not available from AFWA, that verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the AMU high-resolution WRF Environmental Modeling System (EMS) model (Watson 2013) in real time. The tasking group agreed to this proposal; therefore, the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters.
The model was set up with a triple-nested grid configuration over KSC/CCAFS based on previous AMU work (Watson 2013). The outer domain (D01) has 12-km grid spacing, the middle domain (D02) has 4-km grid spacing, and the inner domain (D03) has 1.33-km grid spacing. The model runs a 12-hour forecast every hour; output from D01 and D02 is available hourly, and output from D03 is available every 15 minutes during the forecast period.

The AMU assessed the performance of the WRF-EMS 1.33-km domain for the 2014 warm season (May–September). Verification statistics were computed using the Model Evaluation Tools (MET), which compared the model forecasts to observations. The mean error (ME) values were close to 0 and the root mean square error (RMSE) values were less than 1.8 for mean sea-level pressure (mb), temperature (K), dewpoint temperature (K), and wind speed (ms-1): all very small differences between the forecasts and observations considering the normal magnitudes of these parameters. The precipitation forecast verification results showed consistent under-forecasting of the precipitation object size. This could be an artifact of calculating the statistics for each hour rather than for the entire 12-hour period. The AMU will continue to generate verification statistics for the 1.33-km WRF-EMS domain as data become available in future cool and warm seasons. More data will produce more robust statistics and a more accurate assessment of model performance.

Once the formal task was complete, the AMU conducted additional work to better understand the wind direction results. The results were stratified diurnally and by wind speed to determine what effects these stratifications would have on the model wind direction verification statistics. The results are summarized in the addendum at the end of this report.

In addition to verifying the model's performance, the AMU also made the output available in the Advanced Weather Interactive Processing System II (AWIPS II).
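The MET software computes the pairwise statistics above internally; as a rough illustration of what the ME and RMSE values represent, the following Python sketch computes both from matched forecast/observation pairs. The function names and sample values are illustrative only, not part of MET's interface; the wind-direction helper shows why direction errors must be wrapped at 360 degrees before averaging.

```python
import math

def mean_error(forecasts, observations):
    """ME: average of (forecast - observation); values near 0 indicate little bias."""
    diffs = [f - o for f, o in zip(forecasts, observations)]
    return sum(diffs) / len(diffs)

def rmse(forecasts, observations):
    """RMSE: root mean square error; penalizes large misses more heavily than ME."""
    squared = [(f - o) ** 2 for f, o in zip(forecasts, observations)]
    return math.sqrt(sum(squared) / len(squared))

def wind_dir_error(fcst_dir, obs_dir):
    """Signed wind direction error wrapped into (-180, 180] degrees:
    the difference between 350 and 10 degrees is -20, not 340."""
    d = (fcst_dir - obs_dir) % 360.0
    return d - 360.0 if d > 180.0 else d

# Illustrative 2-m temperature (K) pairs at one station over four forecast hours
fcst = [299.1, 300.2, 301.0, 300.5]
obs = [299.0, 300.6, 300.4, 300.9]
print(mean_error(fcst, obs))   # small negative value: slight cool bias
print(rmse(fcst, obs))         # typical error magnitude, in K
print(wind_dir_error(350.0, 10.0))
```

Because ME can hide compensating errors of opposite sign, ME near 0 together with a small RMSE (as reported above for the warm season) is the stronger indication of good model performance.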
This allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations AWIPS II client computers and conduct real-time subjective analyses.

In the future, the AMU will implement an updated version of the WRF-EMS model that incorporates local data assimilation. This model will also run in real time and be made available in AWIPS II.

Table of Contents

Executive Summary
Table of Contents
List of Figures
1 Introduction
2 Model Installation and Configuration
3 Model Forecast Verification
  Observational Data
    3.1.1 MADIS
    3.1.2 Stage IV
  Verification Software
    3.1.3 Point-Stat Tool
    3.1.4 MODE Tool
  Warm Season Verification Results
    3.1.5 Surface Parameters
    3.1.6 Precipitation
    3.1.7 Summary of Results
4 WRF-EMS Output into AWIPS II
  CAVE Examples
  Procedures to Display WRF-EMS Output
5 Summary and Future Work
References
List of Acronyms
Addendum
  A.1 Diurnal Stratification
  A.2 Speed Stratification
  A.3 Continuing and Future Work

List of Figures

Figure 1.
Map of the eastern United States showing the boundaries of each domain. The outer domain (cyan rectangle, D01) has 12-km grid spacing, the middle domain (green rectangle, D02) has 4-km grid spacing, and the inner domain (yellow rectangle, D03) has 1.33-km grid spacing. The AMU calculated verification statistics for the inner domain, D03.
Figure 2. Map of the mesonet (blue squares) and METAR (blue squares with red circles) weather station locations from MADIS. The AMU used the data from these sites to calculate verification statistics for the inner WRF-EMS domain.
Figure 3. The ME for each parameter versus forecast hour. Surface pressure is in mb (blue dots), temperature in K (green dots), dewpoint temperature in K (red dots) and wind speed in ms-1 (purple dots).
Figure 4. The ME for wind direction in degrees versus forecast hour.
Figure 5. Same as Figure 3 but for RMSE.
Figure 6. Same as Figure 4 but for RMSE.
Figure 7. Illustration of the technique used in the MODE tool to define precipitation objects: a) raw gridded precipitation data, b) smoothed data, c) convolved field, d) final field of objects used in verification statistics (from the MET users guide, Figure 8-1).
Figure 8. Centroid distance versus model forecast hour for the warm season model verification. Centroid distances are in number of grid boxes.
Figure 9. Area ratio versus model forecast hour for the warm season model verification.
Figure 10. Interest value versus model forecast hour for the warm season model verification.
Figure 11. CAVE screen shot of the AMU WRF-EMS 12-km frontogenesis output valid at 1900 UTC on 10 July 2014. The warm colors are frontogenesis and the cool colors are frontolysis.
Figure 12. Same as Figure 11 but for the 4-km domain.
Figure 13. Same as Figure 11 but for the 1.33-km domain.
Figure 14. CAVE window highlighting how the user selects the Volume Browser to display model data.
Figure 15. Volume Browser window in CAVE. The AMU WRF-EMS is selected by clicking the Volume drop-down menu under Sources.
Figure 16. A partial list of available models from the Volume drop-down showing the three AMU WRF-EMS domains at the top of the list.
Figure 17. Volume Browser showing the selection of AMU WRF-EMS parameters and associated available products.
Figure 18. AMU WRF-EMS 12-km forecast surface wind barbs (green).
Figure 19. Zoomed-in image of the product details shown in Figure 18. The red box outlines the product name, the blue box is the date and model run time (Z), the purple box is the forecast hour and the green box is the valid date and time.
Figure 20. Wind speed (ms-1) RMSE versus model forecast hour stratified diurnally. All values are in blue, Day values (0600–1759 EDT) are in red and Night values (1800–0559 EDT) are in green.
Figure 21. Same as Figure 20 but for wind direction (degrees).
Figure 22. Wind speed (ms-1) RMSE versus model forecast hour stratified by wind speed. The All Speeds category is in purple, <5 is in blue and 5–10 is in red.
Figure 23. Same as Figure 22 but for wind direction (degrees).

List of Tables

Table 1. List of statistics available in the MODE tool that the AMU used to verify the model.
Table 2. List of model forecast wind speed sample sizes for each category per model forecast hour.

1 Introduction

NASA's Launch Services Program, Ground Systems Development and Operations, Space Launch System and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). Examples include determining whether to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as guidance for these forecasts, particularly the Air Force Weather Agency (AFWA) 1.67-km Weather Research and Forecasting (WRF) model.
Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting that the AMU verify this model. Because archived model data are not available from AFWA, that verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the high-resolution WRF Environmental Modeling System (EMS) model configured by the AMU (Watson 2013) in real time. Implementing a real-time version of the ER WRF-EMS would generate a larger database of model output for determining model performance than was available in the previous AMU task, and would give the AMU more control over, and access to, the model output archive. The tasking group agreed to this proposal; therefore, the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The AMU also calculated verification statistics to determine model performance compared to observational data. Finally, the AMU made the model output available on the AMU Advanced Weather Interactive Processing System II (AWIPS II) servers, which allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations (RWO) AWIPS II client computers and conduct real-time subjective analyses.