NOAA Webinar on Proactive Quality Control Based on Ensemble Forecast Sensitivity to Observation (EFSO)

NOAA is hosting a webinar on Proactive Quality Control based on Ensemble Forecast Sensitivity to Observation (EFSO). It is an important topic on just how sensitive models are to the observations used as their initial conditions. In the talk, Daisuke Hotta discusses how flawed observations that slip past quality control can seriously degrade a model's performance.


Speaker: Daisuke Hotta (UMD-JMA)

Date/Time: July 16, 2014 at noon

Place: NCWCP Conference Center

Go To Meeting: TBD

Abstract: Despite recent major improvements in numerical weather prediction (NWP) systems, operational NWP forecasts occasionally suffer from an abrupt drop in forecast skill, a phenomenon called “forecast skill dropout.” Recent studies have shown that the “dropouts” occur not because of the model’s deficiencies but by the use of flawed observations that the operational quality control (QC) system failed to filter out. Thus, to minimize the occurrences of forecast skill dropouts, we need to detect and remove such flawed observations. A diagnostic technique called Ensemble Forecast Sensitivity to Observation (EFSO) enables us to quantify how much each observation has improved or degraded the forecast. A recent study (Ota et al., 2013) has shown that it is possible to detect flawed observations that caused regional forecast skill dropouts by using EFSO with 24-hour lead time and that the forecast can be improved by not assimilating the detected observations. Inspired by their success, in the first part of this study, we propose a new QC method, which we call Proactive QC (PQC), in which flawed observations are detected 6 hours after the analysis by EFSO and then the analysis and forecast are repeated without using the detected observations. This new QC technique is implemented and tested on a lower-resolution version of NCEP’s operational global NWP system. The results we obtained are extremely promising; we have found that we can detect regional forecast skill dropouts and the flawed observations only 6 hours after the analysis and that the rejection of the identified flawed observations indeed improves 24-hour forecasts.

In the second part, we show that the same approximation used in the derivation of EFSO can be used to formulate the forecast sensitivity to the observation error covariance matrix R, which we call EFSR. We implement the EFSR diagnostics in both an idealized system and the quasi-operational NWP system and show that it can be used to tune the R matrix so that the utility of observations is improved. We also point out that EFSO and EFSR can be used for the optimal assimilation of new observing systems.
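The per-observation impact estimate at the heart of EFSO and PQC can be sketched with synthetic arrays. The sketch below follows the standard ensemble sensitivity expression (Kalnay et al., 2012); all array names, sizes, and the rejection threshold are illustrative assumptions, not values from the operational system described in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

K = 20   # ensemble size
n = 50   # model state dimension
p = 8    # number of observations

# Synthetic stand-ins for quantities produced by an ensemble DA cycle:
Ya = rng.standard_normal((p, K))   # analysis perturbations in observation space
Xf = rng.standard_normal((n, K))   # forecast perturbations valid at evaluation time t
dy = rng.standard_normal(p)        # innovations y - H(x_bar_b)
R_inv = np.eye(p)                  # inverse observation-error covariance (unit variances)
C = np.eye(n)                      # error norm (e.g. moist total energy in practice)

# Forecast errors against the verifying analysis, from the 0 h analysis
# and from the previous (-6 h) cycle:
e_t0 = rng.standard_normal(n)
e_tm6 = rng.standard_normal(n)

# EFSO impact of each observation:
#   dJ_l ~ (1/(K-1)) * dy_l * [R^{-1} Ya Xf^T C (e_t0 + e_tm6)]_l
# Negative dJ_l => the observation reduced forecast error (beneficial).
impacts = dy * (R_inv @ Ya @ Xf.T @ C @ (e_t0 + e_tm6)) / (K - 1)

# Proactive QC sketch: flag observations whose estimated impact is
# strongly detrimental (hypothetical one-standard-deviation threshold).
threshold = impacts.std()
flagged = np.where(impacts > threshold)[0]
print("per-observation impacts:", np.round(impacts, 3))
print("flagged for rejection:", flagged)
```

In a PQC cycle, the analysis and forecast would then be rerun with the flagged observations withheld, which is the step the abstract reports improves the 24-hour forecast.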


About Chuck Schoeneberger

Former forecaster at Meridian Environmental Technologies Inc (now an Interis Company), with a background in GIS and LiDAR, and with other stints at GeoSpatial Services of Winona, MN and Aerometric (now Quantum Spatial) of Sheboygan, WI. He is a weather technologist for public storm safety from the local to the international level. His views are his own and not those of his employer.
