AEROSAT International Satellite Aerosol Science Network

Working Group on Intercomparisons


Objective


To coordinate complementary activities for intercomparing satellite aerosol datasets:

  • Identify gaps and opportunities for intercomparisons of aerosol satellite data (among themselves and with model datasets), which can add value to the understanding and appropriate use of the satellite datasets
  • Liaise closely with the AERO-SAT WG on uncertainties to support the gap analysis
  • Optimize communication to user communities on the reasons for differences between satellite aerosol datasets and on what intercomparisons can teach us about their overall uncertainties


Participants


  • Thomas Popp, DLR
  • Ralph Kahn, NASA
  • Jan Griesfeller, Michael Schulz, MetNo
  • Stefan Kinne, MPI-Met
  • Yong Xue, London Metropolitan University
  • Gerrit de Leeuw, FMI and Helsinki University
  • Dave Winker, NASA


Extensive validation has been done for most current satellite aerosol datasets, but for every new dataset collection (i.e. global and covering several years of data), a comprehensive validation (against ground reference measurements) and intercomparison (with other satellite and model datasets) needs to be conducted to understand its strengths and limitations. A new dataset means either an incrementally new version of an existing dataset (upgraded algorithm, new algorithm) or a dataset from an entirely new instrument. Organizing this validation and intercomparison is the foremost responsibility of the respective data provider, whereas conducting the validation can, and probably should, be performed by both the instrument team and independent validation experts or user teams.


When it comes to broader intercomparisons of several datasets, AERO-SAT can contribute by

  • Identifying gaps in intercomparisons (satellite datasets, reference datasets)
  • Identifying necessary studies (intercomparisons, algorithm experiments) based on the outcome of the AERO-SAT uncertainties WG
  • Agreeing on metrics and tools for such intercomparisons
  • Proposing additional validation / intercomparison studies to fill identified gaps
  • Identifying relevant algorithm experiments to study the impact of specific algorithm / module changes (e.g. common cloud masking)


Within ESA’s Aerosol_cci project, the AEROCOM model evaluation tools and the “Kinne scoring” (evaluating spatial and temporal correlations) for daily gridded satellite datasets were chosen for model-user-oriented dataset validation, in addition to direct level-2 validation; in all cases AERONET was used as the reference source. Limitations in AERONET’s coverage for spectral AOD may be overcome by agreeing on a quasi-reference, e.g. a “best” satellite dataset over ocean or in remote regions. Limitations of AERONET particle type retrievals might be mitigated by field campaign data coincident with satellite observations, and by comparison with instruments that provide some constraints on particle type, such as POLDER and MISR.
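For concreteness, the sketch below shows one way such a correlation-based score could be computed for daily gridded AOD fields. It is an illustration only, not the AEROCOM or Aerosol_cci implementation; the function name, the (time, lat, lon) array layout and the minimum-sampling cut-off are assumptions.

import numpy as np

def correlation_scores(sat_aod, ref_aod, min_samples=10):
    """Illustrative spatial/temporal correlation scores for daily gridded AOD.

    sat_aod, ref_aod: arrays of shape (time, lat, lon) on a common daily
    grid; NaN marks missing cells. Not the official Kinne scoring code.
    """
    valid = ~np.isnan(sat_aod) & ~np.isnan(ref_aod)

    # Temporal component: correlate the two time series in each grid cell
    # that has enough coincident days, then average over cells.
    t_corr = []
    for i in range(sat_aod.shape[1]):
        for j in range(sat_aod.shape[2]):
            m = valid[:, i, j]
            if m.sum() >= min_samples:
                t_corr.append(np.corrcoef(sat_aod[m, i, j], ref_aod[m, i, j])[0, 1])

    # Spatial component: correlate the two maps on each day, then average over days.
    s_corr = []
    for t in range(sat_aod.shape[0]):
        m = valid[t]
        if m.sum() >= min_samples:
            s_corr.append(np.corrcoef(sat_aod[t][m], ref_aod[t][m])[0, 1])

    spatial = float(np.mean(s_corr)) if s_corr else float("nan")
    temporal = float(np.mean(t_corr)) if t_corr else float("nan")
    return spatial, temporal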


The link to the GCOS requirements for accuracy is not always straightforward, and it could be analysed to what extent validation could be optimized to provide exactly the same quantities as used in the GCOS satellite supplement (e.g. a combined absolute + relative accuracy threshold).
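As an illustration of how validation statistics could be expressed directly in such terms, the sketch below computes the fraction of matched satellite/reference pairs whose error falls within a combined absolute + relative envelope. The function name and the threshold values (0.03 absolute, 10% relative) are placeholders chosen for the example, not an authoritative statement of the GCOS requirement.

import numpy as np

def fraction_within_envelope(sat_aod, ref_aod, abs_thresh=0.03, rel_thresh=0.10):
    """Fraction of matched retrievals whose error lies within
    max(abs_thresh, rel_thresh * reference AOD).

    Threshold values are illustrative placeholders, not the GCOS numbers.
    """
    sat_aod = np.asarray(sat_aod, dtype=float)
    ref_aod = np.asarray(ref_aod, dtype=float)
    # Combined absolute-or-relative accuracy envelope per matched pair.
    envelope = np.maximum(abs_thresh, rel_thresh * ref_aod)
    return float(np.mean(np.abs(sat_aod - ref_aod) <= envelope))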


Activities & Schedule


WG duration: Three years. At the end of year 1 the WG shall review its gap analysis and recommend concrete activities for years 2 and 3.

 

Year #1

  • Gap analysis (WG members) and proposal of additional intercomparison studies (identify upcoming entirely new datasets, list existing comparisons)
  • Identify planned, funded activities that could be linked, and interact with them to ensure maximum complementarity
  • Define appropriate metrics and tools, focusing on the response to GCOS
  • Seek additional funding for open highest-priority studies, with the advice of AERO-SAT


Year #2 + #3 (depending on results of year #1)

  • Integrate results from complementary intercomparisons and algorithm experiments responding to some of the identified gaps
  • Conduct additional studies if funding is secured (link outcomes from existing projects or add new funded activities)
  • Update the gap analysis based on available new results and datasets
  • Based on the integrated findings, contribute to updating the GCOS requirements and the proposed metrics.


Outputs