Task 2.1 Forecast Solution Selection

Update of the IEA Recommended Practice on Forecast Solution Selection

Design of benchmark exercises: best practice.

  • Aim: Develop an IEA Recommended Practice on forecast benchmarking processes.
  • Partners: DTU Elektro, DTU Compute, WEPROG, UCD, LNEG, forecast service providers, end users
  • Outcome: see Publication of Recommended Practice

The objective of Task 2.1 is to compile guidance and a standard for private industry, academia and government on executing a renewable energy forecasting benchmark or trial. Benchmark and trial exercises can consume a great deal of time, both for the company conducting them and for the participating forecast providers. These guidelines and best practices are based on years of industry experience and are intended to achieve maximum benefit and efficiency for all parties involved in the benchmark or trial exercise.
The goal of any company conducting a trial is to receive the best-quality forecasts for its particular situation and to have its questions about the best solution answered. Participants, on the other hand, should be met with a transparent and competitive benchmark or trial exercise that enables them to demonstrate the benefits of their approach.
The reality often looks different. The resources required to carry out benchmarks or trials are in many cases severely underestimated. Often, even the very basic questions that clients need to answer for themselves, and that forecast vendors need answered in order to decide whether or not to participate, are not defined at the outset of the exercise. Some of these fundamental questions are, for example:
• What forecast horizons will be evaluated?
• Will online data be available, and how are curtailments and outages handled?
• How long will the benchmark last?
• Which metrics are being used to evaluate forecast performance?
• How will the winner(s) be determined?
• Will a contract be awarded after the trial?

If such questions are not answered at the beginning, experience has shown that this leads to delays, cheating and unrealistic results at the end of the exercise. This task force will compile a guideline to help industry clients decide more easily whether a benchmark or trial is the best way to answer their burning questions and, if so, how to best plan and carry out such benchmarks or trials. The guideline, recipes and examples, as well as presentations on the topic, are provided on this page and in our publication directory. If you are interested in this topic or want to receive updates when new documentation is published, please contact the task leader for more information.
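To illustrate the evaluation-metric question above, the sketch below computes three error metrics commonly used in wind and solar forecast benchmarks (bias, MAE and RMSE, normalised by installed capacity). This is a minimal illustration of one possible convention, not a prescription from the Recommended Practice; the function name and the capacity normalisation are assumptions for the example.

```python
import numpy as np

def forecast_metrics(forecast, observed, capacity):
    """Common error metrics for a forecast benchmark.

    Errors are normalised by installed capacity, a frequent
    (but not universal) convention in wind power forecasting.
    """
    err = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    return {
        "bias": err.mean() / capacity,              # systematic over/under-forecast
        "mae":  np.abs(err).mean() / capacity,      # mean absolute error
        "rmse": np.sqrt((err ** 2).mean()) / capacity,  # penalises large errors
    }

# Example: three hourly forecasts vs. observed production (MW) at a 100 MW site
metrics = forecast_metrics([42.0, 55.0, 60.0], [40.0, 58.0, 57.0], capacity=100.0)
```

Agreeing on such definitions (including how curtailed or outage periods are excluded) before the trial starts avoids disputes when the winner is determined.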

Corinna Möhrlen, WEPROG
