LifeCLEF 2020 Plant

Prizes: USD 5K in cloud credit as part of Microsoft's AI for Earth program; Authorship/Co-Authorship.

Note: Do not forget to read the Rules section on this page. Pressing the red Participate button leads you to a page where you have to agree with those rules. You will not be able to submit any results before agreeing with the rules.

Note: Before trying to submit results, read the Submission instructions section on this page.

Challenge description

The goal of the challenge is to identify plants in field pictures based on a training set of digitized herbarium specimens. Concretely, this is a cross-domain classification task with a training set composed of digitized herbarium sheets and a test set composed of field pictures. To enable learning a mapping between the herbarium sheet domain and the field picture domain, we will provide both herbarium sheets and field pictures for a subset of species.


Despite recent progress in automated plant identification, the vast majority of the 300K+ plant species on Earth still cannot be recognized easily because of the lack of training data for those species. On the other hand, for several centuries, botanists have collected, catalogued and systematically stored plant specimens in herbaria. These physical specimens are used to study the variability of species, their phylogenetic relationships, their evolution, and phenological trends. Millions of such specimens have now been digitized and are publicly available. Using them to train deep learning models is thus a very promising approach to help identify data-deficient species. However, their visual appearance is very different from that of field pictures, which makes this a challenging cross-domain classification task.


The challenge will rely on a large collection of more than 300K herbarium sheets coming from two sources: the “Herbier IRD de Guyane” (CAY), digitized in the context of the e-ReColNat project, and iDigBio, a large international platform hosting millions of images of herbarium specimens. A valuable asset of this collection is that a few hundred herbarium sheets are accompanied by a few pictures of the same specimen in the field. The test set is composed of about 3K in-the-field pictures collected by two botanists specialized in the Amazonian flora.

A link to the training dataset is available under the “Resources” tab.

Submission instructions

As soon as the submission is open, you will find a “Create Submission” button on this page (next to the tabs).

Before being allowed to submit your results, you first have to press the red Participate button, which leads you to a page where you have to accept the challenge's rules.

More information regarding the submission instructions will be released soon.

Evaluation criteria

The metrics used for the evaluation of the task will be the mean reciprocal rank (MRR, the primary metric used for the leaderboard) and the classification accuracy (used as a secondary metric).
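For reference, the mean reciprocal rank averages, over all test images, the inverse of the rank at which the correct species appears in the submitted ranked list of predictions. The sketch below only illustrates that definition; the function name and data structures are hypothetical and are not part of the official evaluation code.

    def mean_reciprocal_rank(predictions, ground_truth):
        # predictions: dict mapping a test image id to a list of species ids,
        #              ordered from most to least confident (hypothetical format)
        # ground_truth: dict mapping the same image ids to the correct species id
        total = 0.0
        for image_id, correct_species in ground_truth.items():
            ranked = predictions.get(image_id, [])
            if correct_species in ranked:
                total += 1.0 / (ranked.index(correct_species) + 1)
            # an image whose correct species is missing from the ranking contributes 0
        return total / len(ground_truth)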


The LifeCLEF lab is part of the Conference and Labs of the Evaluation Forum: CLEF 2020. CLEF 2020 consists of independent peer-reviewed workshops on a broad range of challenges in the fields of multilingual and multimodal information access evaluation, and a set of benchmarking activities carried out in various labs designed to test different aspects of mono- and cross-language information retrieval systems. More details about the conference can be found here.

Submitting a working note with a full description of the methods used in each run is mandatory. Any run that cannot be reproduced from its description in the working notes may be removed from the official publication of the results. Working notes are published within the CEUR-WS proceedings, resulting in the assignment of an individual DOI (URN) and indexing by many bibliography systems, including DBLP. According to the CEUR-WS policies, a light review of the working notes will be conducted by the LifeCLEF organizing committee to ensure quality. As an illustration, the LifeCLEF 2019 working notes (task overviews and participant working notes) can be found within the CLEF 2019 CEUR-WS proceedings.


Participants of this challenge will automatically be registered at CLEF 2020. In order to be compliant with the CLEF registration requirements, please edit your profile by providing the following additional information:

  • First name

  • Last name

  • Affiliation

  • Address

  • City

  • Country

  • Regarding the username, please choose a name that represents your team.

This information will not be publicly visible and will be used exclusively to contact you and to send the registration data to CLEF, which is the main organizer of all CLEF labs.


Information will be posted after the challenge ends.


Cloud credit

The winner of the challenge will be offered a cloud credit grant of USD 5K as part of Microsoft's AI for Earth program.


LifeCLEF 2020 is an evaluation campaign organized as part of the CLEF initiative labs. The campaign offers several research tasks that welcome participation from teams around the world. The results of the campaign appear in the working notes proceedings, published by CEUR Workshop Proceedings (CEUR-WS.org). Selected contributions among the participants will be invited for publication the following year in the Springer Lecture Notes in Computer Science (LNCS) series, together with the annual lab overviews.


Contact us

Discussion Forum

  • You can ask questions related to this challenge on the Discussion Forum. Before asking a new question, please make sure it has not been asked before.
  • Click on the Discussion tab above or use the direct link: https://discourse.aicrowd.com/c/lifeclef-2020-plant

Alternative channels

We strongly encourage you to use the public channels mentioned above for communication between participants and organizers. In exceptional cases, if you have queries or comments that you would like to make through a private communication channel, you can send us an email at:

  • herve[dot]goeau[at]cirad[dot]fr
  • julien[dot]champ[at]inria[dot]fr
  • pierre[dot]bonnet[at]cirad[dot]fr
  • alexis[dot]joly[at]inria[dot]fr

More information

You can find additional information on the challenge here: https://www.imageclef.org/PlantCLEF2020