Welcome to the FDA Adverse Drug Event Evaluation Challenge

FDA invites you to participate in the Adverse Drug Event Evaluation (ADE Eval), an evaluation of tools to identify adverse events (AEs) mentioned in publicly available drug labels. The FDA CDER Office of Surveillance and Epidemiology (OSE) is sponsoring this ADE Eval. OSE is interested in a tool that would enable pharmacovigilance safety evaluators to automate the identification of labeled AEs, which could facilitate triage, review, and processing of safety case reports. MITRE (www.mitre.org) is conducting this evaluation on behalf of FDA CDER OSE.

This evaluation is similar to an earlier evaluation on adverse drug reactions conducted as a track of the NIST Text Analysis Conference (TAC), but uses a definition of adverse drug events specific to the business process in the Office of Surveillance and Epidemiology. The task will consist of identifying OSE-defined adverse drug events and mapping them to associated terms in the Medical Dictionary for Regulatory Activities (MedDRA – www.meddra.org) for specific sections of drug labels. Evaluation metrics will include precision and recall of mention-level extraction of AEs and their MedDRA encoding.
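As a concrete illustration of the mention-level metrics, the sketch below computes precision and recall over sets of extracted AE mentions paired with MedDRA codes. The tuple layout, the exact-match criterion, and the codes shown are illustrative assumptions only; the official scoring script released with the training materials defines the actual matching rules.

    # Minimal sketch (NOT the official ADE Eval scorer): mention-level
    # precision and recall over extracted AE mentions paired with MedDRA codes.
    # The (label_id, start, end, meddra_code) layout and exact-match criterion
    # are illustrative assumptions for this example.

    def precision_recall(gold, predicted):
        """gold and predicted are sets of (label_id, start, end, meddra_code)."""
        true_positives = len(gold & predicted)
        precision = true_positives / len(predicted) if predicted else 0.0
        recall = true_positives / len(gold) if gold else 0.0
        return precision, recall

    # Toy data with made-up character spans and MedDRA-style codes.
    gold = {("label_001", 120, 126, "10028813"),
            ("label_001", 310, 322, "10019211")}
    pred = {("label_001", 120, 126, "10028813"),   # correct mention and code
            ("label_001", 400, 409, "10012735")}   # spurious mention

    p, r = precision_recall(gold, pred)
    print(f"precision={p:.2f} recall={r:.2f}")     # precision=0.50 recall=0.50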

Please note that this invitation is not to be construed as a commitment on the part of the FDA or MITRE to award a contract, nor does the FDA or MITRE intend to pay for any information submitted as a result of participation in the ADE Eval. Annotation Guidelines used to extract adverse events from drug labels were developed for research purposes and do not reflect FDA policy.

Visit this site to receive access to the training data and associated materials.

We look forward to your participation.

-FDA ADE Evaluation Organizers

Robert Ball, FDA
Sonja Brajovic, FDA
Oanh Dang, FDA
John Aberdeen, MITRE
Samuel Bayer, MITRE
Cheryl Clark, MITRE
Lynette Hirschman, MITRE

 

The organizers gratefully acknowledge the contribution of FDA colleagues and pharmacovigilance safety evaluators to the annotation effort for this evaluation.


Evaluation Organization and Schedule:

Participants will be provided with an annotated training corpus, detailed OSE-specific annotation guidelines, and an evaluation script.

The schedule for the evaluation is as follows:

8 November 2018: Release of annotation guidelines, annotated training set, scorer and ancillary tools

28 January 2019: Release of 2000-label unannotated test set, a small subset of which will be evaluated

1 February 2019: Submissions and 2-page system descriptions due

28 February 2019: Results returned to participants

Public Release #18-3780