The 2020 fastMRI challenge opens for submissions on October 1

September 17, 2020

Two years ago we announced fastMRI, a joint research project between Facebook AI and NYU Langone Health, intended to leverage AI to speed up magnetic resonance imaging (MRI) scans. FastMRI has been an open source project from the beginning because working openly and collaboratively with the community is the best way to push this vital scientific project forward. In 2019, we rallied researchers around our first community challenge, collectively exploring new approaches to the problem of reconstructing knee MRIs with much less raw data. We hope to recapture that spirit in 2020 for our second fastMRI community challenge, which begins October 1.

This year’s challenge centers on the neuroimaging dataset that the fastMRI project released last December. Neuroimaging accounts for approximately 60 percent of all MRIs and stands to benefit greatly from AI research, as scans are complicated by movements of the eye and of fluid around the brain. Reducing the time it takes to carry out scans could reduce the presence of these artifacts in images, enhance patient comfort, allow practitioners to serve more patients every day, and potentially even expand the uses for MRIs.

As part of a rigorous clinical study, radiologists recently examined AI-accelerated knee scans generated by fastMRI and found them diagnostically interchangeable with traditional MRIs, validating our research up to this point. While that was an encouraging milestone for the fastMRI project, we are still far from our goal, wherein AI helps streamline the reconstruction of all MRIs. We hope that by spurring community involvement with this latest challenge, new voices and new approaches will help galvanize progress on this problem.

The neuroimaging dataset is available now, and the public leaderboard is open for practice submissions. The competition begins October 1, at which point the challenge dataset will be made available. Challenge submissions will also go live at this point, split into multicoil tracks with 4x and 8x acceleration, as well as a 4x transfer track that features data from scanners that are not included in the training set. The transfer track is especially important, as we ultimately want to assess whether methods work across multiple hardware setups — though please note that models submitted to this track must be trained only on the fastMRI dataset released by NYU Langone.

As in last year’s fastMRI challenge, submissions will first be evaluated using the structural similarity index measure (SSIM), a widely used metric for quantifying changes in the structural information of an image. Reconstructions with the highest SSIM scores will then be passed to a panel of radiologists, who will assess not just image accuracy but how useful the submitted reconstructions are for diagnosing abnormalities and pathologies. This matters because fastMRI must ultimately produce images that are not only similar to traditional MRIs but also work well in a clinical setting. The panel we’ve assembled for this purpose spans multiple countries and institutions, including Mayo Clinic, Baylor College of Medicine, UPMC, UCLA, Stanford University, and the University of Genoa.
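For intuition, here is a minimal sketch of how an SSIM-style score can be computed in NumPy. This is an illustration only, not the challenge's official evaluation code: the standard SSIM averages scores over local windows, while this simplified version uses whole-image statistics, and all names here are hypothetical.

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, data_range: float) -> float:
    """Compute a single global SSIM-style score between two images.

    Simplification: standard SSIM averages windowed local scores;
    this version uses whole-image statistics for illustration.
    """
    # Stabilizing constants from the SSIM definition
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

rng = np.random.default_rng(0)
reference = rng.random((64, 64))            # stand-in "ground truth" image
noisy = reference + 0.05 * rng.standard_normal((64, 64))

print(global_ssim(reference, reference, data_range=1.0))  # identical images -> 1.0
print(global_ssim(reference, noisy, data_range=1.0))      # degraded image -> below 1.0
```

A score of 1.0 means the two images are identical; lower values indicate structural degradation, which is why higher-SSIM reconstructions advance to radiologist review.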

Please review the submission guidelines before entering, and note that participants must submit their reconstructions by October 15. The winning team from each track will be invited to nominate a team member to share their work (virtually) at the Medical Imaging Meets NeurIPS Workshop at NeurIPS 2020, at which point we will also publicize the challenge leaderboards.

Our first community challenge saw a mix of participants from academia, the machine learning community, the medical research community, and major MRI scanner manufacturers, and our hope is to drive the same level of collaboration this year. We firmly believe that researchers benefit from these types of open, public challenges, and that together we can improve MRI scans for the benefit of people around the world.

Written By

Matthew Muckley

Research Engineer

Nafissa Yakubova

Program Manager