MS segmentation challenge using a data management and processing infrastructure
As part of an ongoing effort towards automatic segmentation of MRI scans of MS patients, carried out by OFSEP (the French registry on multiple sclerosis, which gathers imaging data, clinical data, and biological samples from the French multiple sclerosis population for research purposes) and FLI (France Life Imaging), we are organizing a challenge on MS lesion segmentation. This challenge will take place during MICCAI 2016, on October 21st, 2016.
This challenge has several goals. First, we aim to evaluate state-of-the-art and advanced segmentation methods from the participants on a database acquired with a standardized protocol (the one used by OFSEP; see this article for more information). To this end, we will evaluate both lesion detection (how many lesions are detected) and lesion segmentation (how precisely the lesions are delineated) on a multi-centric database (38 patients from four different centers, imaged on 1.5 T or 3 T scanners, each patient manually annotated by seven experts).
In addition to this classical evaluation, the challenge also aims to provide a common infrastructure on which the algorithms will be evaluated (more details here). This infrastructure will enable a fair comparison of the algorithms in terms of running time, and will ensure that every algorithm is run with the same parameters for each patient (a requirement for truly automatic segmentation).
This website provides access to the following information:
- Key dates of the challenge organization and submission guidelines
- Data access (training only, available from April 8) and general data information
- Pipeline integration information: how segmentation algorithms will be run on the computing platform
- General information and details on the evaluation metrics for the challenge
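To illustrate the two kinds of evaluation mentioned above, here is a minimal sketch of a voxel-wise segmentation overlap metric (the Dice score) and a lesion-wise detection metric (an F1 over connected components). This is only an illustrative example assuming binary lesion masks stored as NumPy arrays; the challenge's exact metric definitions are those given in the evaluation details and the referenced article.

```python
import numpy as np
from scipy import ndimage  # used for connected-component labeling


def dice_score(pred, truth):
    """Voxel-wise Dice overlap: 2*|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, truth).sum() / denom


def lesion_detection_f1(pred, truth):
    """Lesion-wise F1: here a ground-truth lesion counts as detected
    if it overlaps the predicted mask in at least one voxel (the
    challenge may use a stricter overlap criterion)."""
    truth_lab, n_truth = ndimage.label(truth)
    pred_lab, n_pred = ndimage.label(pred)
    if n_truth == 0 and n_pred == 0:
        return 1.0
    # True positives: ground-truth lesions touched by the prediction
    tp = sum(1 for i in range(1, n_truth + 1) if pred[truth_lab == i].any())
    # False positives: predicted lesions touching no ground-truth lesion
    matched_pred = sum(1 for j in range(1, n_pred + 1) if truth[pred_lab == j].any())
    fp = n_pred - matched_pred
    fn = n_truth - tp
    if 2 * tp + fp + fn == 0:
        return 0.0
    return 2 * tp / (2 * tp + fp + fn)
```

A high Dice with a low lesion-wise F1 typically indicates that an algorithm segments large lesions well but misses small ones, which is why the challenge reports detection and segmentation performance separately.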
Workshop Results and Challenge Continuation
The workshop took place on October 21, 2016, with a total of 13 participating teams. Presentations from the different teams, the proceedings, and the results of the challenge are available on the challenge day page. A journal article on the challenge results is available in Nature Scientific Reports; it should be cited whenever the challenge data or results are used. It is available at the following DOI: https://dx.doi.org/10.1038/s41598-018-31911-7. Supplemental results data is available on Zenodo: http://doi.org/10.5281/zenodo.1307653
The challenge is currently closed for submissions. We are actively looking for ways to reopen submissions so that new algorithms can be evaluated in the same framework. In the meantime, you may download only the training database (after account registration); no access to the testing database will be given yet.
For any enquiries on the challenge or the website, please contact: email@example.com