Some Notes on Evaluation

The submission platform will be opening in just a few days now! I want to use this post to reiterate and clarify a number of things about the submission and evaluation process.

  1. Predictions should be submitted as a zip archive of files in the NIfTI-1 (Nifti1Image) format, as outlined on the challenge rules page. As Fabian pointed out, these instructions were previously inconsistent, but this has since been fixed. An example of how to save NumPy arrays as Nifti1Images can be found here.

  2. You may submit as many times as you like during the submission period, but only your most recent submission will be evaluated for the challenge. The leaderboard will not be made available until after the deadline in order to prevent fine-tuning on the test set.

  3. Previously, the rules stated that you must provide a URL to your manuscript. We now require only that the PDF file be uploaded with your submission. These manuscripts will be published as a University of Minnesota Digital Collection and will be citable on Google Scholar. We will moderate them to ensure sufficient detail, and we may ask for revisions before publication. All teams will be allowed to edit their manuscripts after the leaderboard has been made public.

  4. If you haven’t already, please fill out this short survey about your participation in the challenge. This really helps us to understand how to run better challenges in the future.

Please don’t hesitate to voice any questions or concerns about this process! I’ll be sure to address them as soon as I can.