Submission Instructions📜¶
General Information¶
Please join our challenge before making submissions. Note that only participants with verified accounts can make submissions to this phase. Our challenge is held as an "algorithm submission" challenge to enable fair comparison across participants, and the evaluation datasets of all phases are hidden. Therefore, participants must submit a Docker container to be placed on the leaderboard.
Form a team¶
Participants can form a team through the Teams page.
Baseline Example¶
A baseline algorithm using a U-Net is provided at GitHub. More details are in the forum.
How to submit an algorithm?¶
Please follow the steps below to make an algorithm submission. For more details, please check out the GitHub repository.
Step 1. Develop the cell detection model with the training dataset on Zenodo.
Step 2. Clone the GitHub repository provided by the organizers.
Step 3. Update the `process_patch_pair` function in `user/inference.py`. Don't forget to add the dependencies (e.g. PyTorch) to `requirements.txt`.
Step 4. Export the algorithm Docker container by running `bash export.sh`, which will create a `.tar` file.
Step 5. Submit the created `.tar` file through the submission system.
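As a rough illustration of Step 3, the sketch below shows what a filled-in `process_patch_pair` might look like. The signature, argument shapes, and return format here are assumptions for illustration only; the authoritative template is in `user/inference.py` in the organizers' repository, and the toy dark-blob heuristic stands in for your trained model's inference.

```python
import numpy as np

def process_patch_pair(cell_patch, tissue_patch, pair_id, meta_dataset):
    """Placeholder cell detector (replace the body with your model).

    Assumptions (verify against the organizers' template):
      - cell_patch / tissue_patch are H x W x 3 uint8 images,
      - the return value is a list of (x, y, class_id, probability).
    """
    # Toy heuristic: treat one dark blob in the cell patch as a detection.
    gray = cell_patch.mean(axis=2)
    ys, xs = np.where(gray < 50)

    detections = []
    if len(xs) > 0:
        x, y = int(xs.mean()), int(ys.mean())  # blob centroid
        class_id = 1   # class ids are defined by the challenge; check the repo
        prob = 0.5
        detections.append((x, y, class_id, prob))
    return detections
```

In a real submission, this function would load your trained weights once (outside the per-patch call) and run inference on each cell/tissue patch pair; any packages it imports must be listed in `requirements.txt` so they are installed inside the container.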
Report and GitHub repository link¶
In the test phase, two additional materials must be submitted: (a) a report and (b) a GitHub repository URL. Please read below for the details of each item.
- (a) A report of 4 pages in Springer format that summarizes the participants' method. The report should include the information needed to reproduce the results, for example, data processing, hyperparameter tuning, method, and preliminary results on the validation set.
- (b) A GitHub repository URL that provides the source code. It should include the environment (e.g. Docker) and all relevant source code to generate the results. The repository can be either public or private before the challenge ends; however, it must be public when the challenge event at MICCAI 2023 takes place and remain public afterward. Participants must mention the OCELOT 2023 challenge in the README of their GitHub repository.
Limitation on the number of submissions¶
We limit the number of submissions to 10 for the validation phase and 1 for the test phase. Note that the final ranking is based only on the test phase score, not the validation phase score.