Participants should containerize their algorithms with Docker and submit these to organizers for evaluation. The test data will not be released to the public.
The challenge organizers will run your method on all test data. This guarantees that the test data remains secret and cannot be included in the training procedure. This workflow has proven successful for previous MICCAI challenges (e.g., the IVDM3Seg Challenge).
An easy-to-follow example of containerizing your algorithms with Docker can be found here (updated on Dec. 25th).
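As a rough illustration of what the containerized entrypoint is expected to do (read cases from an input directory, write one result per case to an output directory), here is a minimal Python sketch. The directory layout, file names, and the placeholder "inference" step are assumptions, not the official interface; follow the linked docker-example for the real one.

```python
import os
import shutil
import sys

def run_inference(input_dir: str, output_dir: str) -> list:
    """Process every case found in input_dir and write one output file
    per case into output_dir. The segmentation itself is a placeholder
    here; a real submission would load its pre-trained model, which must
    be bundled in the image since the network is disabled at run time."""
    os.makedirs(output_dir, exist_ok=True)
    written = []
    for name in sorted(os.listdir(input_dir)):
        src = os.path.join(input_dir, name)
        dst = os.path.join(output_dir, name)
        # Placeholder: copy input to output; replace with model inference.
        shutil.copyfile(src, dst)
        written.append(dst)
    return written

if __name__ == "__main__":
    run_inference(sys.argv[1], sys.argv[2])
```

The entrypoint takes the input and output directories as command-line arguments, so the container can be pointed at the organizers' mounted test folder.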
Note: Each top-ranking team will be invited to submit a 3-page report in Springer LNCS format and to give an oral presentation at MICCAI 2019 describing the methodology of their approach. All top-ranking teams will be invited to jointly author a paper describing the latest progress on automatic OAR and GTV delineation based on the challenge results.
Apart from the award certificates and trophies, the top-ranking teams will be invited to publish full papers describing their top-ranking methods in Neurocomputing (IF: 4.072) and will also be awarded free MICCAI registrations. Below are the guidelines for the upcoming final submissions.
Public Submit¶
(Updated Dec. 26th, 2019)
The Submission site is now available here.¶
Notice:
- The public submit page will be available from 13:00 Dec. 26th, 2019 to 13:00 July 31st, 2020, UTC+8 Beijing Time.
- After August 1st, the StructSeg 2020 challenge will take place. (Tentative)
Here's the guide for public submission:¶
How to fill the Register form:¶
- USERNAME may contain only English letters, numbers, and underscores.
- EMAIL should be your own institutional email address.
- Please take a picture or a screenshot of your username and email. If you forget them, you will have to register a new account with a new email address and will lose all of your previous account's submissions.
- Each EMAIL is allowed to register only once.
How to fill the Submit form:¶
- USERNAME and EMAIL are the ones you registered before.
- METHOD is the name of the submission. Each METHOD name should be unique; append v1, v2, etc. to distinguish different versions of one method.
- Fill in the DOWNLOAD LINK with the download link of your Docker image. We only accept password-free Google Drive and Dropbox links. Please follow this tutorial. BaiduPan links are not accepted during public submission.
- AFFILIATION and CONTRIBUTORS refer to the team you worked with on this submission. You can edit this information after the submission is finished.
- METHOD DESCRIPTION is the description of the method. You can edit this information after the submission is finished.
When you fill in the SEARCH form:¶
- USERNAME and EMAIL are the ones you registered before.
Submission rules:¶
- For each task, each participant can join multiple teams. The leaderboard ranks each METHOD individually.
- For each task, each participant's submissions must be at least 48 hours apart.
- Our evaluation server is equipped with an Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz, 250 GB of memory, and four NVIDIA GeForce GTX 1080 GPUs.
- Each Docker tar file should preferably be smaller than 10 GB; a file larger than 15 GB will raise an error.
- Docker containers run with the network disconnected, so make sure your pre-trained model is already bundled in the image. Any attempt to connect to the Internet will raise an error.
- The algorithm is allowed to execute on the CPU and on a single GPU. It should occupy no more than 10 GB of GPU memory and no more than 16 GB of CPU memory.
- For each task, the algorithm may run for at most 4 hours to generate the output; otherwise, an error will be returned.
- The test set of the second task contains large images whose longest edge exceeds 10,000 pixels (but is less than 20,000). Sliding-window inference is recommended.
- CUDA 10 is available this time.
- On the search page, a log file (less than 1 MB) can be downloaded once the submission is finished.
- For a successful submission, participants can click PUBLIC/PRIVATE to either post the submission to the Leaderboard or withdraw it from the Leaderboard.
- If a successful submission is not made PUBLIC within **7 days**, it will be DELETED due to storage concerns.
- AFFILIATION, CONTRIBUTORS, and METHOD DESCRIPTION can be edited on the search page before you post to the Leaderboard. This information will be displayed in the ABBREVIATION form on the Leaderboard page.
- Only PUBLIC submissions will rank in Leaderboard.
- Notice that 13:00 means 1:00 pm.
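The rules above recommend sliding-window inference for the very large images in the second task. A minimal NumPy sketch of the idea, assuming a 2D image and a per-tile `predict_tile` function standing in for the real model forward pass:

```python
import numpy as np

def _starts(size, tile, stride):
    # Window start positions covering [0, size), including the border.
    if size <= tile:
        return [0]
    starts = list(range(0, size - tile + 1, stride))
    if starts[-1] != size - tile:
        starts.append(size - tile)
    return starts

def sliding_window_predict(image, predict_tile, tile=512, stride=256):
    """Run predict_tile over overlapping windows of a 2D image and
    average the overlapping predictions. predict_tile maps an (h, w)
    array to an (h, w) score map of the same shape."""
    H, W = image.shape
    scores = np.zeros((H, W), dtype=np.float64)
    counts = np.zeros((H, W), dtype=np.float64)
    for y in _starts(H, tile, stride):
        for x in _starts(W, tile, stride):
            y1, x1 = min(y + tile, H), min(x + tile, W)
            scores[y:y1, x:x1] += predict_tile(image[y:y1, x:x1])
            counts[y:y1, x:x1] += 1.0
    return scores / counts
```

For multi-class segmentation one would accumulate per-class logits in the same way and take the argmax at the end; tile and stride sizes here are illustrative and should be tuned to the 10 GB GPU memory budget.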
If you have any questions or comments, please email support@structseg-challenge.org. We will do our best to check and reply to emails every week.¶
Challenge Submit¶
(Updated Oct. 13th, 2019)
The Challenge Submission site is currently not available. The StructSeg 2020 challenge submission site will reopen in autumn 2020.¶
Notice:
- Please strictly follow the latest docker-example here to containerize your algorithms, and follow this tutorial to generate direct download link.
- The submit page will be available from ~~11:59 Sept. 19th to 11:59 Sept. 26th~~.
- Before Oct. 1st, SEARCH button can only return the status of submitted Docker image.
- After Oct. 1st, you could SEARCH your highest score by task.
- After Oct. 1st, a result page will be released, where you can find top10 leaderboard.
Reminder for challenge submission¶
- [updated 09-20] Your Docker image name should be: structseg:UserName_taskN.
- Your tar file name should be: UserName_taskN.tar
- The download link for Google Drive should be a direct link (tutorial here). Also check the download permission:
- 1. Check the share permission with 'anyone with the link';
- 2. Get the shareable link;
- 3. Copy the link to your browser;
- 4. Click the Download button;
- 5. Copy the link, which ends with '&export=download' and has 'uc?id=1' in the middle of the link.
- You can submit once per hour (not once per day).
- We strongly advise submitting at least once this week. This helps catch errors in details such as the download link and the Docker image. This week we are sending emails one by one to help correct submission errors.
- [updated 09-23] We have an automatic submission system for Google Drive and Dropbox. The full process, including downloading the Docker image, running it, and running the evaluation, is fully automatic and very fast. If you use other drives, especially BaiduPan, we have to download and run the Docker image manually, which takes a lot of time to get the final evaluation scores. Please use Google Drive or Dropbox as much as you can to get faster feedback.
- **[updated 09-24] LAST 12 HOURS TO GO!**
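The Google Drive steps above can be sketched as a small helper that turns the usual shareable link into the direct 'uc?id=...&export=download' form. The exact URL patterns here are assumptions based on the common Drive link formats, not something the organizers specify:

```python
import re

def drive_direct_link(share_link: str) -> str:
    """Convert a Google Drive shareable link of the common
    https://drive.google.com/file/d/<FILE_ID>/view... or ...?id=<FILE_ID>
    form into a direct download link of the 'uc?id=...&export=download'
    form described in the steps above."""
    m = re.search(r"/file/d/([\w-]+)", share_link)
    if m is None:
        m = re.search(r"[?&]id=([\w-]+)", share_link)
    if m is None:
        raise ValueError("no Google Drive file id found in link")
    return "https://drive.google.com/uc?id=%s&export=download" % m.group(1)
```

Before submitting, paste the resulting link into a browser and confirm it triggers a download directly, as step 4 above describes.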
Here's the guide for challenge submission:¶
How to fill the SUBMIT form:¶
- TEAM NAME may contain only English letters, numbers, and underscores. Each team only needs to register one ID.
- Fill in TEAM MEMBERS with all your team members' names and institutional email addresses. Submissions using third-party email addresses (e.g., hotmail.com, gmail.com, 163.com) will not be considered valid.
- One or multiple affiliations can be associated with your team, which could be universities, institutions, or companies that team members are affiliated with.
- USERNAME is your grand challenge ID.
- EMAIL is your own institution email address.
- Fill in the DOWNLOAD LINK with the download link of your docker image. We only accept Google Drive link and DropBox link without any password. Please follow this tutorial.
When you fill in the SEARCH form:¶
- USERNAME is your grand challenge ID.
- EMAIL must be the same with your grand challenge account.
Submission rules:¶
- For each task, each team can have one or multiple members. Each participant may be affiliated with only one team; if a participant appears in multiple teams, all of their submissions will be revoked.
- A participant may join different teams for different tasks.
- For each task, submissions must be at least one hour apart.
- For each team and each task, only its LAST THREE submissions before the deadline will be evaluated for final ranking. The one with the highest ranking among the three submissions will be used as the final rank.
- Challenge time is UTC+8 Beijing time.
- The submission deadline is extended to Sept. 26th, 2019.
- Our evaluation server is equipped with an Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz, 250 GB of memory, and eight NVIDIA GTX 1080 Ti GPUs.
- The algorithm is allowed to execute on the CPU and on a single GPU. It should occupy no more than 10 GB of GPU memory and no more than 16 GB of CPU memory.
- For each test case, the algorithm may run for at most ~~60~~ 120 seconds to generate the output; otherwise, an error will be returned.
- The required cuda version is cuda 8 or cuda 9 (cuda 10 is not supported!).
Submission feedback:¶
- a. For each submission, you can check the Docker evaluation status after the evaluation is finished. A search result indicates:
- i. Whether the Docker image is valid. If it is invalid, an error log will be returned.
- ii. Whether the result of the Docker image is valid. If it is valid (DSC > 5%), only a message that the submission is valid will be returned.
- iii. No specific numeric result will be shown to participants before the challenge deadline.
- b. The submission site will be open from 11:59 Sep. 19th to 11:59 Sep. 26th.
- c. We will inform the top-10 teams of each task via email between Sep. 26th and Oct. 1st.
- d. On Oct. 1st, the leaderboard will be released, containing the top-10 teams of each task. Other teams can look up their highest score via the SEARCH form.
- e. The submission site will be reopened after the challenge at MICCAI 2019.
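The validity check above is stated in terms of DSC, the Dice similarity coefficient. For reference, a minimal NumPy sketch of binary Dice; the official evaluation script may differ in details such as smoothing or per-organ averaging:

```python
import numpy as np

def dice_coefficient(pred, gt, eps=1e-7):
    """Binary Dice similarity coefficient: 2|P ∩ G| / (|P| + |G|).
    pred and gt are boolean (or 0/1) arrays of the same shape; eps
    avoids division by zero when both masks are empty."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    inter = np.logical_and(pred, gt).sum()
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)
```

A submission whose predictions score below 5% Dice against the reference masks would be flagged as invalid by the check described above.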
To top-ranking teams:¶
- a. The top-ranking teams on each task will be invited to submit full papers describing their methods to the special issue on Deep Learning for Medical Image Computing of Neurocomputing (IF: 4.072). The guest editors are:
- Hongsheng Li (SenseTime Research & CUHK)
- Shaoting Zhang (SenseTime Research)
- Dimitris N. Metaxas (Rutgers University)
- b. The top-ranking teams will be invited to give talks at the challenge ceremony held in conjunction with MICCAI 2019.
- c.The top-ranking teams will be awarded challenge certificates and challenge trophies.
- d.The top-ranking teams will be awarded free MICCAI registrations.
- e. The top-ranking teams will be required to submit a 4-page technical report in Springer LNCS format by Oct. 10th briefly describing their methodology. An overview paper of the challenge dataset and results will be submitted to a top-ranking journal in medical image understanding. This paper is expected to be highly cited in the future.