Can we use the given submission sample?

Can we use the given submission sample file when generating solutions?
It looks like the submission sample has the correct final shapes of each slice.
That is, you randomly reshuffled (and rotated by 180 degrees) the patches, but did not slide them up or down randomly.

If yes, then can we assume that during the final hidden testing on your side, you will provide
a corresponding submission sample file for your hidden test slices?


A related question about the sample submission: can we start with the submission sample and restore the correct patch ids and their rotations (0 or 180 degrees)?
It looks like your scoring method uses your submission sample.
And if a submission JSON does not match your submission sample (with the correct patch ids), we get a reduced score.
For example, some of my restored slices are rotated 180 degrees (the whole image).
Is it reasonable to expect that this should not matter?
However, when I rotate such an image by 180 degrees (including the corresponding values in the JSON), I get a different score, which is strange. I will next try to match my submission JSON exactly to your sample. I reviewed the restored images from my submission and do not see any errors, yet I am not getting a score of 1.
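For context, rotating a whole slice by 180 degrees means mirroring every patch's grid position and flipping each patch's own orientation. A minimal sketch, assuming a hypothetical schema where each patch record carries an `id`, grid coordinates `x`/`y`, and a `rotation` of 0 or 180 (the actual submission JSON may differ):

```python
def rotate_submission_180(patches, width, height):
    """Rotate a reconstructed slice by 180 degrees.

    `patches` is a list of dicts with grid coordinates `x`, `y` and a
    `rotation` of 0 or 180; the field names are assumptions made for
    this illustration, not the competition's actual schema.
    """
    rotated = []
    for p in patches:
        rotated.append({
            "id": p["id"],
            # a patch at (x, y) lands at the mirrored grid position
            "x": width - 1 - p["x"],
            "y": height - 1 - p["y"],
            # each patch's own orientation flips as well
            "rotation": (p["rotation"] + 180) % 360,
        })
    return rotated
```

Applying this twice returns the original slice, so a scorer that considers both orientations should treat the two forms identically.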

Hello @jc138691
Thank you for your questions.

It’s important to clarify that the sample submission file is provided only to illustrate the correct format for submissions and should not be used as a basis for generating solutions. The scoring algorithm does not use the sample submission file as a reference for evaluating submissions. So, for the hidden test slices, we will not be providing a similar sample submission file.

As described in the rules, our evaluation is based on the similarity of your reconstruction to our original images, considering both the direct and the fully rotated options, with penalties applied for misplaced patches. If all your patches are correctly positioned but the coordinates do not exactly match ours, you might observe a slight reduction in the score due to the way the NCC (Normalized Cross-Correlation) similarity metric is calculated.
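For reference, NCC between two equally shaped images can be sketched as below. This is only the textbook formula, not the organizers' exact implementation (which also applies penalties for misplaced patches); it does show why a small coordinate offset lowers the score even when every patch is correct:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped images.

    Returns 1.0 for a perfect match and smaller values as the images
    diverge; a shift of a few pixels already reduces the result.
    """
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0
```

A reconstruction identical to the reference scores exactly 1.0; the same content shifted vertically by one row scores strictly less, which matches the "slight reduction" described above.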

We hope this information is helpful. Please don’t hesitate to reach out if you have more questions.

Kind regards,
Onward Team

Thank you for clarifying that the sample submission should not be used. My solution does not use it.


Since we cannot use the sample submission, please adjust your scoring code accordingly.

I rotated my 96 perfectly reconstructed slices to match your sample submission and got a score of 0.83. Then I made the same 96 slices the 180-degree rotation of your sample submission and got a score of 0.9998. It is reasonable to expect these scores to be the same.

It looks like (I am guessing) that when you rotate a submission slice by 180 degrees, you then set its minimum y to match your minimum y (or set your min y to match the submission's). When you do not rotate a submission slice, you do not check whether the min_y values are the same. My submission has min_y = 0 in all slices (why would it be anything else?), while your sample submission's min_y values are not zero, e.g. 3 in the first slice.
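One way to make the score invariant to such a vertical offset would be to normalize every slice, in both the submission and the reference, so its topmost patch sits at y = 0 before comparison. A hypothetical sketch, assuming each slice is a list of patch records with a `y` field (the real scoring code and schema are not public):

```python
def normalize_min_y(patches):
    """Shift all patch y-coordinates so the slice starts at y = 0.

    If both sides are normalized this way before scoring, a constant
    vertical offset (min_y = 0 vs. min_y = 3) cannot change the score.
    """
    min_y = min(p["y"] for p in patches)
    return [{**p, "y": p["y"] - min_y} for p in patches]
```

With this normalization applied on both branches, the direct and the 180-degree-rotated comparisons would be symmetric, which is the behavior the post above is asking for.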


Hi @jc138691

Thank you for your observation. We will check the mentioned case and come back with updates.

Kind regards,
Onward Team

Hello @team, since this challenge is about to end, have you had a chance to sort out your scoring code?
Specifically, a 180-degree rotated reconstruction should give the same score as the corresponding 0-degree case.

Hi @jc138691

This issue with the scoring has been resolved.

Happy puzzling

Onward Team
