Results & Ranking

Seventeen teams/individuals registered for the competition. Five of them delivered executable files, one submitting three different executables. All submitted executables produce results for both Task 1 and Task 2.

TASK 1 RANKING

Team | Accuracy (%) | Ranking according to Accuracy | Average Intraclass Distance (AID) | Ranking according to AID
DeepScript (University of Antwerp, Mike Kestemont) | 76.49 | 4 | 0.039 | 3
FAU (University of Erlangen-Nürnberg, Vincent Christlein) | 83.90 | 1 | 0.068 | 4
FRDC-OCR (Fujitsu Research & Development Center, Song Wang) | 79.80 | 3 | 0.018 | 1
NNML (Brigham Young University, Christopher Tensmeyer) | 83.80 | 2 | 0.026 | 2
TAU-1 (Tel Aviv University, Arie Shaus, method 1) | 49.90 | 7 | 0.421 | 7
TAU-2 (Tel Aviv University, Arie Shaus, method 2) | 50.10 | 6 | 0.417 | 6
TAU-3 (Tel Aviv University, Arie Shaus, method 3) | 52.80 | 5 | 0.393 | 5

TASK 2 RANKING

Team | Final Score | Ranking according to Score | Average Intraclass Distance (AID) | Ranking according to AID
DeepScript (University of Antwerp, Mike Kestemont) | 2.967 | 1 | 0.146 | 3
FAU (University of Erlangen-Nürnberg, Vincent Christlein) | 2.784 | 2 | 0.174 | 4
FRDC-OCR (Fujitsu Research & Development Center, Song Wang) | 2.631 | 4 | 0.120 | 1
NNML (Brigham Young University, Christopher Tensmeyer) | 2.771 | 3 | 0.134 | 2
TAU-1 (Tel Aviv University, Arie Shaus, method 1) | 0.615 | 6 | 0.260 | 6
TAU-2 (Tel Aviv University, Arie Shaus, method 2) | 0.590 | 7 | 0.259 | 5
TAU-3 (Tel Aviv University, Arie Shaus, method 3) | 1.226 | 5 | 0.356 | 7

If you want to be listed here, please send us your results as CSV files and we’ll perform the evaluation.
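For teams preparing such a CSV submission, the sketch below shows one plausible way the two Task 1 metrics could be reproduced offline. It is only a minimal illustration: the file layout, the column names `true_class` and `predicted_class`, and the exact way the Average Intraclass Distance is aggregated are assumptions, not the organizers' official evaluation code.

```python
import csv
import numpy as np

def accuracy(predictions_csv):
    """Fraction of rows whose predicted class matches the true class.

    Assumes a hypothetical CSV layout with 'true_class' and
    'predicted_class' columns, one row per test image.
    """
    correct = total = 0
    with open(predictions_csv, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            correct += row["predicted_class"] == row["true_class"]
    return correct / total

def average_intraclass_distance(distance_matrix_csv, labels):
    """Mean pairwise distance between samples that share the same true class.

    Assumes a square, comma-separated distance matrix with rows and columns
    in the same order as `labels`; this is a guess at how AID might be
    computed, not the official definition.
    """
    d = np.loadtxt(distance_matrix_csv, delimiter=",")
    labels = np.asarray(labels)
    same_class = labels[:, None] == labels[None, :]
    np.fill_diagonal(same_class, False)  # ignore each sample's self-distance
    return d[same_class].mean()
```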