Results OAEI 2013::Large BioMed Track
The following tables summarize the results for Task 3 and Task 4 of the FMA-SNOMED matching problem.
YAM++ obtained the best F-measure in both Task 3 and Task 4. It also obtained the best Precision in Task 3 and the best Recall in Task 4, while AML-BK provided the best Recall in Task 3 and AML-R the best Precision in Task 4.
Overall, the results were less positive than in the FMA-NCI matching problem: only YAM++ obtained an F-measure greater than 0.80 in both tasks, and 9 systems failed to provide a recall higher than 0.4. Matching FMA against SNOMED thus represents a significant leap in complexity with respect to the FMA-NCI matching problem.
As in the FMA-NCI matching problem, efficiency decreases as the ontology size increases. The sharpest drops between the two tasks were in precision, most notably for SPHeRe, IAMA and GOMMA.
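For reference, the F-measure reported below is the standard harmonic mean of Precision and Recall. The following minimal Python sketch (not part of the evaluation tooling) reproduces the YAM++ figure for Task 3 up to rounding:

```python
def f_measure(precision: float, recall: float) -> float:
    """F-measure (F1): harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# YAM++ on Task 3: reported P = 0.982, R = 0.729, F = 0.836.
# The published P and R are themselves rounded, so the check
# agrees with the table only up to rounding error.
print(round(f_measure(0.982, 0.729), 3))  # 0.837 vs. 0.836 in the table
```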
Task 3 (Scores: Precision, Recall and F-measure; Incoherence Analysis: unsatisfiable classes (Unsat.) and degree of unsatisfiability (Degree)):

| System | Time (s) | # Mappings | Precision | Recall | F-measure | Unsat. | Degree |
|---|---|---|---|---|---|---|---|
| YAM++ | 100 | 6,635 | 0.982 | 0.729 | 0.836 | 13,040 | 55.3% |
| AML-BK | 93 | 6,937 | 0.942 | 0.731 | 0.824 | 12,379 | 52.5% |
| AML | 60 | 6,822 | 0.943 | 0.720 | 0.816 | 15,244 | 64.7% |
| AML-BK-R | 122 | 6,554 | 0.950 | 0.696 | 0.803 | 15 | 0.06% |
| AML-R | 86 | 6,459 | 0.949 | 0.686 | 0.796 | 14 | 0.06% |
| LogMap-BK | 85 | 6,242 | 0.963 | 0.672 | 0.792 | 0 | 0.0% |
| LogMap | 79 | 6,071 | 0.966 | 0.656 | 0.782 | 0 | 0.0% |
| ServOMap | 391 | 5,828 | 0.955 | 0.622 | 0.753 | 6,018 | 25.5% |
| ODGOMS-v1.2 | 42,909 | 5,918 | 0.862 | 0.570 | 0.686 | 9,176 | 38.9% |
| Average | 8,073 | 4,248 | 0.892 | 0.436 | 0.549 | 7,308 | 31.0% |
| GOMMA2012 | 54 | 3,666 | 0.924 | 0.379 | 0.537 | 2,058 | 8.7% |
| ODGOMS-v1.1 | 27,451 | 2,267 | 0.876 | 0.222 | 0.354 | 938 | 4.0% |
| HotMatch | 32,244 | 2,139 | 0.872 | 0.209 | 0.337 | 907 | 3.9% |
| LogMapLt | 15 | 1,645 | 0.973 | 0.179 | 0.302 | 773 | 3.3% |
| Hertuda | 17,610 | 3,051 | 0.575 | 0.196 | 0.293 | 1,020 | 4.3% |
| SPHeRe | 154 | 1,577 | 0.916 | 0.162 | 0.275 | 805 | 3.4% |
| IAMA | 27 | 1,250 | 0.962 | 0.134 | 0.236 | 22,925 | 97.3% |
| XMapGen | 12,127 | 1,827 | 0.694 | 0.142 | 0.236 | 23,217 | 98.5% |
| XMapSiG | 11,720 | 1,581 | 0.760 | 0.134 | 0.228 | 23,025 | 97.7% |
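The Degree column expresses Unsat. as a percentage of all classes in the integrated ontology; the class totals themselves are not listed here. A minimal sketch, assuming roughly 23,560 classes for Task 3 (back-derived from the table rows, not an official figure):

```python
def incoherence_degree(unsat: int, total_classes: int) -> float:
    """Unsatisfiable classes as a percentage of all classes."""
    return 100.0 * unsat / total_classes

# ASSUMPTION: ~23,560 classes in the integrated Task 3 ontology,
# back-derived from the table (e.g. 13,040 / 0.553); not an official figure.
TOTAL_TASK3 = 23_560
print(f"{incoherence_degree(13_040, TOTAL_TASK3):.1f}%")  # 55.3% (YAM++)
print(f"{incoherence_degree(22_925, TOTAL_TASK3):.1f}%")  # 97.3% (IAMA)
```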
Task 4 (same columns as above; values prefixed with '≥' are lower bounds):

| System | Time (s) | # Mappings | Precision | Recall | F-measure | Unsat. | Degree |
|---|---|---|---|---|---|---|---|
| YAM++ | 402 | 6,842 | 0.947 | 0.725 | 0.821 | ≥57,074 | ≥28.3% |
| AML-BK | 530 | 6,186 | 0.937 | 0.648 | 0.766 | ≥40,162 | ≥19.9% |
| AML | 542 | 5,797 | 0.963 | 0.624 | 0.758 | ≥39,472 | ≥19.6% |
| AML-BK-R | 584 | 5,858 | 0.941 | 0.617 | 0.745 | 29 | 0.01% |
| AML-R | 554 | 5,499 | 0.966 | 0.594 | 0.736 | 7 | 0.004% |
| ServOMap | 4,059 | 6,440 | 0.861 | 0.620 | 0.721 | ≥164,116 | ≥81.5% |
| LogMap-BK | 556 | 6,134 | 0.874 | 0.600 | 0.711 | 0 | 0.0% |
| LogMap | 537 | 5,923 | 0.888 | 0.588 | 0.708 | 0 | 0.0% |
| Average | 2,448 | 5,007 | 0.835 | 0.479 | 0.588 | 40,143 | 19.9% |
| GOMMA2012 | 634 | 5,648 | 0.406 | 0.257 | 0.315 | 9,918 | 4.9% |
| LogMapLt | 101 | 1,823 | 0.878 | 0.179 | 0.297 | ≥4,393 | ≥2.2% |
| SPHeRe | 20,664 | 2,338 | 0.614 | 0.160 | 0.254 | 6,523 | 3.2% |
| IAMA | 218 | 1,600 | 0.749 | 0.134 | 0.227 | ≥160,022 | ≥79.4% |
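The Average rows appear to be plain arithmetic means over the per-system values in each column. A quick sanity check against the Precision column of the Task 4 table:

```python
# Precision of the 12 systems in the Task 4 table, in table order.
precisions = [0.947, 0.937, 0.963, 0.941, 0.966, 0.861,
              0.874, 0.888, 0.406, 0.878, 0.614, 0.749]
print(round(sum(precisions) / len(precisions), 3))  # 0.835, as in the Average row
```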