Ontology Alignment Evaluation Initiative - OAEI-2015 Campaign

Results OAEI 2015::Large BioMed Track

Summary Results (top systems)

The following table summarises the results for the systems that completed all six tasks of the Large BioMed Track. It reports the total time in seconds to complete all tasks, together with the averages over the six tasks of Precision, Recall, F-measure and Incoherence degree. Systems are ordered by average F-measure and Incoherence degree.

AML and XMAP-BK were a step ahead and obtained the best average Recall and F-measure.

RSDLWB and LogMapC were the best systems in terms of precision.

Regarding mapping incoherence, AML and the LogMap variants (excluding LogMapLite) computed mapping sets leading to a very small number of unsatisfiable classes.

Finally, LogMapLite and RSDLWB were the fastest systems. Total computation times were slightly higher this year than in previous years due to the extra overhead of downloading the ontologies from the new SEALS repository.

* Uses background knowledge based on the UMLS-Metathesaurus, which is also the source of the LargeBio reference alignments.


System       Total Time (s)   Precision   Recall   F-measure   Incoherence
AML          1,940            0.905       0.754    0.819       0.0046%
XMAP-BK *    2,520            0.904       0.764    0.819       16.6%
LogMap       2,608            0.903       0.714    0.794       0.0053%
LogMapBio    13,711           0.867       0.733    0.789       0.0053%
XMAP         2,371            0.892       0.654    0.751       15.8%
LogMapC      8,618            0.907       0.551    0.682       0.0125%
LogMapLite   1,323            0.868       0.532    0.613       33.9%
RSDLWB       1,334            0.923       0.236    0.367       16.6%
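For reference, F-measure is the harmonic mean of Precision and Recall. A minimal sketch in Python (note that the table reports the average of the per-task F-measures, so recomputing F from the averaged Precision and Recall values above gives slightly different numbers):

```python
def f_measure(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# AML: averaged P = 0.905, R = 0.754 -> ~0.823
# (the table reports 0.819, the mean of the six per-task F-measures)
print(round(f_measure(0.905, 0.754), 3))
```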