SemTab 2020: Semantic Web Challenge on Tabular Data to Knowledge Graph Matching
Rounds 1-3 of SemTab 2020 were run with the support of AIcrowd and relied on an automatic dataset generator. Round 4 was a blind round combining (1) an automatically generated (AG) dataset and (2) the Tough Tables (2T) dataset, the latter covering the CEA and CTA tasks only.
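For orientation, the three matching tasks annotate tables against Wikidata: CEA links cells to entities, CTA links columns to classes, and CPA links column pairs to properties. Below is a minimal sketch of how submission files for the three tasks could be assembled; the CSV layouts (table ID, row/column indices, Wikidata IRI) follow the usual SemTab conventions, but the exact file names, column order, and all IDs and IRIs shown are illustrative assumptions, not the official specification.

```python
import csv

# Illustrative annotations for a single table (IDs and IRIs are examples only).
cea_rows = [
    # table_id, row_id, col_id, entity IRI (CEA: annotate a cell with a Wikidata entity)
    ("0A1B2C3D", 1, 0, "http://www.wikidata.org/entity/Q84"),      # e.g. "London"
]
cta_rows = [
    # table_id, col_id, class IRI (CTA: annotate a column with a Wikidata class)
    ("0A1B2C3D", 0, "http://www.wikidata.org/entity/Q515"),        # e.g. city
]
cpa_rows = [
    # table_id, head_col, tail_col, property IRI (CPA: annotate a column pair with a property)
    ("0A1B2C3D", 0, 1, "http://www.wikidata.org/prop/direct/P17"), # e.g. country
]

# Write one CSV per task (file names are placeholders).
for name, rows in [("cea.csv", cea_rows), ("cta.csv", cta_rows), ("cpa.csv", cpa_rows)]:
    with open(name, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```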
Overview
Participation
9 (core) participating systems produced results across rounds and tasks.
 | Round 1 | Round 2 | Round 3 | Round 4 (AG) | Round 4 (2T)
Participants | 18 | 16 | 18 | 10 | 9
CEA | 10 | 10 | 9 | 9 | 9
CTA | 15 | 13 | 16 | 9 | 8
CPA | 9 | 11 | 8 | 7 | -
Table 1: Number of participants in the challenge.
Results overview
The Tough Tables dataset adds considerable complexity, as Table 2 below shows.
See the summary results in the SemTab 2020 slides presented at the ISWC conference.
Task | Round 1 | Round 2 | Round 3 | Round 4 (AG) | Round 4 (2T)
CEA | 0.93 | 0.95 | 0.94 | 0.92 | 0.54
CTA | 0.83 | 0.93 | 0.94 | 0.92 | 0.59
CPA | 0.93 | 0.97 | 0.93 | 0.96 | -
Table 2: Average F1-score of top-10 systems (discarding outliers).
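The scores in the result tables are computed over the annotation targets of each task: precision is the fraction of submitted annotations that are correct, recall is the fraction of target annotations that receive a correct annotation, and F1 is their harmonic mean. The sketch below illustrates that computation under these standard definitions; it is not the official SemTab evaluator, and details such as partial credit for ancestor classes in CTA are omitted.

```python
def score(submitted, ground_truth):
    """Micro-averaged precision, recall and F1 over annotation targets.

    `submitted` and `ground_truth` map a target, e.g. (table_id, row, col),
    to an annotation IRI. This is a sketch of the standard definitions only.
    """
    correct = sum(1 for target, iri in submitted.items()
                  if ground_truth.get(target) == iri)
    precision = correct / len(submitted) if submitted else 0.0
    recall = correct / len(ground_truth) if ground_truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1
```

Under these definitions, a system that skips difficult targets keeps its precision high while its recall, and therefore its F1, drops; this is why the precision column is typically equal to or higher than the F1 column in the tables below.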
Round 4 - 2T Results
CEA Task
System | F1-score | Precision
MTab4Wikidata | 0.907 | 0.907
bbw | 0.863 | 0.927
LinkingPark | 0.810 | 0.811
LexMa | 0.587 | 0.795
Team_DAGOBAH | 0.412 | 0.749
Unimib/MantisTable | 0.400 | 0.804
JenTab | 0.374 | 0.541
AMALGAM | 0.323 | 0.553
SSL | 0.198 | 0.198
Table 3: CEA (Round 4 - 2T) results.
CTA Task
System | F1-score | Precision
MTab4Wikidata | 0.728 | 0.730
Team_DAGOBAH | 0.718 | 0.747
LinkingPark | 0.686 | 0.687
JenTab | 0.624 | 0.669
AMALGAM | 0.606 | 0.608
bbw | 0.516 | 0.789
Unimib/MantisTable | 0.474 | 0.639
SSL | 0.363 | 0.363
Table 4: CTA (Round 4 - 2T) results.
Round 4 - AG Results
CEA Task
System | F1-score | Precision
MTab4Wikidata | 0.993 | 0.993
LinkingPark | 0.985 | 0.985
Team_DAGOBAH | 0.984 | 0.985
bbw | 0.978 | 0.984
JenTab | 0.973 | 0.975
AMALGAM | 0.892 | 0.914
LexMa | 0.845 | 0.911
SSL | 0.833 | 0.833
Unimib/MantisTable | 0.812 | 0.985
Table 5: CEA (Round 4) results.
CTA Task
System | F1-score | Precision
MTab4Wikidata | 0.981 | 0.982
bbw | 0.980 | 0.980
Team_DAGOBAH | 0.972 | 0.972
LinkingPark | 0.953 | 0.953
SSL | 0.946 | 0.946
JenTab | 0.930 | 0.930
AMALGAM | 0.858 | 0.861
Unimib/MantisTable | 0.725 | 0.989
Kepler-aSI | 0.253 | 0.676
Table 6: CTA (Round 4) results.
CPA Task
System | F1-score | Precision
MTab4Wikidata | 0.997 | 0.997
bbw | 0.995 | 0.996
Team_DAGOBAH | 0.995 | 0.995
JenTab | 0.994 | 0.994
LinkingPark | 0.985 | 0.988
SSL | 0.924 | 0.924
Unimib/MantisTable | 0.803 | 0.988
Table 7: CPA (Round 4) results.
Round 3 Results
CEA Task
System | F1-score | Precision
MTab4Wikidata | 0.991 | 0.992
LinkingPark | 0.986 | 0.986
Team_DAGOBAH | 0.985 | 0.985
Unimib/MantisTable | 0.974 | 0.979
bbw | 0.954 | 0.974
JenTab | 0.935 | 0.952
SSL | 0.906 | 0.906
AMALGAM | 0.877 | 0.892
LexMa | 0.863 | 0.907
Table 8: CEA (Round 3) results.
CTA Task
System | F1-score | Precision
LinkingPark | 0.978 | 0.979
MTab4Wikidata | 0.976 | 0.976
Team_DAGOBAH | 0.974 | 0.974
bbw | 0.960 | 0.966
Unimib/MantisTable | 0.958 | 0.965
SSL | 0.913 | 0.913
AMALGAM | 0.869 | 0.873
JenTab | 0.859 | 0.863
Kepler-aSI | 0.275 | 0.701
W_B_A_95 | 0.273 | 0.701
baha | 0.252 | 0.705
C8T8A | 0.244 | 0.708
wael | 0.212 | 0.692
W_B_2020 | 0.208 | 0.693
ZM889 | 0.202 | 0.696
B_w_2020 | 0.192 | 0.698
Table 9: CTA (Round 3) results.
CPA Task
System | F1-score | Precision
MTab4Wikidata | 0.995 | 0.995
Team_DAGOBAH | 0.993 | 0.994
LinkingPark | 0.985 | 0.988
bbw | 0.949 | 0.957
Unimib/MantisTable | 0.941 | 0.957
JenTab | 0.917 | 0.928
TeamTR | 0.837 | 0.931
SSL | 0.815 | 0.815
Table 10: CPA (Round 3) results.
Round 2 Results
CEA Task
10 systems produced results in the CEA task.
System | F1-score | Precision
MTab4Wikidata | 0.995 | 0.995
Team_DAGOBAH | 0.993 | 0.993
LinkingPark | 0.993 | 0.993
Unimib/MantisTable | 0.991 | 0.993
SSL | 0.961 | 0.961
JenTab | 0.953 | 0.969
AMALGAM | 0.921 | 0.927
tacko | 0.920 | 0.934
LexMa | 0.915 | 0.927
bbw | 0.865 | 0.865
Table 11: CEA (Round 2) results.
CTA Task
13 systems produced results in the CTA task.
System | F1-score | Precision
LinkingPark | 0.984 | 0.985
MTab4Wikidata | 0.984 | 0.984
Team_DAGOBAH | 0.983 | 0.983
Unimib/MantisTable | 0.966 | 0.973
SSL | 0.966 | 0.966
AMALGAM | 0.926 | 0.928
bbw | 0.914 | 0.929
JenTab | 0.904 | 0.906
tacko | 0.857 | 0.916
ahmad_alobaid | 0.834 | 0.834
Kepler-aSI | 0.295 | 0.784
baha | 0.294 | 0.783
wael | 0.281 | 0.781
Table 12: CTA (Round 2) results.
CPA Task
11 systems produced results in the CPA task.
System | F1-score | Precision
MTab4Wikidata | 0.997 | 0.997
LinkingPark | 0.993 | 0.994
Team_DAGOBAH | 0.992 | 0.994
bbw | 0.991 | 0.992
SSL | 0.973 | 0.973
Unimib/MantisTable | 0.961 | 0.966
JenTab | 0.955 | 0.965
magwari | 0.916 | 0.935
tacko | 0.892 | 0.946
TeamTR | 0.873 | 0.932
ahmad_alobaid | 0.611 | 0.611
Table 13: CPA (Round 2) results.
Round 1 Results
CEA Task
10 systems produced results in the CEA task.
System | F1-score | Precision
MTab4Wikidata | 0.987 | 0.988
LinkingPark | 0.987 | 0.988
Unimib/MantisTable (*) | 0.982 | 0.989
tacko | 0.954 | 0.962
SSL | 0.936 | 0.936
Team_DAGOBAH | 0.922 | 0.944
AMALGAM | 0.913 | 0.914
LexMa | 0.909 | 0.913
Random/chalerislin | 0.865 | 0.865
JenTab | 0.828 | 0.906
Table 14: CEA (Round 1) results. (*) Submission after the deadline.
CTA Task
15 systems produced results in the CTA task.
System | F1-score | Precision
LinkingPark | 0.926 | 0.926
MTab4Wikidata | 0.885 | 0.884
tacko | 0.875 | 0.891
SSL | 0.861 | 0.860
fresh2020 | 0.858 | 0.858
Team_DAGOBAH | 0.834 | 0.854
Random/chalerislin | 0.815 | 0.815
SSLteam5 | 0.768 | 0.768
Unimib/MantisTable | 0.746 | 0.753
Team_4 | 0.739 | 0.739
MapleSyrup | 0.739 | 0.739
AMALGAM | 0.724 | 0.717
mitloehn | 0.643 | 0.643
LexMa | 0.638 | 0.734
JenTab | 0.574 | 0.626
Table 15: CTA (Round 1) results.
CPA Task
9 systems produced results in the CPA task.
System | F1-score | Precision
MTab4Wikidata | 0.971 | 0.991
LinkingPark | 0.967 | 0.978
magwari | 0.959 | 0.960
SSL | 0.943 | 0.943
tacko | 0.918 | 0.932
TeamTR | 0.916 | 0.916
Team_DAGOBAH | 0.914 | 0.962
Unimib/MantisTable (*) | 0.888 | 0.942
SAAS_ANU | 0.120 | 0.120
Table 16: CPA (Round 1) results. (*) Submission after the deadline.
Acknowledgements
The challenge is currently supported by the SIRIUS Centre for Research-driven Innovation and IBM Research.