
Version 4 (modified by spillman, 18 years ago)

Ontology Matching Using Graph Matching: Results

The following table shows the precision and recall values for the Benchmark tests, as well as the fallout, the F-measure and the overall measure. The algorithm applied is the MaxComSubgraphAlgorithm with a VertexTypeEquality measure (two vertices are considered equal only if they have the same type: both concepts, both relations, or both attributes) and a longest common subsequence (LCS) similarity measure for the vertices' labels. The two measures are combined conjunctively.
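The conjunctive vertex comparison described above can be sketched as follows. This is an illustrative reconstruction, not the project's actual code: the function names, the normalization of the LCS score by the longer label, and the 0.5 threshold are all assumptions.

```python
def lcs_length(a: str, b: str) -> int:
    # Classic dynamic-programming longest common subsequence (case-sensitive).
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if ca == cb else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def label_similarity(a: str, b: str) -> float:
    # LCS length normalized by the longer label, giving a score in [0, 1].
    # (The normalization is an assumption; the page only names an LCS measure.)
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))

def vertices_match(type_a: str, type_b: str, label_a: str, label_b: str,
                   threshold: float = 0.5) -> bool:
    # Conjunction of the two measures: same vertex type (concept, relation
    # or attribute) AND sufficiently similar labels. The threshold value
    # is hypothetical.
    return type_a == type_b and label_similarity(label_a, label_b) >= threshold
```

For example, `vertices_match("concept", "concept", "hasName", "Name")` holds because both vertices are concepts and "Name" is a common subsequence of "hasName", while any pair of differently typed vertices is rejected regardless of label similarity.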

Each of the four configurations (mxcs_0_0int, mxcs_0_1int, mxcs_0_2int, mxcs_0_3int) reports five values per test, in this order: Precision (Prec.), Recall (Rec.), Fallout (Fall.), F-Measure (FMeas.) and Overall (Over.). NaN and n/a entries mark measures that could not be computed for that test.

algo  mxcs_0_0int                          mxcs_0_1int                          mxcs_0_2int                          mxcs_0_3int
test  Prec. Rec. Fall. FMeas. Over.        Prec. Rec. Fall. FMeas. Over.        Prec. Rec. Fall. FMeas. Over.        Prec. Rec. Fall. FMeas. Over.
101 1.00 0.99 0.00 0.99 0.99 1.00 0.99 0.00 0.99 0.99 0.99 0.98 0.01 0.98 0.97 0.96 0.95 0.04 0.95 0.91
102 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 0.00 NaN 1.00 NaN NaN
103 1.00 0.93 0.00 0.96 0.93 1.00 0.94 0.00 0.97 0.94 0.99 0.93 0.01 0.96 0.92 0.94 0.88 0.06 0.91 0.82
104 1.00 0.92 0.00 0.96 0.92 1.00 0.92 0.00 0.96 0.92 0.99 0.91 0.01 0.95 0.90 n/a n/a 1.00 0.76 0.00
201 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
202 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
203 1.00 1.00 0.00 1.00 1.00 1.00 1.00 0.00 1.00 1.00 0.99 0.98 0.01 0.98 0.97 0.96 0.95 0.04 0.95 0.91
204 1.00 0.63 0.00 0.77 0.63 1.00 0.64 0.00 0.78 0.64 0.99 0.77 0.01 0.87 0.76 0.96 0.78 0.04 0.86 0.75
205 1.00 0.16 0.00 0.28 0.16 1.00 0.18 0.00 0.30 0.18 0.90 0.20 0.10 0.32 0.18 0.71 0.21 0.29 0.32 0.12
206 1.00 0.20 0.00 0.33 0.20 1.00 0.23 0.00 0.37 0.23 0.97 0.39 0.03 0.56 0.38 0.93 0.44 0.07 0.60 0.41
207 1.00 0.20 0.00 0.33 0.20 1.00 0.23 0.00 0.37 0.23 0.97 0.39 0.03 0.56 0.38 0.93 0.44 0.07 0.60 0.41
208 1.00 0.62 0.00 0.76 0.62 1.00 0.64 0.00 0.78 0.64 0.99 0.76 0.01 0.86 0.75 0.96 0.77 0.04 0.86 0.74
209 1.00 0.16 0.00 0.28 0.16 1.00 0.18 0.00 0.30 0.18 0.90 0.20 0.10 0.32 0.18 0.71 0.21 0.29 0.32 0.12
210 1.00 0.16 0.00 0.28 0.16 1.00 0.16 0.00 0.28 0.16 1.00 0.16 0.00 0.28 0.16 1.00 0.16 0.00 0.28 0.16
221 1.00 0.92 0.00 0.96 0.92 0.99 0.90 0.01 0.94 0.89 0.98 0.89 0.02 0.93 0.87 n/a n/a 1.00 0.77 0.00
222 1.00 0.97 0.00 0.98 0.97 0.99 0.95 0.01 0.97 0.94 0.98 0.94 0.02 0.96 0.91 n/a n/a 1.00 0.81 0.00
223 1.00 0.94 0.00 0.97 0.94 0.99 0.95 0.01 0.97 0.94 0.98 0.93 0.02 0.95 0.91 n/a n/a 0.97 0.79 0.03
224 1.00 0.99 0.00 0.99 0.99 1.00 1.00 0.00 1.00 1.00 0.99 0.99 0.01 0.99 0.98 0.96 0.95 0.04 0.95 0.91
225 1.00 0.99 0.00 0.99 0.99 1.00 0.99 0.00 0.99 0.99 0.99 0.99 0.01 0.99 0.98 0.96 0.96 0.04 0.96 0.92
228 1.00 1.00 0.00 1.00 1.00 0.94 0.88 0.06 0.91 0.82 0.94 0.88 0.06 0.91 0.82 0.90 0.85 0.10 0.88 0.76
230 1.00 0.97 0.00 0.99 0.97 0.97 0.94 0.03 0.96 0.92 0.96 0.93 0.04 0.94 0.89 0.89 0.86 0.11 0.87 0.75
231 1.00 0.99 0.00 0.99 0.99 1.00 0.99 0.00 0.99 0.99 0.99 0.98 0.01 0.98 0.97 0.96 0.96 0.04 0.96 0.92
232 1.00 0.92 0.00 0.96 0.92 0.99 0.90 0.01 0.94 0.89 0.98 0.89 0.02 0.93 0.87 n/a n/a 1.00 0.77 0.00
233 1.00 0.79 0.00 0.88 0.79 0.92 0.73 0.08 0.81 0.67 0.92 0.73 0.08 0.81 0.67 0.85 0.70 0.15 0.77 0.58
236 1.00 1.00 0.00 1.00 1.00 0.94 0.88 0.06 0.91 0.82 0.94 0.88 0.06 0.91 0.82 0.90 0.85 0.10 0.88 0.76
237 1.00 0.97 0.00 0.98 0.97 0.99 0.96 0.01 0.97 0.95 0.98 0.95 0.02 0.96 0.92 n/a n/a 1.00 0.80 0.00
238 1.00 0.95 0.00 0.97 0.95 0.99 0.94 0.01 0.96 0.93 0.98 0.94 0.02 0.96 0.92 n/a n/a 0.97 0.78 0.03
239 0.97 0.97 0.03 0.97 0.93 0.90 0.90 0.10 0.90 0.79 0.90 0.90 0.10 0.90 0.79 0.83 0.83 0.17 0.83 0.66
240 0.97 0.97 0.03 0.97 0.94 0.90 0.85 0.10 0.88 0.76 0.90 0.85 0.10 0.88 0.76 0.71 0.67 0.29 0.69 0.39
241 1.00 0.79 0.00 0.88 0.79 0.92 0.73 0.08 0.81 0.67 0.92 0.73 0.08 0.81 0.67 0.85 0.70 0.15 0.77 0.58
246 0.97 0.97 0.03 0.97 0.93 0.90 0.90 0.10 0.90 0.79 0.90 0.90 0.10 0.90 0.79 0.83 0.83 0.17 0.83 0.66
247 0.97 0.97 0.03 0.97 0.94 0.90 0.85 0.10 0.88 0.76 0.90 0.85 0.10 0.88 0.76 0.71 0.67 0.29 0.69 0.39
248 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
249 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
250 NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN
251 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
252 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
253 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
254 NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN
257 NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN
258 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
259 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 1.00 0.01 0.00 0.02 0.01 0.50 0.01 0.50 0.02 0.00
260 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN
261 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN
262 NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN NaN 0.00 NaN NaN NaN
265 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN
266 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN 0.00 0.00 1.00 NaN NaN
301 0.89 0.13 0.11 0.23 0.11 0.83 0.16 0.17 0.27 0.13 0.82 0.23 0.18 0.36 0.18 0.71 0.33 0.29 0.45 0.20
302 1.00 0.52 0.00 0.68 0.52 1.00 0.52 0.00 0.68 0.52 1.00 0.52 0.00 0.68 0.52 1.00 0.52 0.00 0.68 0.52
303 0.94 0.69 0.06 0.81 0.65 0.92 0.69 0.08 0.79 0.63 0.94 0.69 0.06 0.81 0.65 0.92 0.69 0.08 0.79 0.63
304 1.00 0.55 0.00 0.71 0.55 0.93 0.51 0.07 0.66 0.47 0.93 0.53 0.07 0.67 0.49 n/a n/a 0.96 0.68 0.04
H-mean 0.99 0.51 0.01 0.67 0.50 0.98 0.50 0.02 0.66 0.49 0.97 0.52 0.03 0.67 0.50 0.90 0.33 0.10 0.48 0.29
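Assuming the standard alignment-evaluation definitions, the five reported measures can be computed from the counts of true positives (found and correct correspondences), false positives and false negatives. Fallout is taken here as the false-positive share of the returned correspondences (consistent with the table, where Fall. always equals 1 - Prec.), and Overall as recall * (2 - 1/precision); the tp/fp/fn counts in the usage line are illustrative, since the table does not list the raw counts.

```python
def matching_metrics(tp: int, fp: int, fn: int):
    # Precision, recall, fallout, F-measure and overall from raw counts.
    # NaN is returned where a measure is undefined (division by zero),
    # matching the NaN entries in the table.
    nan = float("nan")
    precision = tp / (tp + fp) if tp + fp else nan
    recall = tp / (tp + fn) if tp + fn else nan
    fallout = fp / (tp + fp) if tp + fp else nan   # equals 1 - precision
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else nan)
    overall = recall * (2 - 1 / precision) if precision else nan
    return precision, recall, fallout, f_measure, overall
```

For instance, `matching_metrics(93, 0, 7)` yields precision 1.00, recall 0.93, fallout 0.00, F-measure 0.96 and overall 0.93, reproducing the first five columns of test 103.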
