News
- Oct 31st, 2017: The results are now available.
- Oct 30th, 2017: The solutions are now available.
- Sept 8th, 2017: An update for Challenge 15 is available, but it will not count in the evaluation.
- Sept 4th, 2017: Updated mailing list and submission information.
- Aug 23rd, 2017: The preliminary results have been sent out to participants and are now available.
- July 9th, 2017: We fixed the intensities in the TSV archive for challenges 046-243.
- June 22nd, 2017: We added Category 4 on a subset of the data files.
- May 22nd, 2017: We have improved challenges 29, 42, 71, 89, 105, 106 and 144.
- April 26th, 2017: The rules and challenges of CASMI 2017 are now public!
- Jan 20th, 2017: Organisation of CASMI 2017 is underway, stay tuned!
Results in Category 4
Summary of participant performance
| Participant | F1 score | Mean rank | Median rank | Top | Top3 | Top10 | Misses | TopPos | TopNeg | Mean RRP | Median RRP | N |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| kai_iso | 2306 | 658.28 | 5.0 | 66 | 91 | 119 | 0 | 46 | 20 | 0.913 | 0.999 | 198 |
| kai112 | 2117 | 498.20 | 6.0 | 55 | 85 | 116 | 0 | 38 | 17 | 0.939 | 0.999 | 198 |
| IOKRtransferAvgScore | 1402 | 2740.56 | 201.0 | 36 | 59 | 73 | 0 | 20 | 16 | 0.681 | 0.955 | 198 |
| IOKR_TanimotoGaussian | 1212 | 3691.67 | 393.0 | 33 | 50 | 61 | 0 | 21 | 12 | 0.613 | 0.844 | 198 |
| IOKRtransfer | 1172 | 3733.52 | 396.0 | 31 | 48 | 60 | 0 | 19 | 12 | 0.609 | 0.832 | 198 |
| yuanyuesimple | 718 | 192.59 | 31.5 | 10 | 23 | 58 | 0 | 5 | 5 | 0.973 | 0.990 | 198 |
| yuanyuesqrt | 681 | 160.84 | 33.5 | 10 | 24 | 53 | 0 | 7 | 3 | 0.974 | 0.990 | 198 |
| metfrag_plus | 634 | 1750.18 | 142.0 | 6 | 21 | 51 | 0 | 2 | 4 | 0.811 | 0.966 | 198 |
| metfrag | 523 | 749.61 | 93.2 | 7 | 15 | 42 | 0 | 4 | 3 | 0.914 | 0.980 | 198 |
| yuanyuelogsum | 305 | 1248.99 | 253.2 | 5 | 12 | 22 | 0 | 4 | 1 | 0.833 | 0.956 | 198 |
| Rakesh | 301 | 607.01 | 135.5 | 0 | 2 | 19 | 2 | 0 | 0 | 0.904 | 0.962 | 198 |
Table legend:
- F1 score
- The Formula 1 score awards points for each challenge based on the rank of the correct solution, following a scheme similar to the points table in F1 racing. In the participant table, these points are summed over all challenges. Please note that the F1 score is therefore not necessarily comparable across categories.
- Mean/Median rank
- Mean and median rank of the correct solution. For ranks tied with other candidates, the average rank of the tied group is used (illustrated in the sketch following this legend).
- Top, Top3, Top10
- Number of challenges where the correct solution is ranked first, within the top 3, or within the top 10, respectively.
- Misses
- Number of challenges where the correct solution is missing from the submission.
- TopPos, TopNeg
- Number of challenges where the correct solution is ranked first, split into positive and negative ionization mode.
- Mean/Median RRP
- The relative ranking position, which also takes the length of the candidate list into account.
- N
- Number of submissions that have passed the evaluation scripts.
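As an illustration of how these summary statistics can be computed from a single submission, here is a minimal Python sketch. It assumes the post-2010 Formula 1 points table and a relative ranking position where 1.0 means "ranked first"; the exact point scheme and RRP definition used by the CASMI evaluation scripts may differ.

```python
# Minimal sketch of the summary metrics above (assumptions, not the official
# CASMI evaluation scripts): F1 points follow the post-2010 Formula 1 table,
# and RRP is taken as 1.0 for "ranked first" and 0.0 for "ranked last".

F1_POINTS = {1: 25, 2: 18, 3: 15, 4: 12, 5: 10, 6: 8, 7: 6, 8: 4, 9: 2, 10: 1}

def rank_of_correct(scores, correct_index):
    """Rank of the correct candidate; tied candidates share the average rank."""
    correct_score = scores[correct_index]
    better = sum(1 for s in scores if s > correct_score)
    tied = sum(1 for s in scores if s == correct_score)  # includes the correct one
    return better + (tied + 1) / 2.0

def rrp(rank, n_candidates):
    """Assumed relative ranking position: 1.0 if ranked first, 0.0 if ranked last."""
    return 1.0 if n_candidates == 1 else 1.0 - (rank - 1.0) / (n_candidates - 1.0)

def summarize(challenges):
    """challenges: list of (candidate_scores, index_of_correct_candidate)."""
    ranks = [rank_of_correct(scores, idx) for scores, idx in challenges]
    sizes = [len(scores) for scores, _ in challenges]
    return {
        # Tied (non-integer) ranks are skipped here for simplicity.
        "F1 score": sum(F1_POINTS.get(int(r), 0) for r in ranks if r == int(r)),
        "Mean rank": sum(ranks) / len(ranks),
        "Top": sum(r == 1 for r in ranks),
        "Top3": sum(r <= 3 for r in ranks),
        "Top10": sum(r <= 10 for r in ranks),
        "Mean RRP": sum(rrp(r, n) for r, n in zip(ranks, sizes)) / len(ranks),
    }

# Example with three mock challenges (scores per candidate, index of the correct one).
challenges = [([0.9, 0.5, 0.1], 0), ([0.3, 0.8, 0.8], 1), ([0.2, 0.2, 0.7], 0)]
print(summarize(challenges))
```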
Summary of Rank by Challenge
For each challenge, the lowest rank among participants is highlighted in bold. If the submission did not contain the correct candidate, this is denoted as "-". If a participant did not submit for a challenge, the table cell is empty.
Category 4:
| Challenge | IOKR_TanimotoGaussian | IOKRtransfer | IOKRtransferAvgScore | kai_iso | kai112 | metfrag_plus | metfrag | Rakesh | yuanyuelogsum | yuanyuesimple | yuanyuesqrt |
|---|---|---|---|---|---|---|---|---|---|---|---|
| challenge-046 | 1620.0 | 1616.0 | 1633.0 | 950.0 | 1231.0 | 333.0 | 347.0 | 816.0 | 907.0 | 6.0 | 4.0 |
| challenge-047 | 387.0 | 393.0 | 383.0 | 10.0 | 7.0 | 130.5 | 187.5 | 10.0 | 322.0 | 8.0 | 12.0 |
| challenge-048 | 6626.0 | 6615.0 | 544.0 | 1316.0 | 3448.0 | 1325.5 | 1370.0 | 736.5 | 155.0 | 135.0 | 97.0 |
| challenge-049 | 1769.5 | 1730.5 | 520.5 | 642.5 | 317.5 | 1662.5 | 2426.0 | 760.0 | 851.0 | 344.0 | 681.0 |
| challenge-050 | 2.0 | 2.0 | 2.0 | 2.0 | 3.0 | 15.0 | 6.0 | 77.5 | 32.0 | 12.0 | 10.0 |
| challenge-051 | 2.0 | 6.0 | 1.0 | 23.0 | 37.0 | 17.5 | 17.5 | 5.5 | 96.0 | 5.0 | 6.0 |
| challenge-052 | 279.0 | 287.0 | 1304.0 | 99.0 | 25.0 | 228.0 | 521.5 | 674.0 | 2382.0 | 326.0 | 311.0 |
| challenge-053 | 1.0 | 1.0 | 2.0 | 1659.5 | 1.0 | 15.5 | 5.5 | 8.0 | 38.0 | 10.0 | 8.0 |
| challenge-054 | 8690.0 | 8258.0 | 11290.0 | 480.0 | 510.0 | 9.0 | 6.0 | 18.5 | 1896.0 | 36.0 | 53.0 |
| challenge-055 | 4.0 | 3.0 | 1.0 | 1.0 | 136.0 | 165.0 | 78.0 | 4.5 | 66.0 | 8.0 | 8.0 |
| challenge-056 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 608.0 | 224.0 | 1118.5 | 186.0 | 247.0 | 215.0 |
| challenge-057 | 15798.0 | 15807.0 | 15756.0 | 10074.0 | 403.0 | 7.0 | 6.0 | 12.0 | 8153.0 | 3318.0 | 117.0 |
| challenge-058 | 14.0 | 19.0 | 637.0 | 100.0 | 119.0 | 373.0 | 572.0 | 1712.0 | 709.0 | 143.0 | 114.0 |
| challenge-059 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 7.5 | 6.5 | 62.5 | 1.5 | 1.5 | 1.5 |
| challenge-060 | 11130.0 | 11388.0 | 8759.0 | 989.0 | 6749.0 | 1349.5 | 1323.5 | 2250.0 | 2786.0 | 510.0 | 391.0 |
| challenge-061 | 1053.0 | 1216.0 | 3166.0 | 65.0 | 32.0 | 7.5 | 192.0 | 298.5 | 1104.0 | 79.0 | 112.0 |
| challenge-062 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 137.0 | 223.0 | 1078.5 | 86.0 | 191.0 | 192.0 |
| challenge-063 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 8302.5 | 9232.5 | 136.5 | 388.0 | 113.0 | 137.0 |
| challenge-064 | 11184.0 | 11212.0 | 11237.0 | 5.0 | 10.0 | 6892.5 | 9354.5 | 39.5 | 11063.5 | 1.0 | 1.0 |
| challenge-065 | 2032.0 | 2055.0 | 1897.0 | 32.0 | 37.0 | 5.0 | 9.0 | 13.0 | 252.0 | 15.0 | 14.0 |
| challenge-066 | 11104.0 | 11332.0 | 11332.0 | 6553.5 | 34.0 | 2.0 | 29.0 | 648.5 | 89.0 | 28.0 | 33.0 |
| challenge-067 | 2669.0 | 2742.0 | 1168.0 | 213.0 | 114.0 | 1077.0 | 1045.0 | 760.0 | 2831.0 | 1193.0 | 1163.0 |
| challenge-068 | 9.5 | 8.5 | 8.5 | 6.5 | 90.5 | 1.5 | 17.5 | 29.5 | 2.0 | 4.0 | 3.0 |
| challenge-069 | 12.0 | 12.0 | 307.0 | 56.0 | 133.0 | 61.0 | 38.0 | 41.0 | 39.0 | 11.0 | 12.0 |
| challenge-070 | 10720.0 | 10879.0 | 11445.0 | 2841.0 | 5937.0 | 12036.0 | 7087.5 | 629.0 | 5955.0 | 3088.0 | 2793.0 |
| challenge-071 | 8802.0 | 8424.0 | 34.0 | 4.0 | 7.0 | 12.0 | 39.0 | 41.5 | 619.0 | 25.0 | 28.0 |
| challenge-072 | 15495.0 | 15365.0 | 15585.0 | 529.0 | 749.0 | 33.0 | 35.5 | 40.5 | 329.0 | 25.0 | 21.0 |
| challenge-073 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 4.0 | 15.0 | 11.5 | 1146.0 | 25.0 | 32.0 |
| challenge-074 | 4.0 | 4.0 | 3.0 | 4.0 | 5.0 | 67.0 | 164.0 | 15.0 | 2.0 | 11.5 | 13.0 |
| challenge-075 | 513.0 | 582.0 | 1.0 | 99.0 | 202.0 | 1786.0 | 1786.0 | 1245.0 | 35.0 | 47.0 | 64.0 |
| challenge-076 | 15778.0 | 15739.0 | 15942.0 | 9417.0 | 192.0 | 74.0 | 1236.0 | 24.5 | 3960.0 | 3.0 | 2.0 |
| challenge-077 | 7149.0 | 7223.0 | 7151.0 | 1608.0 | 5616.0 | 209.5 | 208.0 | 820.0 | 883.0 | 366.0 | 121.0 |
| challenge-078 | 12822.0 | 12908.0 | 5070.0 | 575.0 | 7599.0 | 506.5 | 548.5 | 877.5 | 2210.0 | 213.0 | 182.0 |
| challenge-079 | 674.0 | 623.0 | 593.0 | 1043.5 | 3.0 | 10.0 | 10.0 | 70.0 | 3.0 | 4.0 | 3.0 |
| challenge-080 | 1.0 | 1.0 | 6635.0 | 1.0 | 1.0 | 141.0 | 56.0 | 88.0 | 287.0 | 167.0 | 118.0 |
| challenge-081 | 1.0 | 1.0 | 1.0 | 2.0 | 1.0 | 14.0 | 15.0 | 3.0 | 40.0 | 25.0 | 10.0 |
| challenge-082 | 39.0 | 36.0 | 14.0 | 19.0 | 18.0 | 2113.5 | 237.5 | 104.0 | 36.0 | 7.0 | 10.0 |
| challenge-083 | 467.5 | 521.5 | 1.5 | 2.5 | 2.5 | 13.0 | 52.5 | 96.0 | 9.0 | 2.0 | 2.0 |
| challenge-084 | 388.0 | 394.0 | 302.0 | 1012.0 | 4376.0 | 230.5 | 167.5 | 158.0 | 184.0 | 76.0 | 136.0 |
| challenge-085 | 9221.0 | 9219.0 | 9090.0 | 89.0 | 73.0 | 4.0 | 2.0 | 41.5 | 1.0 | 13.0 | 7.0 |
| challenge-086 | 25067.0 | 25831.0 | 196.0 | 556.0 | 233.0 | 51.0 | 131.0 | 926.0 | 64.0 | 50.0 | 61.0 |
| challenge-087 | 1.0 | 1.0 | 1.0 | 2.0 | 26.0 | 5929.0 | 8341.0 | 967.0 | 2070.0 | 4.0 | 4.0 |
| challenge-088 | 3498.5 | 3500.5 | 3524.5 | 35.5 | 113.5 | 563.0 | 563.0 | 77.5 | 3281.0 | 1.5 | 1.5 |
| challenge-089 | 4519.0 | 4549.0 | 4220.0 | 1.0 | 1.0 | 83.0 | 10.0 | 18.0 | 120.0 | 20.0 | 24.0 |
| challenge-090 | 240.0 | 249.0 | 198.0 | 71.0 | 676.0 | 178.5 | 75.5 | 68.5 | 20.0 | 25.0 | 30.0 |
| challenge-091 | 686.0 | 677.0 | 699.0 | 1.0 | 22.0 | 449.5 | 449.5 | 31.5 | 33.0 | 3.0 | 3.0 |
| challenge-092 | 345.0 | 327.0 | 25.0 | 5.0 | 1.0 | 5.0 | 7.0 | 35.0 | 252.5 | 10.0 | 5.0 |
| challenge-093 | 3729.0 | 3735.0 | 3727.0 | 1.0 | 1.0 | 13.0 | 43.0 | 38.5 | 265.0 | 197.0 | 116.0 |
| challenge-094 | 11794.0 | 12764.0 | 6.0 | 21.0 | 21.0 | 317.0 | 1605.0 | 238.0 | 7063.0 | 25.0 | 29.0 |
| challenge-095 | 9459.0 | 9427.0 | 9209.0 | 1146.0 | 28.0 | 3.5 | 11.5 | 288.5 | 172.0 | 383.0 | 365.0 |
| challenge-096 | 212.0 | 104.0 | 5179.0 | 866.0 | 13073.0 | 2180.0 | 2342.0 | 3015.0 | 309.0 | 701.0 | 616.0 |
| challenge-097 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 2.0 | 3.0 | 35.5 | 283.0 | 19.0 | 32.0 |
| challenge-098 | 3055.0 | 3066.0 | 3028.0 | 1.0 | 4.0 | 2271.0 | 1589.0 | 31.5 | 1577.0 | 117.0 | 148.0 |
| challenge-099 | 10071.0 | 10075.0 | 10043.0 | 6480.0 | 206.0 | 9.0 | 21.0 | 35.0 | 7091.0 | 1.0 | 2.0 |
| challenge-100 | 7160.0 | 7238.0 | 7587.0 | 1349.0 | 1936.0 | 653.0 | 869.0 | 1118.5 | 341.0 | 530.0 | 471.0 |
| challenge-101 | 12846.0 | 12958.0 | 12441.0 | 8312.5 | 12.0 | 921.0 | 714.5 | 241.5 | 1791.0 | 1.0 | 1.0 |
| challenge-102 | 90.0 | 46.0 | 4940.0 | 5.5 | 12.0 | 92.0 | 270.5 | 51.0 | 68.0 | 11.0 | 19.0 |
| challenge-103 | 1.0 | 1.0 | 1.0 | 14.0 | 2.0 | 7.5 | 18.5 | 282.0 | 358.0 | 284.0 | 461.0 |
| challenge-104 | 4345.0 | 4270.0 | 43.0 | 41.0 | 111.0 | 273.0 | 18.5 | 534.0 | 598.0 | 493.0 | 568.0 |
| challenge-105 | 58.0 | 54.0 | 34.0 | 4.0 | 2.0 | 23.5 | 108.0 | 49.0 | 23.0 | 11.0 | 11.0 |
| challenge-106 | 1816.0 | 1279.0 | 8302.0 | 8.0 | 8.0 | 2.5 | 3.5 | 12.5 | 13.0 | 1.0 | 1.0 |
| challenge-107 | 5639.0 | 5629.0 | 5532.0 | 141.0 | 116.0 | 603.5 | 3142.5 | 62.0 | 4298.0 | 12.0 | 11.0 |
| challenge-108 | 417.0 | 882.0 | 4.0 | 3.0 | 2.0 | 1.5 | 287.5 | 331.5 | 89.0 | 33.0 | 31.0 |
| challenge-109 | 9252.0 | 9423.0 | 9287.0 | 89.0 | 120.0 | 24.0 | 26.0 | 249.5 | 1818.0 | 59.0 | 60.0 |
| challenge-110 | 6181.0 | 6253.0 | 6647.0 | 471.0 | 979.0 | 145.5 | 428.0 | 3115.0 | 1970.0 | 53.0 | 38.0 |
| challenge-111 | 3.0 | 3.0 | 2.0 | 1.0 | 32.0 | 192.0 | 567.5 | 360.0 | 106.0 | 125.0 | 148.0 |
| challenge-112 | 7377.0 | 7275.0 | 894.0 | 1088.0 | 7219.0 | 52.0 | 39.0 | 101.5 | 513.0 | 93.0 | 161.0 |
| challenge-113 | 398.0 | 398.0 | 378.0 | 5.0 | 5.0 | 14.0 | 19.0 | 15.5 | 9.5 | 14.0 | 9.0 |
| challenge-114 | 9694.0 | 9699.0 | 7389.0 | 1.0 | 1.0 | 1.0 | 44.0 | 21.0 | 17.0 | 1.0 | 2.0 |
| challenge-115 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 143.0 | 65.0 | - | 11.0 | 31.0 | 122.0 |
| challenge-116 | 16.0 | 17.0 | 1.0 | 325.5 | 2.0 | 9.0 | 6.0 | 15.5 | 8.0 | 7.0 | 6.0 |
| challenge-117 | 3759.0 | 3763.0 | 2671.0 | 5.0 | 4.0 | 1.0 | 1.0 | 38.5 | 54.0 | 168.0 | 153.0 |
| challenge-118 | 7017.0 | 7005.0 | 6439.0 | 4370.5 | 63.0 | 37.0 | 1487.0 | 20.5 | 41.0 | 7.0 | 9.0 |
| challenge-119 | 10479.0 | 10356.0 | 9665.0 | 1996.0 | 3148.0 | 9.0 | 1.0 | 479.0 | 2559.0 | 319.0 | 349.0 |
| challenge-120 | 10106.0 | 9869.0 | 1.0 | 1.0 | 1.0 | 232.5 | 23.5 | 318.5 | 156.0 | 47.0 | 52.0 |
| challenge-121 | 36.0 | 22.0 | 329.0 | 3.0 | 4.0 | 1.0 | 1.5 | 6.0 | 10.0 | 13.0 | 11.0 |
| challenge-122 | 5577.0 | 5945.0 | 878.0 | 1.0 | 1.0 | 3248.0 | 3292.0 | 44.0 | 222.0 | 28.0 | 37.0 |
| challenge-123 | 13335.0 | 13363.0 | 11629.0 | 2187.0 | 5892.0 | 6922.5 | 731.5 | 789.0 | 1357.0 | 226.0 | 257.0 |
| challenge-124 | 1390.0 | 1397.0 | 1357.0 | 1.0 | 3.0 | 1.5 | 29.5 | 6.5 | 4.0 | 6.0 | 9.0 |
| challenge-125 | 2559.5 | 2595.5 | 1915.5 | 14.5 | 7.5 | 55.5 | 76.5 | 47.0 | 60.5 | 36.0 | 17.0 |
| challenge-126 | 6.0 | 4.0 | 28.0 | 5.0 | 122.0 | 1.0 | 1.0 | 21.0 | 3.0 | 6.0 | 21.0 |
| challenge-127 | 272.0 | 157.0 | 551.0 | 6.0 | 14.0 | 4.0 | 5.0 | 7.5 | 2.0 | 2.0 | 2.0 |
| challenge-128 | 3.0 | 3.0 | 7539.0 | 1.0 | 4.0 | 9.0 | 11.5 | 71.5 | 52.0 | 193.0 | 119.0 |
| challenge-129 | 1843.0 | 1851.0 | 1.0 | 1.0 | 1.0 | 6.5 | 8.5 | 143.5 | 1493.0 | 6.0 | 24.0 |
| challenge-130 | 86.0 | 85.0 | 71.0 | 5.0 | 9.0 | 20.0 | 58.0 | 8.5 | 31.0 | 14.0 | 15.0 |
| challenge-131 | 386.0 | 386.0 | 357.0 | 5.0 | 8.0 | 14.5 | 18.5 | 10.0 | 87.0 | 10.0 | 12.0 |
| challenge-132 | 2304.0 | 2310.0 | 1345.0 | 201.0 | 358.0 | 161.5 | 161.5 | 331.0 | 1294.0 | 90.0 | 65.0 |
| challenge-133 | 8.0 | 7.0 | 4.0 | 1.0 | 1.0 | 11.5 | 5.5 | 10.0 | 26.0 | 12.0 | 13.0 |
| challenge-134 | 12400.0 | 12321.0 | 1877.0 | 9.0 | 5.0 | 187.0 | 604.5 | 736.5 | 86.0 | 103.0 | 88.0 |
| challenge-135 | 608.5 | 563.5 | 570.5 | 297.5 | 283.5 | 2749.0 | 629.0 | 760.0 | 1211.0 | 171.0 | 178.0 |
| challenge-136 | 1.0 | 1.0 | 1.0 | 1.0 | 3.0 | 92.0 | 7.5 | 1712.0 | 89.0 | 98.0 | 105.0 |
| challenge-137 | 1.5 | 1.5 | 6.5 | 1.5 | 1.5 | 26.0 | 24.5 | 29.5 | 116.0 | 18.0 | 26.0 |
| challenge-138 | 2.0 | 3.0 | 33.0 | 8.0 | 104.0 | 7829.0 | 117.0 | 820.0 | 62.0 | 90.0 | 135.0 |
| challenge-139 | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 220.5 | 41.5 | 66.5 | 13.0 | 95.0 | 19.0 |
| challenge-140 | 3139.0 | 3308.0 | 3.0 | 10454.0 | 36.0 | 466.0 | 2904.0 | 6137.5 | 2242.0 | 209.0 | 194.0 |
| challenge-141 | 1.0 | 1.0 | 1.0 | 1.0 | 27.0 | 7629.0 | 54.0 | 534.0 | 8.0 | 60.0 | 69.0 |
| challenge-142 | 3803.0 | 4111.0 | 413.0 | 1.0 | 1.0 | 1.0 | 552.0 | 21.0 | 3768.0 | 13.0 | 19.0 |
| challenge-143 | 5.0 | 8.0 | 65.0 | 6.0 | 32.0 | 4196.0 | 2264.0 | 1902.0 | 298.0 | 409.0 | 356.0 |
| challenge-144 | 2.0 | 3.0 | 31.0 | 18.0 | 17.0 | 762.0 | 248.5 | 605.5 | 1347.0 | 109.0 | 91.0 |
| challenge-145 | 1.0 | 1.0 | 29.0 | 1.0 | 2.0 | 4197.0 | 249.0 | 747.5 | 82.0 | 138.0 | 140.0 |
| challenge-146 | 11603.0 | 11883.0 | 12162.0 | 1.0 | 1.0 | 33.5 | 470.5 | 75.5 | 2665.0 | 5.0 | 5.0 |
| challenge-147 | 783.0 | 969.0 | 1.0 | 1039.0 | 3176.0 | 994.0 | 126.0 | 5298.0 | 133.0 | 7.0 | 9.0 |
| challenge-148 | 1.0 | 2.0 | 8.0 | 2.0 | 2.0 | 1.5 | 1016.5 | 6297.0 | 6906.0 | 27.0 | 17.0 |
| challenge-149 | 1.0 | 1.0 | 1.0 | 2.0 | 2.0 | 5.0 | 1.0 | 5.5 | 2.0 | 3.0 | 5.0 |
| challenge-150 | 851.0 | 1355.0 | 6.0 | 1.0 | 1.0 | 1090.5 | 5.5 | 8.0 | 283.0 | 8.0 | 8.0 |
| challenge-151 | 9844.0 | 9982.0 | 10365.0 | 5997.5 | 2.0 | 5541.0 | 95.0 | 18.5 | 8568.0 | 168.0 | 129.0 |
| challenge-152 | 60.0 | 58.0 | 63.0 | 3.0 | 2.0 | 3.0 | 2.0 | 4.5 | 1.0 | 1.0 | 1.0 |
| challenge-153 | 31.0 | 34.0 | 28.0 | 32.0 | 28.0 | 4641.0 | 387.0 | 1118.5 | 51.0 | 219.0 | 267.0 |
| challenge-154 | 8628.0 | 9312.0 | 10975.0 | 1.0 | 1.0 | 6749.0 | 40.0 | 13.0 | 36.0 | 12.0 | 17.0 |
| challenge-155 | 3.0 | 3.0 | 2.0 | 1.0 | 1.0 | 577.0 | 242.0 | 62.5 | 1364.5 | 26.5 | 26.5 |
| challenge-156 | 16.0 | 17.0 | 17.0 | 16.0 | 9.0 | 20.0 | 330.5 | 298.5 | 8231.0 | 23.5 | 101.5 |
| challenge-157 | 45.0 | 39.0 | 24.0 | 23.0 | 36.0 | 12946.0 | 496.0 | 2250.0 | 179.0 | 443.0 | 469.0 |
| challenge-158 | 6527.0 | 6751.0 | 3.0 | 4.0 | 3.0 | 16.5 | 958.5 | 298.5 | 36.5 | 4.0 | 8.0 |
| challenge-159 | 57.0 | 61.0 | 1.0 | 1.0 | 1.0 | 94.0 | 82.0 | 1078.5 | 115.0 | 90.0 | 160.0 |
| challenge-160 | 850.0 | 1317.0 | 12244.0 | 6866.0 | 1.0 | 6.0 | 12.0 | 39.5 | 3006.0 | 105.0 | 121.0 |
| challenge-161 | 240.0 | 276.0 | 277.0 | 587.0 | 2924.0 | 13192.0 | 172.0 | 571.0 | 1069.0 | 906.0 | 889.0 |
| challenge-162 | 17730.0 | 17762.0 | 1.0 | 1.0 | 1.0 | 4782.0 | 125.5 | 628.5 | 192.0 | 12.0 | 11.0 |
| challenge-163 | 1.0 | 1.0 | 1.0 | 1.0 | 2.0 | 4.0 | 32.5 | 1050.5 | 310.0 | 4.0 | 3.0 |
| challenge-164 | 1.0 | 1.0 | 3.0 | 1.0 | 1.0 | 8106.0 | 528.0 | 2250.0 | 560.0 | 727.0 | 672.0 |
| challenge-165 | 1995.0 | 1998.0 | 1990.0 | 1256.5 | 1.0 | 1124.0 | 16.0 | 13.0 | 858.0 | 40.0 | 48.0 |
| challenge-166 | 503.0 | 668.0 | 3720.0 | 1.0 | 2.0 | 1.5 | 35.5 | 648.5 | 2378.0 | 6.0 | 12.0 |
| challenge-167 | 16017.0 | 15822.0 | 14779.0 | 2.0 | 1328.0 | 4075.0 | 1579.0 | 6519.5 | 284.0 | 145.0 | 67.0 |
| challenge-168 | 1248.0 | 1505.0 | 9.0 | 2248.5 | 12.0 | 323.0 | 334.5 | 760.0 | 130.0 | 54.0 | 137.0 |
| challenge-169 | 136.0 | 145.0 | 33.0 | 1.0 | 11.0 | 32.0 | 25.0 | 64.0 | 28.0 | 28.0 | 14.0 |
| challenge-170 | 1.0 | 1.0 | 1.0 | 2.0 | 1.0 | 2476.0 | 92.0 | 1118.5 | 270.0 | 348.0 | 492.0 |
| challenge-171 | 1.0 | 1.0 | 3.0 | 3.0 | 3.0 | 364.0 | 85.0 | 832.5 | 45.0 | 167.0 | 218.0 |
| challenge-172 | 177.0 | 187.0 | 204.0 | 289.0 | 219.0 | 7428.5 | 2969.5 | 3839.5 | 277.0 | 277.0 | 251.0 |
| challenge-173 | 14759.0 | 14943.0 | 1371.0 | 1.0 | 9.0 | 2774.0 | 112.0 | 629.0 | 2193.0 | 1365.0 | 1779.0 |
| challenge-174 | 6758.0 | 6924.0 | 6538.0 | 7.0 | 8.0 | 14.0 | 956.0 | 41.5 | 5667.0 | 50.0 | 47.0 |
| challenge-175 | 23.0 | 23.0 | 1353.0 | 17.0 | 15.0 | 4094.0 | 21.0 | 40.5 | 108.0 | 13.0 | 12.0 |
| challenge-176 | 45.0 | 48.0 | 18.0 | 9.0 | 1.0 | 9393.0 | 204.0 | 2250.0 | 2505.0 | 323.0 | 290.0 |
| challenge-177 | 25.0 | 26.0 | 2.0 | 1.0 | 1.0 | 13.0 | 3.0 | 11.5 | 260.0 | 1.0 | 1.0 |
| challenge-178 | 2751.0 | 2871.0 | 15923.0 | 2072.0 | 8782.0 | 15327.0 | 6078.0 | 1487.5 | 1407.0 | 696.0 | 600.0 |
| challenge-179 | 2401.0 | 2408.0 | 1833.0 | 1.0 | 1.0 | 3.0 | 42.5 | 75.5 | 1.0 | 1.0 | 1.0 |
| challenge-180 | 154.0 | 159.0 | 279.0 | 4.0 | 5.0 | 37.5 | 4.5 | 15.0 | 93.0 | 20.0 | 8.0 |
| challenge-181 | 16296.0 | 16347.0 | 2513.0 | 125.0 | 6.0 | 7998.5 | 1076.5 | 3015.0 | 128.0 | 138.0 | 112.0 |
| challenge-182 | 16.0 | 16.0 | 18.0 | 21.0 | 55.0 | 9509.5 | 8580.5 | 877.5 | 21924.0 | 7408.0 | 4694.0 |
| challenge-183 | 1.5 | 1.5 | 1.5 | 1.5 | 1.5 | 1.0 | 20.5 | 898.0 | 1498.0 | 2.0 | 1.0 |
| challenge-184 | 1.0 | 1.0 | 1.0 | 1.0 | 2.0 | 696.5 | 5.5 | 70.0 | 526.0 | 8.0 | 86.0 |
| challenge-185 | 11818.0 | 11808.0 | 4216.0 | 1.0 | 1.0 | 1291.0 | 7387.0 | 88.0 | 284.0 | 243.0 | 225.0 |
| challenge-186 | 7.0 | 7.0 | 2.0 | 2.0 | 9.0 | 1823.0 | 170.5 | 924.5 | 453.0 | 149.0 | 183.0 |
| challenge-187 | 196.0 | 192.0 | 150.0 | 1.0 | 1.0 | 268.0 | 30.0 | 3.0 | 195.0 | 22.0 | 29.0 |
| challenge-188 | 1.0 | 1.0 | 2.0 | 1.0 | 1.0 | 12690.5 | 2890.5 | 104.0 | 11135.0 | 26.0 | 24.0 |
| challenge-189 | 41.0 | 44.0 | 1481.0 | 1.0 | 3.0 | 5160.0 | 847.0 | 41.5 | 3387.0 | 6.0 | 6.0 |
| challenge-190 | 3375.0 | 3385.0 | 2465.0 | 74.0 | 295.0 | 4082.0 | 452.0 | 832.5 | 663.0 | 378.0 | 360.0 |
| challenge-191 | 1692.0 | 1595.0 | 832.0 | 323.0 | 730.0 | 67.0 | 252.0 | 1124.0 | 217.0 | 62.0 | 85.0 |
| challenge-192 | 6083.0 | 6371.0 | 120.0 | 1.0 | 2.0 | 22.0 | 34.0 | 926.0 | 41.0 | 55.0 | 29.0 |
| challenge-193 | 2.0 | 2.0 | 1.0 | 4.0 | 5.0 | 725.5 | 658.5 | 967.0 | 404.0 | 5.0 | 5.0 |
| challenge-194 | 7425.0 | 7635.0 | 252.0 | 1.0 | 1.0 | 5.0 | 1.0 | 974.5 | 1106.0 | 520.0 | 730.0 |
| challenge-195 | 2.5 | 3.5 | 2.5 | 1.5 | 1.5 | 304.5 | 413.5 | 77.5 | 1449.5 | 1.5 | 1.5 |
| challenge-196 | 1.0 | 1.0 | 2881.0 | 1.0 | 2.0 | 1774.0 | 1084.0 | 205.5 | 70.0 | 50.0 | 34.0 |
| challenge-197 | 35.0 | 36.0 | 34.0 | 36.0 | 37.0 | 689.0 | 168.0 | 31.5 | 29.0 | 5.0 | 5.0 |
| challenge-198 | 1.0 | 1.0 | 1.0 | 1.0 | 4.0 | 4950.0 | 326.5 | 134.5 | 3613.0 | 927.0 | 120.0 |
| challenge-199 | 2779.0 | 2798.0 | 2758.0 | 1893.0 | 1.0 | 16.5 | 17.5 | 38.5 | 1529.0 | 650.0 | 516.0 |
| challenge-200 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.5 | 827.0 | 2391.0 | 23.0 | 2.0 | 1.0 |
| challenge-201 | 26311.0 | 26418.0 | 16069.0 | 1.0 | 4.0 | 1780.0 | 10868.0 | 238.0 | 73.0 | 65.0 | 85.0 |
| challenge-202 | 18585.0 | 18648.0 | 4424.0 | 2.0 | 1.0 | 11516.5 | 3.5 | 288.5 | 49.0 | 72.0 | 122.0 |
| challenge-203 | 1.0 | 1.0 | 1.0 | 1.0 | 3.0 | 10.0 | 1562.0 | 3015.0 | 722.0 | 591.0 | 582.0 |
| challenge-204 | 6458.0 | 6470.0 | 6409.0 | 1.0 | 1.0 | 15.0 | 171.0 | 35.5 | 5387.0 | 104.0 | 79.0 |
| challenge-205 | 2.0 | 2.0 | 3.0 | 2.0 | 1.0 | 26090.5 | 145.5 | 756.0 | 86.0 | 57.0 | 83.0 |
| challenge-206 | 2362.0 | 2370.0 | 2340.0 | 1465.5 | 1.0 | 3.0 | 1.0 | 31.5 | 1141.0 | 274.0 | 362.0 |
| challenge-207 | 10127.0 | 10276.0 | 10372.0 | 6783.0 | 8.0 | 7.0 | 5.0 | 35.0 | 383.0 | 43.0 | 55.0 |
| challenge-208 | 33.0 | 33.0 | 35.0 | 129.0 | 45.0 | 2896.0 | 658.0 | 1118.5 | 549.0 | 589.0 | 477.0 |
| challenge-209 | 19.0 | 21.0 | 263.0 | 2.0 | 3.0 | 17.5 | 44.5 | 241.5 | 137.0 | 2.0 | 2.0 |
| challenge-210 | 2831.0 | 3037.0 | 33.0 | 10.0 | 9.0 | 1223.0 | 63.0 | 51.0 | 1891.0 | 92.0 | 90.0 |
| challenge-211 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 124.5 | 32.5 | 671.5 | 237.0 | 96.0 | 130.0 |
| challenge-212 | 14475.0 | 14789.0 | 15601.0 | 12955.5 | 12.0 | 11801.5 | 351.5 | 282.0 | 7904.0 | 14.0 | 16.0 |
| challenge-213 | 1.0 | 1.0 | 1.0 | 1.0 | 5.0 | 3.0 | 9.0 | 535.0 | 37.0 | 1.0 | 1.0 |
| challenge-214 | 6900.0 | 7420.0 | 9860.0 | 7.0 | 20.0 | 10294.5 | 689.5 | 395.5 | 890.0 | 230.0 | 126.0 |
| challenge-215 | 771.0 | 1064.0 | 10.0 | 25.0 | 2.0 | 25.0 | 28.5 | 2630.5 | 3057.0 | 69.0 | 103.0 |
| challenge-216 | 7.0 | 7.0 | 8.0 | 10.0 | 6.0 | 8.5 | 174.0 | 49.0 | 8.0 | 3.0 | 14.0 |
| challenge-217 | 7.0 | 13.0 | 2.0 | 2.0 | 1.0 | 8.0 | 16.0 | 300.0 | 8.0 | 6.0 | 5.0 |
| challenge-218 | 2.0 | 2.0 | 2.0 | 2.0 | 1.0 | 409.5 | 5.5 | 12.5 | 1190.0 | 1.0 | 2.0 |
| challenge-219 | 7.0 | 6.0 | 1.0 | 1.0 | 1.0 | 198.0 | 629.0 | 62.0 | 4062.0 | 115.0 | 126.0 |
| challenge-220 | 15741.0 | 15789.0 | 1.0 | 1.0 | 2.0 | 2.0 | 30.0 | 300.0 | 1.0 | 4.0 | 1.0 |
| challenge-221 | 18921.0 | 18933.0 | 18485.0 | 5.0 | 9.0 | 3.0 | 174.5 | 331.5 | 757.0 | 9.0 | 6.0 |
| challenge-222 | 12245.0 | 12248.0 | 819.0 | 1.0 | 1.0 | 659.0 | 20.5 | 249.5 | 1241.0 | 33.0 | 47.0 |
| challenge-223 | 45.0 | 41.0 | 1.0 | 1.0 | 1.0 | 130.0 | 94.5 | 3167.0 | 11.0 | 9.0 | 10.0 |
| challenge-224 | 2.0 | 2.0 | 2.0 | 2.0 | 1.0 | 1234.0 | 87.0 | 205.5 | 5774.0 | 164.0 | 103.0 |
| challenge-225 | 1.5 | 1.5 | 1.5 | 392.5 | 14.5 | 368.5 | 41.0 | 256.5 | 552.0 | 342.0 | 328.0 |
| challenge-226 | 508.0 | 633.0 | 929.0 | 1150.5 | 27.0 | 523.0 | 5.0 | - | 153.0 | 72.0 | 286.0 |
| challenge-227 | 2859.0 | 2868.0 | 2909.0 | 44.0 | 263.0 | 71.0 | 8.0 | 38.5 | 1988.0 | 305.0 | 169.0 |
| challenge-228 | 4364.0 | 4626.0 | 4.0 | 16.0 | 291.0 | 2053.5 | 3593.5 | 20.5 | 89.0 | 8.0 | 6.0 |
| challenge-229 | 18594.0 | 18623.0 | 18411.0 | 1.0 | 11.0 | 5.0 | 12.0 | 479.0 | 252.0 | 299.0 | 253.0 |
| challenge-230 | 18994.0 | 18493.0 | 16315.0 | 3.0 | 7.0 | 5828.5 | 8630.5 | 318.5 | 902.0 | 184.0 | 132.0 |
| challenge-231 | 1381.0 | 1381.0 | 1374.0 | 1.0 | 19.0 | 810.0 | 197.0 | 12.0 | 254.0 | 15.0 | 15.0 |
| challenge-232 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 25.0 | 2.0 | 6.0 | 4.0 | 6.0 | 7.0 |
| challenge-233 | 4099.0 | 4221.0 | 253.0 | 1.0 | 20.0 | 240.0 | 193.0 | 44.0 | 1721.0 | 28.0 | 32.0 |
| challenge-234 | 11501.0 | 11535.0 | 11510.0 | 268.0 | 3450.0 | 11342.5 | 438.5 | 789.0 | 1500.0 | 19.0 | 18.0 |
| challenge-235 | 136.0 | 141.0 | 3.0 | 10.0 | 3199.0 | 103.5 | 1798.5 | 1978.5 | 133.0 | 222.0 | 270.0 |
| challenge-236 | 1.0 | 1.0 | 12.0 | 1.0 | 1.0 | 1212.0 | 314.0 | 2003.5 | 3439.0 | 25.0 | 30.0 |
| challenge-237 | 14.0 | 11.0 | 3.0 | 1.0 | 90.5 | 146.0 | 2.0 | 35.0 | 1.0 | 8.0 | 14.0 |
| challenge-238 | 13.0 | 16.0 | 245.0 | 419.5 | 7.0 | 5.5 | 5.5 | 7.5 | 62.0 | 8.0 | 14.0 |
| challenge-239 | 5.0 | 6.0 | 6.0 | 2449.5 | 1.0 | 6.0 | 5.0 | 27.0 | 36.0 | 3.0 | 5.0 |
| challenge-240 | 5011.0 | 5478.0 | 759.0 | 1.0 | 1.0 | 96.0 | 57.0 | 71.5 | 37.0 | 32.0 | 37.0 |
| challenge-241 | 1.0 | 1.0 | 4.0 | 1.0 | 1.0 | 6.0 | 1.0 | 8.5 | 5.0 | 10.0 | 15.0 |
| challenge-242 | 1.0 | 2.0 | 1.0 | 1.0 | 1.0 | 2582.0 | 3.0 | 65.5 | 24.0 | 50.0 | 55.0 |
| challenge-243 | 2.0 | 4.0 | 5.0 | 2.0 | 4.0 | 6.0 | 6.0 | 10.0 | 128.0 | 18.0 | 12.0 |
Participant information and abstracts
ParticipantID: yuanyuelogsum
Category: category4
Authors: Yuanyue Li, Michael Kuhn and Peer Bork
Affiliations: European Molecular Biology Laboratory, 69117 Heidelberg, Germany
Automatic pipeline: yes
Spectral libraries: no
Abstract
The molecules from category4 (non-redundant) are used as the candidates. We use a
newly developed machine learning approach to predict the probability spectrum for
each candidate molecule. The score is then calculated based on the similarity
between the probability spectrum and the real spectrum, and the logarithm of the
number of molecules in the model is used as a weight. In this approach, all
possible adducts are considered: [M+H]+, [M+Na]+ and [M+NH4]+ for positive ions,
and [M-H]-, [M+Cl]- and [M+COO]- for negative ions.
ParticipantID: yuanyuesimple
Category: category4
Authors: Yuanyue Li, Michael Kuhn and Peer Bork
Affiliations: European Molecular Biology Laboratory, 69117 Heidelberg, Germany
Automatic pipeline: yes
Spectral libraries: no
Abstract
The molecules from category4 (non-redundant) are used as the candidates. We use a
newly developed machine learning approach to predict the probability spectrum for
each candidate molecule. The score is then calculated based on the similarity
between the probability spectrum and the real spectrum. In this approach, only
[M+H]+ is considered for positive ions and only [M-H]- for negative ions.
ParticipantID: yuanyuesqrt
Category: category4
Authors: Yuanyue Li, Michael Kuhn and Peer Bork
Affiliations: European Molecular Biology Laboratory, 69117 Heidelberg, Germany
Automatic pipeline: yes
Spectral libraries: no
Abstract
The molecules from category4 (non-redundant) are used as the candidates. We use a
newly developed machine learning approach to predict the probability spectrum for
each candidate molecule. The score is then calculated based on the similarity
between the probability spectrum and the real spectrum, and the square root of the
intensity is used as a weight. In this approach, only [M+H]+ is considered for
positive ions and only [M-H]- for negative ions.
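The three submissions above differ only in how the predicted probability spectrum is compared to the measured spectrum. The sketch below shows one plausible reading of that scoring step, a cosine similarity over binned peaks with an optional square-root intensity weighting; the actual binning, similarity measure and weighting used by the authors are not specified, so everything here should be read as an assumption.

```python
import math
from collections import defaultdict

def bin_spectrum(peaks, bin_width=0.01, transform=None):
    """peaks: list of (m/z, intensity). Returns {bin: intensity}, with an
    optional intensity transform (e.g. math.sqrt, as one reading of the
    square-root weighting in the yuanyuesqrt submission)."""
    binned = defaultdict(float)
    for mz, intensity in peaks:
        if transform is not None:
            intensity = transform(intensity)
        binned[round(mz / bin_width)] += intensity
    return binned

def cosine_similarity(a, b):
    """Cosine similarity between two sparse binned spectra (assumed measure)."""
    shared = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Example: score one candidate's predicted probability spectrum against the
# measured spectrum, with square-root intensity weighting.
measured = bin_spectrum([(121.05, 1000.0), (149.02, 400.0)], transform=math.sqrt)
predicted = bin_spectrum([(121.05, 0.8), (149.02, 0.3)], transform=math.sqrt)
score = cosine_similarity(measured, predicted)
```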
Participant: Bach
ParticipantID: IOKR_TanimotoGaussian
Category: 4
Authors: Eric Bach(1), Céline Brouard(1,2), Kai Dührkop(3),
Sebastian Böcker(3) and Juho Rousu(1,2)
Affiliations: (1) Department of Computer Science, Aalto University,
Espoo, Finland
(2) Helsinki Institute for Information Technology, Espoo,
Finland
(3) Chair for Bioinformatics, Friedrich-Schiller University,
Jena, Germany
Automatic pipeline: yes
Spectral libraries: no
Abstract
We used a recent machine learning approach, called Input Output Kernel Regression
(IOKR), for predicting the candidate scores. IOKR has been successfully applied
to metabolite identification [1].
In this method, kernel functions are used to measure the similarity between MS/MS
spectra (input kernel) and between molecular structures (output kernel), respectively.
On the input side, we use several kernels defined on MS/MS spectra and fragmentation
trees, and combine them uniformly, i.e. we sum up the kernels with equal weights.
On the output side, we use a Gaussian kernel on Tanimoto features calculated
from binary fingerprints representing the molecular structures in the candidate
sets.
We train two separate IOKR models, one for each ionization mode, i.e. positive
and negative. For the positive model we use ~14000 identified MS/MS spectra and
for the negative model ~5800. These spectra are mainly extracted from the GNPS
and MassBank databases. We represent the candidate molecular structures using
~7600 binary molecular fingerprints.
For each challenge spectrum we predict the molecular formula using Sirius [2],
taking into account the possible molecular formulas based on the candidate sets.
The score we submitted for each candidate is the one corresponding to the most
likely molecular formula.
[1] Brouard, C.; Shen, H.; Dührkop, K.; d'Alché-Buc, F.; Böcker, S. & Rousu, J.
Fast metabolite identification with Input Output Kernel Regression
Bioinformatics, 2016
[2] https://bio.informatik.uni-jena.de/software/sirius/
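The abstract describes two kernel ingredients: a uniform (equal-weight) combination of the input kernels and a Gaussian output kernel built on Tanimoto similarities between candidate fingerprints. The sketch below shows one standard way to write these down; the gamma value, the fingerprint encoding and the function names are illustrative assumptions, not the actual CSI:IOKR implementation.

```python
import numpy as np

def uniform_kernel_combination(kernels):
    """Sum several input kernel matrices with equal weights, as described
    in the abstract for the MS/MS and fragmentation-tree kernels."""
    return sum(kernels)

def tanimoto_kernel(fps):
    """Tanimoto similarity matrix for binary fingerprints (one row per molecule)."""
    fps = fps.astype(float)
    intersection = fps @ fps.T                       # |A AND B|
    counts = fps.sum(axis=1)
    union = counts[:, None] + counts[None, :] - intersection
    # Convention: similarity 1.0 if both fingerprints are empty.
    return np.divide(intersection, union, out=np.ones_like(intersection), where=union > 0)

def gaussian_on_tanimoto(fps, gamma=0.5):
    """Gaussian kernel in the feature space induced by the Tanimoto kernel:
    k(x, y) = exp(-gamma * d(x, y)^2), with
    d(x, y)^2 = k_tan(x, x) + k_tan(y, y) - 2 * k_tan(x, y).
    gamma is an illustrative value, not a parameter taken from the submission."""
    K = tanimoto_kernel(fps)
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2.0 * K
    return np.exp(-gamma * d2)

# Tiny usage example with three hypothetical 6-bit fingerprints.
fps = np.array([[1, 0, 1, 1, 0, 0],
                [1, 1, 1, 0, 0, 0],
                [0, 0, 0, 1, 1, 1]])
K_out = gaussian_on_tanimoto(fps)
```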
ParticipantID: IOKRtransfer
Category: category4
Authors: Céline Brouard(1,2), Eric Bach(1,2), Sebastian Böcker(3)
and Juho Rousu(1,2)
Affiliations: (1) Department of Computer Science, Aalto University,
Espoo, Finland
(2) Helsinki Institute for Information Technology,
Espoo, Finland
(3) Chair for Bioinformatics, Friedrich-Schiller University,
Jena, Germany
Automatic pipeline: yes
Spectral libraries: no
Abstract
During the learning phase, we trained separate models for the MS/MS
spectra in positive mode and the MS/MS spectra in negative mode. 14181
positive mode spectra and 5776 negative mode spectra from GNPS and
MassBank were used for training the models. These models were learned
using a new version of the machine learning method CSI:IOKR, called
CSI:IOKR_transfer. The distinguishing feature of this novel approach is that the
knowledge learned on the positive mode set is transferred and used to
learn the negative mode model (and vice versa).
In the test phase, SIRIUS was first used to compute fragmentation
trees for all the molecular formulas appearing in the candidate
set. Trees with a score smaller than 80% of the best tree score were discarded. We then
used CSI:IOKR_transfer to predict the candidate scores for each of the
remaining trees and added a constant value to the scores to make them
positive. The scores in the IOKRtransfer submission were obtained
using the tree associated with the best score. In the second
submission IOKRtransferAvgScore, we averaged the scores obtained using
the trees associated with the five best scores.
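As a rough illustration of the test-phase post-processing described above (keep fragmentation trees scoring at least 80% of the best tree, score candidates per remaining tree, then use the best tree or average over the five best), here is a hedged Python sketch. All names are hypothetical, and the reading of "best score" as the tree score is an assumption.

```python
def filter_trees(trees, threshold=0.8):
    """Keep fragmentation trees scoring at least `threshold` of the best tree score.
    trees: list of (tree_id, tree_score)."""
    best = max(score for _, score in trees)
    return [(tid, score) for tid, score in trees if score >= threshold * best]

def aggregate_candidate_scores(kept_trees, per_tree_candidate_scores, top_k=1):
    """kept_trees: [(tree_id, tree_score)] after filtering.
    per_tree_candidate_scores: {tree_id: {candidate_id: score}} from the
    (hypothetical) per-tree scoring step.
    top_k=1 is read as the IOKRtransfer submission (best tree only),
    top_k=5 as IOKRtransferAvgScore (average over the five best trees)."""
    best_trees = sorted(kept_trees, key=lambda t: t[1], reverse=True)[:top_k]
    combined = {}
    for tree_id, _ in best_trees:
        for cand, score in per_tree_candidate_scores[tree_id].items():
            combined.setdefault(cand, []).append(score)
    averaged = {c: sum(v) / len(v) for c, v in combined.items()}
    # Shift all scores by a constant so they are positive, as in the abstract.
    shift = min(averaged.values())
    if shift < 0:
        averaged = {c: s - shift for c, s in averaged.items()}
    return averaged
```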
ParticipantID: kai_iso
Category: category4
Authors: Dührkop, Kai (1) and Ludwig, Marcus (1) and Böcker, Sebastian (1)
and Bach, Eric (2) and Brouard, Céline (2) and Rousu, Juho (2)
Affiliations: (1) Chair of Bioinformatics, Friedrich-Schiller University, Jena
(2) Department of Computer Science, Aalto University
Automatic pipeline: yes
Spectral libraries: no
Abstract
We processed the peaklists in MGF format using an in-house version of CSI:FingerID.
Fragmentation trees were computed with Sirius 3.1.5
using the Q-TOF instrument settings.
As the spectra were measured in MSe mode we expect to see isotope
peaks in the MS/MS spectra. We used an experimental feature in SIRIUS that allows
for detecting isotope patterns in MS/MS and incorporating them into the
fragmentation tree scoring.
We used the standard workflow of the SIRIUS+CSI:FingerID (version 3.5) software:
We computed trees for all candidate formulas in the given structure candidate list.
Only the top scoring trees were selected for further processing: Trees
with a score smaller than 75% of the score of the optimal tree were
discarded. Each of these trees was processed with CSI:FingerID as
described in [1]. We predicted a molecular fingerprint for each tree
(with Platt probability estimates) and compared it against the
fingerprints of all structure candidates with the same molecular
formula. For the comparison of fingerprints, we used the new maximum
likelihood scoring function that has been implemented since SIRIUS 3.5.
The resulting hits were merged together in one list and were sorted by
score. A constant value was added to all scores to make them positive
(as stated in the CASMI rules). Ties of compounds with the same score were
ordered randomly. If a compound could not be processed (e.g. because
of multiple charges), its score was set to zero.
[1] Kai Dührkop, Huibin Shen, Marvin Meusel, Juho Rousu and Sebastian
Böcker Searching molecular structure databases with tandem mass
spectra using CSI:FingerID. Proc Natl Acad Sci U S A,
112(41):12580-12585, 2015.
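The central scoring step above compares a predicted probabilistic fingerprint against the deterministic fingerprints of the structure candidates. The sketch below shows a generic maximum-likelihood formulation under an independent-Bernoulli assumption; it illustrates the idea only and is not guaranteed to match the scoring function shipped with SIRIUS 3.5.

```python
import math

def log_likelihood_score(predicted_probs, candidate_fp, eps=1e-6):
    """Generic maximum-likelihood fingerprint score (illustrative only).

    predicted_probs: list of Platt-style probabilities, one per fingerprint bit,
                     as output by the fingerprint predictor for the query spectrum.
    candidate_fp:    list of 0/1 bits of a candidate structure's fingerprint.
    Returns the log-likelihood of the candidate fingerprint under an
    independent-Bernoulli model; higher means a better match."""
    score = 0.0
    for p, bit in zip(predicted_probs, candidate_fp):
        p = min(max(p, eps), 1.0 - eps)          # guard against log(0)
        score += math.log(p) if bit else math.log(1.0 - p)
    return score

# Rank hypothetical candidates for one query fingerprint prediction.
predicted = [0.9, 0.1, 0.8, 0.2]
candidates = {"cand_A": [1, 0, 1, 0], "cand_B": [1, 1, 0, 0]}
ranked = sorted(candidates,
                key=lambda c: log_likelihood_score(predicted, candidates[c]),
                reverse=True)
```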
ParticipantID: kai112
Category: category4
Authors: Dührkop, Kai (1) and Ludwig, Marcus (1) and Böcker, Sebastian (1)
and Bach, Eric (2) and Brouard, Céline (2) and Rousu, Juho (2)
Affiliations: (1) Chair of Bioinformatics, Friedrich-Schiller University, Jena
(2) Department of Computer Science, Aalto University
Automatic pipeline: yes
Spectral libraries: no
Abstract
We processed the peaklists in MGF format using an in-house version of CSI:FingerID.
Fragmentation trees were computed with Sirius 3.1.5
using the Q-TOF instrument settings.
As the spectra were measured in MSe mode we expect to see isotope
peaks in the MS/MS spectra. We used an experimental feature in SIRIUS that allows
for detecting isotope patterns in MS/MS and incorporating them into the
fragmentation tree scoring.
The preliminary results showed that we missed a lot of compounds
because we were not always able to identify the correct molecular
formula among the top ranks. This might be because no isotope patterns for
the precursor were given. We therefore prepared a second submission, kai112,
which no longer uses a hard threshold, but instead considers all
molecular formulas for the CSI:FingerID search and adds the SIRIUS
score on top of the CSI:FingerID score. To avoid empty trees
(which a hard threshold would have discarded) getting high scores
by chance, we add a penalty of 1000 if a tree does not explain a single
fragment peak. Furthermore, for the kai112 submission we trained
CSI:FingerID on a larger dataset that also contains spectra from NIST.
Besides removing the hard threshold, the kai112 submission follows the
standard SIRIUS+CSI:FingerID protocol: We computed trees for all
candidate formulas in the given structure candidate list. Each of
these trees was processed with CSI:FingerID as described in [1]. We
predicted a molecular fingerprint for each tree (with Platt
probability estimates) and compared it against the fingerprints of
all structure candidates with the same molecular formula. For the
comparison of fingerprints, we used the new maximum likelihood
scoring function that has been implemented since SIRIUS 3.5. Trees with
only one node get a penalty of 1000. For all other trees, the SIRIUS score
was added to the CSI:FingerID score. The resulting hits were merged
into one list and sorted by score. A constant value was
added to all scores to make them positive (as stated in the CASMI
rules). Ties of compounds with the same score were ordered randomly. If a
compound could not be processed (e.g. because of multiple charges), its
score was set to zero.
[1] Kai Dührkop, Huibin Shen, Marvin Meusel, Juho Rousu and Sebastian
Böcker Searching molecular structure databases with tandem mass
spectra using CSI:FingerID. Proc Natl Acad Sci U S A,
112(41):12580-12585, 2015.
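The kai112 scoring rule (add the SIRIUS score to the CSI:FingerID score, penalize trees that explain no fragment peak by 1000, then shift all scores to be positive) reduces to a few lines. This is a paraphrase of the rule as described, with hypothetical names, not code from the actual pipeline.

```python
def kai112_combined_score(csi_fingerid_score, sirius_tree_score, n_explained_fragments):
    """Combined score per candidate, following the rule described in the abstract:
    add the SIRIUS score to the CSI:FingerID score, but penalize trees that
    explain no fragment peak (single-node trees) by 1000."""
    if n_explained_fragments == 0:
        return csi_fingerid_score - 1000.0
    return csi_fingerid_score + sirius_tree_score

def finalize_submission(scored_candidates):
    """scored_candidates: {candidate_id: combined_score}.
    Shift all scores by a constant so they are positive (per the CASMI rules);
    candidates that could not be processed would be assigned a score of zero
    before this step."""
    shift = min(scored_candidates.values())
    offset = -shift if shift < 0 else 0.0
    return {c: s + offset for c, s in scored_candidates.items()}
```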