
Commit 677dc93

KennethEnevoldsen committed Jun 16, 2024
2 parents 7570f9c + 443fe03 commit 677dc93
Showing 18 changed files with 2,849 additions and 25 deletions.
2 changes: 2 additions & 0 deletions docs/mmteb/points/915.jsonl
@@ -0,0 +1,2 @@
{"GitHub": "gentaiscool", "New dataset": 18}
{"GitHub": "KennethEnevoldsen", "Review PR": 2}
2 changes: 2 additions & 0 deletions docs/mmteb/points/927.jsonl
@@ -0,0 +1,2 @@
{"GitHub": "gentaiscool", "New dataset": 18}
{"GitHub": "KennethEnevoldsen", "Review PR": 2}
2 changes: 2 additions & 0 deletions docs/mmteb/points/928.jsonl
@@ -0,0 +1,2 @@
{"GitHub": "gentaiscool", "New dataset": 2}
{"GitHub": "KennethEnevoldsen", "Review PR": 2}
16 changes: 8 additions & 8 deletions docs/mmteb/points_table.md
@@ -4,7 +4,7 @@ _Note_: this table is **autogenerated** and should not be edited. It is intended

| GitHub | New dataset | Review PR | Coordination | Bug fixes | Dataset annotations | Running Models | Paper writing | New task | Total |
|:------------------|--------------:|------------:|---------------:|------------:|----------------------:|-----------------:|----------------:|-----------:|--------:|
| KennethEnevoldsen | 68 | 256 | 11 | 81 | 35 | 0 | 0 | 0 | 451 |
| KennethEnevoldsen | 68 | 262 | 11 | 81 | 35 | 0 | 0 | 0 | 457 |
| isaac-chung | 116 | 182 | 4 | 40 | 1 | 0 | 4 | 0 | 347 |
| awinml | 292 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 294 |
| imenelydiaker | 120 | 140 | 0 | 20 | 0 | 0 | 0 | 0 | 280 |
@@ -14,12 +14,12 @@ _Note_: this table is **autogenerated** and should not be edited. It is intended
| wissam-sib | 134 | 6 | 0 | 4 | 0 | 0 | 0 | 0 | 144 |
| jupyterjazz | 108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 108 |
| SaitejaUtpala | 102 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 102 |
| gentaiscool | 102 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 102 |
| dokato | 82 | 4 | 0 | 8 | 0 | 0 | 0 | 0 | 94 |
| MathieuCiancone | 88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 88 |
| schmarion | 88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 88 |
| MathieuCiancone | 88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 88 |
| GabrielSequeira | 88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 88 |
| digantamisra98 | 71 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 71 |
| gentaiscool | 64 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 64 |
| shreeya-dhakal | 54 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 62 |
| Rysias | 58 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 58 |
| asparius | 34 | 14 | 0 | 0 | 0 | 0 | 0 | 0 | 48 |
@@ -30,23 +30,23 @@ _Note_: this table is **autogenerated** and should not be edited. It is intended
| rafalposwiata | 36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 36 |
| bp-high | 36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 36 |
| akshita-sukhlecha | 34 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 34 |
| ShawonAshraf | 28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28 |
| jphme | 28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28 |
| rasdani | 28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28 |
| ShawonAshraf | 28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28 |
| loicmagne | 0 | 0 | 0 | 28 | 0 | 0 | 0 | 0 | 28 |
| bjoernpl | 28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28 |
| violenil | 26 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 26 |
| kranthigv | 20 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 26 |
| dwzhu-pku | 24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 24 |
| taeminlee | 22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 22 |
| jankounchained | 14 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 22 |
| taeminlee | 22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 22 |
| crystina-z | 21 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 21 |
| mrshu | 16 | 4 | 0 | 0 | 1 | 0 | 0 | 0 | 21 |
| hgissbkh | 0 | 0 | 0 | 13 | 0 | 0 | 3 | 5 | 21 |
| mrshu | 16 | 4 | 0 | 0 | 1 | 0 | 0 | 0 | 21 |
| mmhamdy | 20 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 20 |
| rbroc | 20 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 20 |
| Andrian0s | 14 | 4 | 0 | 2 | 0 | 0 | 0 | 0 | 20 |
| AlexeyVatolin | 0 | 0 | 0 | 20 | 0 | 0 | 0 | 0 | 20 |
| mmhamdy | 20 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 20 |
| Andrian0s | 14 | 4 | 0 | 2 | 0 | 0 | 0 | 0 | 20 |
| ManuelFay | 2 | 0 | 0 | 13 | 0 | 0 | 0 | 5 | 20 |
| manandey | 18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 18 |
| MartinBernstorff | 2 | 8 | 0 | 7 | 0 | 0 | 0 | 0 | 17 |
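The `docs/mmteb/points/*.jsonl` files added above record contribution points as one JSON object per line (a GitHub handle plus point categories), and the table in `docs/mmteb/points_table.md` is autogenerated from them. Below is a minimal aggregation sketch showing how such files could be rolled up into per-contributor totals; it is not the repository's actual generator script, and the directory path and column handling are assumptions.

```python
import json
from collections import defaultdict
from pathlib import Path

# Assumed location of the per-entry point files (one JSON object per line,
# e.g. {"GitHub": "gentaiscool", "New dataset": 18}).
POINTS_DIR = Path("docs/mmteb/points")


def aggregate_points(points_dir: Path) -> dict[str, dict[str, int]]:
    """Sum points per GitHub handle and per category across all *.jsonl files."""
    totals: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for jsonl_file in sorted(points_dir.glob("*.jsonl")):
        for line in jsonl_file.read_text().splitlines():
            if not line.strip():
                continue
            record = json.loads(line)
            handle = record.pop("GitHub")
            for category, points in record.items():
                totals[handle][category] += int(points)
    return totals


if __name__ == "__main__":
    totals = aggregate_points(POINTS_DIR)
    # Sort contributors by overall total, as in points_table.md.
    for handle, cats in sorted(totals.items(), key=lambda kv: -sum(kv[1].values())):
        print(handle, dict(cats), "total:", sum(cats.values()))
```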
35 changes: 19 additions & 16 deletions docs/tasks.md
@@ -332,10 +332,13 @@ The following tables give you an overview of the tasks in MTEB.
| [News21InstructionRetrieval](https://arxiv.org/abs/2403.15246) (Orion Weller, 2024) | ['eng'] | InstructionRetrieval | s2p | [News] | {'eng': 61906} | {'eng': 2983.724665391969} |
| [NewsClassification](https://arxiv.org/abs/1509.01626) (Zhang et al., 2015) | ['eng'] | Classification | s2s | [News] | {'test': 7600} | {'test': 235.29} |
| [NoRecClassification](https://aclanthology.org/L18-1661/) | ['nob'] | Classification | s2s | | {'test': 2050} | {'test': 82.0} |
| [NollySentiBitextMining](https://github.com/IyanuSh/NollySenti) (Shode et al., 2023) | ['eng', 'hau', 'ibo', 'pcm', 'yor'] | BitextMining | s2s | [Social, Reviews] | {'train': 1640} | {'train': 135.91} |
| [NorQuadRetrieval](https://aclanthology.org/2023.nodalida-1.17/) | ['nob'] | Retrieval | p2p | [Encyclopaedic, Non-fiction] | {'test': 2602} | {'test': 502.19} |
| [NordicLangClassification](https://aclanthology.org/2021.vardial-1.8/) | ['dan', 'fao', 'isl', 'nno', 'nob', 'swe'] | Classification | s2s | | {'test': 3000} | {'test': 78.2} |
| [NorwegianCourtsBitextMining](https://opus.nlpl.eu/index.php) (Tiedemann et al., 2020) | ['nno', 'nob'] | BitextMining | s2s | [Legal] | {'test': 2050} | {'test': 1884.0} |
| [NorwegianParliamentClassification](https://huggingface.co/datasets/NbAiLab/norwegian_parliament) | ['nob'] | Classification | s2s | | {'test': 1200, 'validation': 1200} | {'test': 1884.0, 'validation': 1911.0} |
| [NusaParagraphEmotionClassification](https://github.com/IndoNLP/nusa-writes) | ['bbc', 'bew', 'bug', 'jav', 'mad', 'mak', 'min', 'mui', 'rej', 'sun'] | Classification | s2s | [Non-fiction, Fiction] | {'train': 15516, 'validation': 2948, 'test': 6250} | {'train': 740.24, 'validation': 740.66, 'test': 740.71} |
| [NusaParagraphTopicClassification](https://github.com/IndoNLP/nusa-writes) | ['bbc', 'bew', 'bug', 'jav', 'mad', 'mak', 'min', 'mui', 'rej', 'sun'] | Classification | s2s | [Non-fiction, Fiction] | {'train': 15516, 'validation': 2948, 'test': 6250} | {'train': 740.24, 'validation': 740.66, 'test': 740.71} |
| [NusaTranslationBitextMining](https://huggingface.co/datasets/indonlp/nusatranslation_mt) (Cahyawijaya et al., 2023) | ['abs', 'bbc', 'bew', 'bhp', 'ind', 'jav', 'mad', 'mak', 'min', 'mui', 'rej', 'sun'] | BitextMining | s2s | [Social] | {'train': 50200} | {'train': 147.01} |
| [NusaX-senti](https://arxiv.org/abs/2205.15960) (Winata et al., 2022) | ['ace', 'ban', 'bbc', 'bjn', 'bug', 'eng', 'ind', 'jav', 'mad', 'min', 'nij', 'sun'] | Classification | s2s | [Reviews, Web, Social, Constructed] | {'test': 4800} | {'test': 52.4} |
| [NusaXBitextMining](https://huggingface.co/datasets/indonlp/NusaX-senti/) (Winata et al., 2023) | ['ace', 'ban', 'bbc', 'bjn', 'bug', 'eng', 'ind', 'jav', 'mad', 'min', 'nij', 'sun'] | BitextMining | s2s | [Reviews] | {'train': 5500} | {'train': 157.15} |
@@ -642,7 +645,7 @@ The following tables give you an overview of the tasks in MTEB.
| bao | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bba | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bbb | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bbc | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bbc | 2 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bbr | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bch | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bco | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -655,7 +658,7 @@ The following tables give you an overview of the tasks in MTEB.
| beo | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ber | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| beu | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bew | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bew | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bgc | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bgs | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bgt | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -703,7 +706,7 @@ The following tables give you an overview of the tasks in MTEB.
| bsn | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bsp | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bss | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bug | 2 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bug | 2 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| buk | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| bul | 3 | 5 | 0 | 0 | 1 | 1 | 1 | 2 | 0 | 0 |
| bus | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -821,7 +824,7 @@ The following tables give you an overview of the tasks in MTEB.
| ell | 3 | 7 | 0 | 0 | 1 | 2 | 0 | 3 | 0 | 0 |
| emi | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| emp | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| eng | 13 | 144 | 15 | 3 | 1 | 8 | 7 | 54 | 13 | 1 |
| eng | 14 | 144 | 15 | 3 | 1 | 8 | 7 | 54 | 13 | 1 |
| enq | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| epo | 3 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| eri | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -893,7 +896,7 @@ The following tables give you an overview of the tasks in MTEB.
| gym | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| gyr | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| hat | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| hau | 3 | 6 | 2 | 0 | 0 | 0 | 0 | 1 | 1 | 0 |
| hau | 4 | 6 | 2 | 0 | 0 | 0 | 0 | 1 | 1 | 0 |
| haw | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| hbo | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| hch | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -921,7 +924,7 @@ The following tables give you an overview of the tasks in MTEB.
| hvn | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| hye | 3 | 4 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 |
| ian | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ibo | 2 | 6 | 2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| ibo | 3 | 6 | 2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| ido | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ign | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ikk | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -943,7 +946,7 @@ The following tables give you an overview of the tasks in MTEB.
| jac | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| jae | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| jao | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| jav | 4 | 6 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| jav | 4 | 8 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| jic | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| jid | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| jiv | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1068,11 +1071,11 @@ The following tables give you an overview of the tasks in MTEB.
| lvs | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| lww | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| maa | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mad | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mad | 2 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mag | 1 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mai | 4 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| maj | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mak | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mak | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mal | 7 | 8 | 1 | 0 | 0 | 0 | 0 | 2 | 1 | 0 |
| mam | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| maq | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1115,7 +1118,7 @@ The following tables give you an overview of the tasks in MTEB.
| mig | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mih | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mil | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| min | 3 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| min | 3 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mio | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mir | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mit | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1157,7 +1160,7 @@ The following tables give you an overview of the tasks in MTEB.
| msy | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mti | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mto | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mui | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mui | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mup | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| mux | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| muy | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1265,7 +1268,7 @@ The following tables give you an overview of the tasks in MTEB.
| pao | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| pap | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| pbt | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| pcm | 0 | 4 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| pcm | 1 | 4 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| pes | 3 | 2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| pib | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| pio | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1314,7 +1317,7 @@ The following tables give you an overview of the tasks in MTEB.
| rai | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| raj | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| reg | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| rej | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| rej | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| rgu | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| rkb | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| rmc | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1387,7 +1390,7 @@ The following tables give you an overview of the tasks in MTEB.
| stp | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| sua | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| sue | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| sun | 3 | 3 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| sun | 3 | 5 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| sus | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| suz | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| svk | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1549,7 +1552,7 @@ The following tables give you an overview of the tasks in MTEB.
| yle | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| yml | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| yon | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| yor | 3 | 6 | 2 | 0 | 0 | 0 | 1 | 1 | 0 | 0 |
| yor | 4 | 6 | 2 | 0 | 0 | 0 | 1 | 1 | 0 | 0 |
| yrb | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| yre | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| yss | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
@@ -1593,7 +1596,7 @@ The following tables give you an overview of the tasks in MTEB.
| zty | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| zul | 2 | 4 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| zyp | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Total | 1385 | 970 | 107 | 3 | 28 | 67 | 46 | 335 | 85 | 2 |
| Total | 1390 | 990 | 107 | 3 | 28 | 67 | 46 | 335 | 85 | 2 |
<!-- TASK LANG TABLE END -->

</details>
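The tasks newly listed in `docs/tasks.md` above (e.g. NollySentiBitextMining and the NusaParagraph/NusaTranslation tasks) become selectable by name once this version of the library is installed. A hedged usage sketch follows, assuming the long-standing `MTEB(tasks=[...]).run(...)` interface and a sentence-transformers encoder; the model choice here is arbitrary and purely for illustration.

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Any sentence-transformers encoder works; this multilingual model is an
# arbitrary example, not a recommendation.
model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# Select one of the newly added tasks by the name shown in the table above.
evaluation = MTEB(tasks=["NollySentiBitextMining"])
results = evaluation.run(model, output_folder="results/nollysenti")
print(results)
```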
1 change: 1 addition & 0 deletions mteb/abstasks/TaskMetadata.py
@@ -35,6 +35,7 @@
"Cross-Lingual Semantic Discrimination",
"Textual Entailment",
"Counterfactual Detection",
"Emotion classification",
]

TASK_DOMAIN = Literal[
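The one-line change above adds "Emotion classification" to a `Literal` of allowed values in `mteb/abstasks/TaskMetadata.py`, so metadata declaring this subtype is accepted by type checkers and by any runtime validation built on the same list. Below is a simplified sketch of that pattern; the class and field names are illustrative assumptions, not MTEB's actual definitions.

```python
from dataclasses import dataclass
from typing import Literal, get_args

# Illustrative stand-in for the Literal being extended; the real list in the
# repository is much longer.
TASK_SUBTYPE = Literal[
    "Textual Entailment",
    "Counterfactual Detection",
    "Emotion classification",  # newly allowed value
]


@dataclass
class TaskMetadataSketch:
    """Hypothetical, simplified metadata record (not MTEB's actual class)."""

    name: str
    task_subtypes: list[TASK_SUBTYPE]

    def __post_init__(self) -> None:
        # Reject any subtype not listed in the Literal above.
        allowed = set(get_args(TASK_SUBTYPE))
        unknown = [s for s in self.task_subtypes if s not in allowed]
        if unknown:
            raise ValueError(f"Unknown task subtypes: {unknown}")


# Accepted only because "Emotion classification" is now part of the Literal.
meta = TaskMetadataSketch(
    name="NusaParagraphEmotionClassification",
    task_subtypes=["Emotion classification"],
)
```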
1 change: 1 addition & 0 deletions mteb/tasks/BitextMining/__init__.py
@@ -11,6 +11,7 @@
from .multilingual.IN22GenBitextMining import *
from .multilingual.IndicGenBenchFloresBitextMining import *
from .multilingual.IWSLT2017BitextMinig import *
from .multilingual.NollySentiBitextMining import *
from .multilingual.NorwegianCourtsBitextMining import *
from .multilingual.NTREXBitextMining import *
from .multilingual.NusaTranslationBitextMining import *
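The single added line in `mteb/tasks/BitextMining/__init__.py` re-exports the new `NollySentiBitextMining` module alongside the existing ones, which is what makes the task importable directly from the package. The sketch below illustrates the generic star-import re-export pattern with a hypothetical package and placeholder class; it is not MTEB's actual task class.

```python
# A generic two-file sketch of the re-export pattern (hypothetical package
# "demo_tasks"; the class body is a placeholder, not MTEB's task class).

# --- demo_tasks/multilingual/nolly_senti.py ----------------------------------
__all__ = ["NollySentiBitextMiningDemo"]  # names re-exported by `import *`


class NollySentiBitextMiningDemo:
    """Placeholder for a BitextMining task class."""


# --- demo_tasks/__init__.py ---------------------------------------------------
# Mirrors the added line in mteb/tasks/BitextMining/__init__.py:
#   from .multilingual.nolly_senti import *

# --- usage --------------------------------------------------------------------
# Because __init__.py re-exports it, the class can be imported straight from
# the package, which is how a newly added task becomes visible to callers:
#   from demo_tasks import NollySentiBitextMiningDemo
```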