The leaderboard table in this section is rendered from a Hugging Face dataset. Its columns are listed below; for each column the dataset viewer reports the dtype and either the number of distinct values or the observed minimum and maximum. Each record that follows lists its values one per line, in exactly this column order.

| Column | Dtype | Distinct values / observed range |
|---|---|---|
| model_type | stringclasses | 5 values |
| model | stringlengths | 12 to 62 characters |
| AVG | float64 | 0.03 to 0.74 |
| CG | float64 | 0 to 0.76 |
| EL | float64 | 0 to 0.77 |
| FA | float64 | 0 to 0.62 |
| HE | float64 | 0 to 0.83 |
| MC | float64 | 0 to 0.95 |
| MR | float64 | 0 to 0.95 |
| MT | float64 | 0.19 to 0.86 |
| NLI | float64 | 0 to 0.97 |
| QA | float64 | 0 to 0.77 |
| RC | float64 | 0 to 0.94 |
| SUM | float64 | 0 to 0.29 |
| aio_char_f1 | float64 | 0 to 0.9 |
| alt-e-to-j_bert_score_ja_f1 | float64 | 0 to 0.88 |
| alt-e-to-j_bleu_ja | float64 | 0 to 16 |
| alt-e-to-j_comet_wmt22 | float64 | 0.2 to 0.92 |
| alt-j-to-e_bert_score_en_f1 | float64 | 0 to 0.96 |
| alt-j-to-e_bleu_en | float64 | 0 to 20.1 |
| alt-j-to-e_comet_wmt22 | float64 | 0.17 to 0.89 |
| chabsa_set_f1 | float64 | 0 to 0.77 |
| commonsensemoralja_exact_match | float64 | 0 to 0.94 |
| jamp_exact_match | float64 | 0 to 1 |
| janli_exact_match | float64 | 0 to 1 |
| jcommonsenseqa_exact_match | float64 | 0 to 0.98 |
| jemhopqa_char_f1 | float64 | 0 to 0.71 |
| jmmlu_exact_match | float64 | 0 to 0.81 |
| jnli_exact_match | float64 | 0 to 0.94 |
| jsem_exact_match | float64 | 0 to 0.96 |
| jsick_exact_match | float64 | 0 to 0.93 |
| jsquad_char_f1 | float64 | 0 to 0.94 |
| jsts_pearson | float64 | -0.35 to 0.94 |
| jsts_spearman | float64 | -0.6 to 0.91 |
| kuci_exact_match | float64 | 0 to 0.93 |
| mawps_exact_match | float64 | 0 to 0.95 |
| mbpp_code_exec | float64 | 0 to 0.76 |
| mbpp_pylint_check | float64 | 0 to 1 |
| mmlu_en_exact_match | float64 | 0 to 0.86 |
| niilc_char_f1 | float64 | 0 to 0.7 |
| wiki_coreference_set_f1 | float64 | 0 to 0.4 |
| wiki_dependency_set_f1 | float64 | 0 to 0.89 |
| wiki_ner_set_f1 | float64 | 0 to 0.33 |
| wiki_pas_set_f1 | float64 | 0 to 0.57 |
| wiki_reading_char_f1 | float64 | 0 to 0.94 |
| wikicorpus-e-to-j_bert_score_ja_f1 | float64 | 0 to 0.88 |
| wikicorpus-e-to-j_bleu_ja | float64 | 0 to 24 |
| wikicorpus-e-to-j_comet_wmt22 | float64 | 0.18 to 0.87 |
| wikicorpus-j-to-e_bert_score_en_f1 | float64 | 0 to 0.93 |
| wikicorpus-j-to-e_bleu_en | float64 | 0 to 15.9 |
| wikicorpus-j-to-e_comet_wmt22 | float64 | 0.17 to 0.79 |
| xlsum_ja_bert_score_ja_f1 | float64 | 0 to 0.79 |
| xlsum_ja_bleu_ja | float64 | 0 to 10.2 |
| xlsum_ja_rouge1 | float64 | 0 to 54 |
| xlsum_ja_rouge2 | float64 | 0 to 29.2 |
| xlsum_ja_rouge2_scaling | float64 | 0 to 0.29 |
| xlsum_ja_rougeLsum | float64 | 0 to 45.6 |
| architecture | stringclasses | 12 values |
| precision | stringclasses | 3 values |
| license | stringclasses | 15 values |
| params | float64 | 0 to 70.6 |
| likes | int64 | 0 to 6.19k |
| revision | stringclasses | 1 value |
| num_few_shot | int64 | 0 to 4 |
| add_special_tokens | stringclasses | 2 values |
| llm_jp_eval_version | stringclasses | 1 value |
| vllm_version | stringclasses | 1 value |
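Since this section is a dataset-viewer export, the underlying split can usually be pulled straight into pandas. The sketch below assumes a hypothetical repository ID; `example-org/llm-jp-eval-leaderboard` is a placeholder, not the actual dataset name.

```python
# Minimal sketch: load a leaderboard split from the Hugging Face Hub into pandas.
# NOTE: the repository ID below is a placeholder, not the real dataset behind this page.
from datasets import load_dataset

ds = load_dataset("example-org/llm-jp-eval-leaderboard", split="train")
df = ds.to_pandas()

# Inspect the headline columns described in the schema above.
print(df[["model", "model_type", "num_few_shot", "AVG"]].head())
```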
⭕ : instruction-tuned
|
meta-llama/Meta-Llama-3-8B-Instruct
| 0.2164
| 0
| 0.0052
| 0.061
| 0.0016
| 0.3121
| 0.022
| 0.764
| 0.3398
| 0.1971
| 0.664
| 0.0138
| 0.1527
| 0.8169
| 8.0639
| 0.8448
| 0.9364
| 11.7792
| 0.8494
| 0.0052
| 0
| 0.4368
| 0.5681
| 0.5898
| 0.2475
| 0
| 0.2999
| 0.0038
| 0.3905
| 0.664
| 0.7159
| 0.7167
| 0.3465
| 0.022
| 0
| 0.012
| 0.0033
| 0.191
| 0
| 0.0038
| 0
| 0
| 0.301
| 0.7386
| 5.917
| 0.7027
| 0.874
| 8.2024
| 0.659
| 0.6158
| 0.341
| 5.7664
| 1.371
| 0.0138
| 5.064
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 8.03
| 3,974
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
meta-llama/Meta-Llama-3-8B-Instruct
| 0.4962
| 0
| 0.4758
| 0.2329
| 0.5383
| 0.753
| 0.732
| 0.8226
| 0.6059
| 0.3892
| 0.8946
| 0.0138
| 0.3947
| 0.8444
| 10.3245
| 0.8826
| 0.9437
| 14.5237
| 0.8623
| 0.4758
| 0.7801
| 0.5144
| 0.6431
| 0.8767
| 0.3812
| 0.4668
| 0.6109
| 0.6439
| 0.6172
| 0.8946
| 0.8216
| 0.8015
| 0.6024
| 0.732
| 0
| 0.012
| 0.6098
| 0.3918
| 0.0189
| 0.3115
| 0.0796
| 0.0805
| 0.674
| 0.8115
| 8.9124
| 0.8116
| 0.8951
| 9.6677
| 0.7337
| 0.6158
| 0.341
| 5.7664
| 1.371
| 0.0138
| 5.064
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 8.03
| 3,974
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.2-3B
| 0.2124
| 0.1707
| 0.0098
| 0.0394
| 0.0288
| 0.3178
| 0.016
| 0.6915
| 0.4239
| 0.1329
| 0.4803
| 0.0256
| 0.1056
| 0.7762
| 6.4825
| 0.7696
| 0.9114
| 9.3192
| 0.803
| 0.0098
| 0.5303
| 0.3793
| 0.5
| 0.1716
| 0.1396
| 0.0364
| 0.3016
| 0.6414
| 0.2971
| 0.4803
| 0
| 0
| 0.2515
| 0.016
| 0.1707
| 0.5141
| 0.0211
| 0.1536
| 0
| 0.002
| 0
| 0
| 0.1951
| 0.7016
| 4.4551
| 0.6147
| 0.8282
| 4.6005
| 0.5786
| 0.5844
| 1.066
| 12.6667
| 2.5479
| 0.0256
| 10.9324
|
LlamaForCausalLM
|
bfloat16
|
llama3.2
| 3.213
| 568
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.2-3B
| 0.4225
| 0.1707
| 0.3862
| 0.1803
| 0.409
| 0.5501
| 0.616
| 0.7875
| 0.3375
| 0.3568
| 0.828
| 0.0256
| 0.3136
| 0.8282
| 9.0363
| 0.8587
| 0.9381
| 13.7033
| 0.8508
| 0.3862
| 0.7347
| 0.3391
| 0.5264
| 0.5898
| 0.4377
| 0.3423
| 0.3053
| 0.3327
| 0.1841
| 0.828
| 0.0481
| 0.0889
| 0.3259
| 0.616
| 0.1707
| 0.5141
| 0.4758
| 0.3191
| 0.0106
| 0.217
| 0.0177
| 0.0543
| 0.6019
| 0.7744
| 8.4631
| 0.7492
| 0.8819
| 9.0517
| 0.6912
| 0.5844
| 1.066
| 12.6667
| 2.5479
| 0.0256
| 10.9324
|
LlamaForCausalLM
|
bfloat16
|
llama3.2
| 3.213
| 568
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.2-1B
| 0.2788
| 0.1044
| 0.2863
| 0.0576
| 0.254
| 0.3175
| 0.158
| 0.6971
| 0.3619
| 0.2338
| 0.5554
| 0.0405
| 0.1684
| 0.7838
| 6.7944
| 0.7733
| 0.9181
| 11.059
| 0.7996
| 0.2863
| 0.493
| 0.3563
| 0.4986
| 0.2082
| 0.3274
| 0.2448
| 0.5559
| 0.1667
| 0.2322
| 0.5554
| -0.1145
| -0.1219
| 0.2514
| 0.158
| 0.1044
| 0.5964
| 0.2631
| 0.2056
| 0.0039
| 0.0802
| 0
| 0.0196
| 0.1842
| 0.7014
| 6.145
| 0.6129
| 0.8536
| 6.4619
| 0.6024
| 0.5952
| 1.1235
| 12.2139
| 4.0405
| 0.0405
| 10.3837
|
LlamaForCausalLM
|
bfloat16
|
llama3.2
| 1.236
| 1,910
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.2-1B
| 0.1055
| 0.1044
| 0
| 0.0343
| 0.0489
| 0.115
| 0.006
| 0.4856
| 0
| 0.1045
| 0.2211
| 0.0405
| 0.0655
| 0.6909
| 0.4139
| 0.5445
| 0.7509
| 0.2066
| 0.5013
| 0
| 0.1906
| 0
| 0
| 0.0009
| 0.1438
| 0.0124
| 0
| 0
| 0
| 0.2211
| -0.0273
| -0.0221
| 0.1533
| 0.006
| 0.1044
| 0.5964
| 0.0853
| 0.1042
| 0
| 0
| 0
| 0
| 0.1714
| 0.6585
| 0.6969
| 0.4661
| 0.7447
| 0.2973
| 0.4303
| 0.5952
| 1.1235
| 12.2139
| 4.0405
| 0.0405
| 10.3837
|
LlamaForCausalLM
|
bfloat16
|
llama3.2
| 1.236
| 1,910
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.1-70B
| 0.5941
| 0.0141
| 0.5255
| 0.2959
| 0.6845
| 0.8791
| 0.88
| 0.8556
| 0.7397
| 0.6487
| 0.9229
| 0.089
| 0.7296
| 0.8715
| 13.7469
| 0.911
| 0.9583
| 18.699
| 0.8902
| 0.5255
| 0.9063
| 0.5977
| 0.7889
| 0.9419
| 0.6145
| 0.6351
| 0.7506
| 0.7961
| 0.765
| 0.9229
| 0.9018
| 0.8696
| 0.7891
| 0.88
| 0.0141
| 0.0281
| 0.7339
| 0.602
| 0.0565
| 0.3599
| 0.1239
| 0.0671
| 0.872
| 0.8556
| 15.4783
| 0.8542
| 0.9143
| 12.6925
| 0.767
| 0.6861
| 2.1497
| 22.176
| 8.9001
| 0.089
| 19.3898
|
LlamaForCausalLM
|
bfloat16
|
llama3.1
| 70.554
| 366
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Llama-3.1-70B
| 0.4186
| 0.0141
| 0.3199
| 0.1591
| 0.2133
| 0.5708
| 0.494
| 0.838
| 0.6159
| 0.3977
| 0.8933
| 0.089
| 0.4266
| 0.8508
| 11.455
| 0.9003
| 0.9533
| 16.5452
| 0.8831
| 0.3199
| 0.7375
| 0.4856
| 0.6778
| 0.5541
| 0.3359
| 0.1065
| 0.4281
| 0.7544
| 0.7335
| 0.8933
| 0.69
| 0.7523
| 0.421
| 0.494
| 0.0141
| 0.0281
| 0.3202
| 0.4306
| 0
| 0.0179
| 0.0177
| 0.0024
| 0.7572
| 0.8049
| 8.9761
| 0.8185
| 0.9031
| 10.4305
| 0.75
| 0.6861
| 2.1497
| 22.176
| 8.9001
| 0.089
| 19.3898
|
LlamaForCausalLM
|
bfloat16
|
llama3.1
| 70.554
| 366
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-1.1-7b-it
| 0.149
| 0
| 0.0082
| 0.0191
| 0.2669
| 0.3215
| 0.056
| 0.2866
| 0.4414
| 0.0915
| 0.1224
| 0.0257
| 0.0225
| 0.6226
| 0.0924
| 0.3582
| 0.8167
| 3.2642
| 0.3855
| 0.0082
| 0.4679
| 0.3391
| 0.5
| 0.2502
| 0.1948
| 0.2776
| 0.5534
| 0.5934
| 0.2212
| 0.1224
| 0.0158
| -0.0669
| 0.2464
| 0.056
| 0
| 0
| 0.2562
| 0.0573
| 0
| 0
| 0.0177
| 0
| 0.0779
| 0.5129
| 0.0698
| 0.1785
| 0.765
| 2.6153
| 0.2243
| 0.5724
| 1.4248
| 14.2917
| 2.5693
| 0.0257
| 11.9555
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 273
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-1.1-7b-it
| 0.0655
| 0
| 0
| 0.0167
| 0.0089
| 0.0909
| 0
| 0.3642
| 0
| 0.0603
| 0.1541
| 0.0257
| 0.0331
| 0.6638
| 0.2425
| 0.4453
| 0.7408
| 0.3664
| 0.3466
| 0
| 0.0075
| 0
| 0
| 0.0715
| 0.0847
| 0
| 0
| 0
| 0
| 0.1541
| -0.0452
| -0.041
| 0.1937
| 0
| 0
| 0
| 0.0179
| 0.063
| 0
| 0
| 0
| 0
| 0.0835
| 0.5913
| 0.6693
| 0.3503
| 0.724
| 0.5251
| 0.3145
| 0.5724
| 1.4248
| 14.2917
| 2.5693
| 0.0257
| 11.9555
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 273
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-1.1-2b-it
| 0.0952
| 0
| 0
| 0.0247
| 0.0468
| 0.078
| 0.006
| 0.4931
| 0.0017
| 0.061
| 0.2914
| 0.0446
| 0.0499
| 0.6982
| 0.2783
| 0.548
| 0.7591
| 0.3118
| 0.5581
| 0
| 0
| 0
| 0.0083
| 0.0009
| 0.0443
| 0.0799
| 0
| 0
| 0
| 0.2914
| 0
| 0
| 0.2332
| 0.006
| 0
| 0
| 0.0136
| 0.0887
| 0
| 0
| 0
| 0
| 0.1235
| 0.6541
| 0.597
| 0.4451
| 0.7534
| 0.5065
| 0.421
| 0.6555
| 1.0033
| 24.5415
| 4.4743
| 0.0446
| 19.6346
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 159
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-1.1-2b-it
| 0.2876
| 0
| 0.2493
| 0.0624
| 0.3238
| 0.3532
| 0.328
| 0.5771
| 0.3758
| 0.2095
| 0.6401
| 0.0446
| 0.1351
| 0.7513
| 4.6594
| 0.6695
| 0.8988
| 9.3482
| 0.7202
| 0.2493
| 0.4647
| 0.3333
| 0.5
| 0.3432
| 0.3782
| 0.3067
| 0.3164
| 0.5676
| 0.1618
| 0.6401
| 0.0602
| 0.074
| 0.2517
| 0.328
| 0
| 0
| 0.341
| 0.1153
| 0.0063
| 0.0289
| 0.0531
| 0.0131
| 0.2107
| 0.5938
| 0.1743
| 0.3774
| 0.8342
| 4.955
| 0.5412
| 0.6555
| 1.0033
| 24.5415
| 4.4743
| 0.0446
| 19.6346
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 159
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-2b-it
| 0.274
| 0
| 0.2404
| 0.0464
| 0.3186
| 0.3678
| 0.238
| 0.552
| 0.4062
| 0.1644
| 0.631
| 0.0493
| 0.1191
| 0.7334
| 3.6557
| 0.6267
| 0.8865
| 7.8681
| 0.6728
| 0.2404
| 0.4749
| 0.3333
| 0.5
| 0.378
| 0.3014
| 0.2999
| 0.4671
| 0.5688
| 0.1618
| 0.631
| 0
| 0
| 0.2505
| 0.238
| 0
| 0
| 0.3373
| 0.0728
| 0.0033
| 0.0162
| 0.0265
| 0.0083
| 0.1778
| 0.6264
| 0.1199
| 0.4039
| 0.8268
| 4.9351
| 0.5046
| 0.6583
| 1.183
| 25.2621
| 4.9277
| 0.0493
| 19.8895
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 744
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-2b-it
| 0.0954
| 0
| 0
| 0.0228
| 0.0002
| 0.1768
| 0
| 0.4571
| 0.0721
| 0.0572
| 0.2137
| 0.0493
| 0.0453
| 0.6949
| 0.2568
| 0.5367
| 0.7531
| 0.1883
| 0.4722
| 0
| 0.384
| 0
| 0.3458
| 0
| 0.0324
| 0.0003
| 0.0148
| 0
| 0
| 0.2137
| -0.0554
| -0.0492
| 0.1464
| 0
| 0
| 0
| 0.0001
| 0.0939
| 0
| 0
| 0
| 0
| 0.114
| 0.6539
| 0.4013
| 0.4424
| 0.7482
| 0.1519
| 0.377
| 0.6583
| 1.183
| 25.2621
| 4.9277
| 0.0493
| 19.8895
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 744
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
google/gemma-7b
| 0.0383
| 0
| 0
| 0.0005
| 0.0004
| 0.1606
| 0
| 0.2273
| 0
| 0.0082
| 0.0241
| 0
| 0.0008
| 0.4752
| 0.2585
| 0.203
| 0.3384
| 0.4239
| 0.2719
| 0
| 0.0003
| 0
| 0
| 0.2386
| 0.0147
| 0.0006
| 0
| 0
| 0
| 0.0241
| 0.0017
| 0.0096
| 0.2429
| 0
| 0
| 0.8414
| 0.0002
| 0.0089
| 0
| 0
| 0
| 0
| 0.0027
| 0.4224
| 0.5033
| 0.188
| 0.4309
| 0.542
| 0.2463
| 0.2849
| 0.0329
| 0.0152
| 0
| 0
| 0.0149
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 3,168
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
google/gemma-7b
| 0.036
| 0
| 0
| 0.0005
| 0
| 0.1547
| 0.002
| 0.186
| 0.009
| 0.0105
| 0.0327
| 0
| 0.0005
| 0.4809
| 0.131
| 0.2199
| 0.6864
| 2.4121
| 0.1721
| 0
| 0.0008
| 0.0057
| 0
| 0.2154
| 0.0279
| 0
| 0.0008
| 0.0328
| 0.0057
| 0.0327
| -0.0083
| -0.009
| 0.248
| 0.002
| 0
| 0.8414
| 0
| 0.003
| 0
| 0
| 0
| 0
| 0.0026
| 0.4446
| 0.1832
| 0.1822
| 0.6847
| 3.6753
| 0.17
| 0.2849
| 0.0329
| 0.0152
| 0
| 0
| 0.0149
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 3,168
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-7b-it
| 0.045
| 0
| 0
| 0.0212
| 0.0001
| 0
| 0
| 0.3439
| 0
| 0.0387
| 0.0606
| 0.03
| 0.0244
| 0.6495
| 0.1817
| 0.4037
| 0.7356
| 0.0919
| 0.3215
| 0
| 0
| 0
| 0
| 0
| 0.0627
| 0
| 0
| 0
| 0
| 0.0606
| 0.0047
| 0.0112
| 0
| 0
| 0
| 0.0361
| 0.0002
| 0.0289
| 0
| 0
| 0
| 0
| 0.1062
| 0.6091
| 0.3699
| 0.3527
| 0.7188
| 0.1897
| 0.2978
| 0.6181
| 1.961
| 18.9243
| 3.009
| 0.03
| 15.5497
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 1,174
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
google/gemma-7b-it
| 0.1498
| 0
| 0.004
| 0.0234
| 0.2555
| 0.3292
| 0.01
| 0.3307
| 0.5303
| 0.0721
| 0.0628
| 0.03
| 0.0463
| 0.6154
| 0.0374
| 0.3252
| 0.842
| 5.3864
| 0.4182
| 0.004
| 0.522
| 0.3276
| 0.4819
| 0.2154
| 0.125
| 0.2409
| 0.5534
| 0.6717
| 0.617
| 0.0628
| -0.0171
| -0.0193
| 0.2501
| 0.01
| 0
| 0.0361
| 0.2701
| 0.0451
| 0
| 0
| 0.0088
| 0
| 0.108
| 0.564
| 0.0654
| 0.2756
| 0.757
| 1.9916
| 0.3038
| 0.6181
| 1.961
| 18.9243
| 3.009
| 0.03
| 15.5497
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 8.538
| 1,174
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
google/gemma-2b
| 0.2882
| 0
| 0.3342
| 0.0775
| 0.3135
| 0.3492
| 0.23
| 0.5792
| 0.3935
| 0.1294
| 0.7133
| 0.0506
| 0.0673
| 0.7181
| 6.7048
| 0.6014
| 0.9083
| 10.2007
| 0.7617
| 0.3342
| 0.5862
| 0.3333
| 0.5
| 0.2109
| 0.2467
| 0.2923
| 0.3279
| 0.6471
| 0.1593
| 0.7133
| 0.05
| 0.0564
| 0.2506
| 0.23
| 0
| 0.3273
| 0.3348
| 0.0741
| 0.0009
| 0.0781
| 0.0531
| 0.029
| 0.2265
| 0.6219
| 1.4302
| 0.4864
| 0.8032
| 4.9462
| 0.4672
| 0.6424
| 1.8859
| 19.7977
| 5.0582
| 0.0506
| 16.1766
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 998
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
google/gemma-2b
| 0.0693
| 0
| 0
| 0.0349
| 0
| 0
| 0
| 0.4957
| 0
| 0.0359
| 0.1455
| 0.0506
| 0.0361
| 0.6923
| 0.235
| 0.5342
| 0.7523
| 0.3109
| 0.5812
| 0
| 0
| 0
| 0
| 0
| 0.0443
| 0
| 0
| 0
| 0
| 0.1455
| 0.0506
| 0.0499
| 0
| 0
| 0
| 0.3273
| 0
| 0.0272
| 0
| 0
| 0
| 0
| 0.1746
| 0.638
| 0.4192
| 0.4291
| 0.7161
| 0.3383
| 0.4384
| 0.6424
| 1.8859
| 19.7977
| 5.0582
| 0.0506
| 16.1766
|
GemmaForCausalLM
|
bfloat16
|
gemma
| 2.506
| 998
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
meta-llama/Llama-3.3-70B-Instruct
| 0.3749
| 0.0281
| 0.1971
| 0.1527
| 0.2179
| 0.849
| 0.026
| 0.8351
| 0.6637
| 0.42
| 0.6432
| 0.0918
| 0.4565
| 0.8626
| 13.3468
| 0.9045
| 0.956
| 17.4829
| 0.886
| 0.1971
| 0.9003
| 0.5891
| 0.5181
| 0.9035
| 0.4258
| 0.1053
| 0.597
| 0.791
| 0.8232
| 0.6432
| 0.8824
| 0.853
| 0.7431
| 0.026
| 0.0281
| 0.0522
| 0.3304
| 0.3777
| 0.0032
| 0.0266
| 0.0442
| 0.0023
| 0.6871
| 0.8091
| 10.7789
| 0.8156
| 0.8966
| 11.1103
| 0.7344
| 0.6878
| 3.1368
| 22.6097
| 9.1828
| 0.0918
| 20.1245
|
LlamaForCausalLM
|
bfloat16
|
llama3.3
| 70.554
| 2,335
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
meta-llama/Llama-3.3-70B-Instruct
| 0.6145
| 0.0281
| 0.5723
| 0.2659
| 0.7714
| 0.8874
| 0.944
| 0.8519
| 0.8016
| 0.6412
| 0.9042
| 0.0918
| 0.7084
| 0.8671
| 13.5853
| 0.9095
| 0.9577
| 18.1284
| 0.888
| 0.5723
| 0.9121
| 0.6724
| 0.9125
| 0.9455
| 0.6528
| 0.7266
| 0.8114
| 0.7942
| 0.8175
| 0.9042
| 0.8833
| 0.8483
| 0.8046
| 0.944
| 0.0281
| 0.0522
| 0.8162
| 0.5625
| 0.0418
| 0.3363
| 0.0442
| 0.0568
| 0.8501
| 0.8502
| 15.8625
| 0.8517
| 0.9089
| 12.3046
| 0.7584
| 0.6878
| 3.1368
| 22.6097
| 9.1828
| 0.0918
| 20.1245
|
LlamaForCausalLM
|
bfloat16
|
llama3.3
| 70.554
| 2,335
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
meta-llama/Meta-Llama-3-70B-Instruct
| 0.6257
| 0.2892
| 0.5891
| 0.2922
| 0.7256
| 0.8576
| 0.922
| 0.8509
| 0.7307
| 0.6125
| 0.9167
| 0.0957
| 0.673
| 0.8644
| 12.538
| 0.9044
| 0.9544
| 16.8976
| 0.8836
| 0.5891
| 0.881
| 0.6839
| 0.7194
| 0.9419
| 0.6102
| 0.6721
| 0.6873
| 0.779
| 0.7836
| 0.9167
| 0.8834
| 0.8525
| 0.7499
| 0.922
| 0.2892
| 0.4177
| 0.7792
| 0.5544
| 0.0416
| 0.3764
| 0.1239
| 0.0798
| 0.8393
| 0.8346
| 11.3104
| 0.8471
| 0.9097
| 11.0488
| 0.7685
| 0.6943
| 2.0042
| 24.8641
| 9.5922
| 0.0957
| 21.5437
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 70.554
| 1,472
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
meta-llama/Meta-Llama-3-70B-Instruct
| 0.5107
| 0.2892
| 0.242
| 0.1712
| 0.5743
| 0.7024
| 0.756
| 0.8417
| 0.6258
| 0.4434
| 0.8756
| 0.0957
| 0.484
| 0.8584
| 11.9451
| 0.9046
| 0.9475
| 15.5203
| 0.8733
| 0.242
| 0.7365
| 0.5345
| 0.6097
| 0.7909
| 0.381
| 0.5304
| 0.4737
| 0.7544
| 0.7566
| 0.8756
| 0.8678
| 0.8315
| 0.5799
| 0.756
| 0.2892
| 0.4177
| 0.6183
| 0.4652
| 0
| 0.0073
| 0.0531
| 0.0082
| 0.7874
| 0.8105
| 8.8033
| 0.8318
| 0.9034
| 10.031
| 0.7574
| 0.6943
| 2.0042
| 24.8641
| 9.5922
| 0.0957
| 21.5437
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 70.554
| 1,472
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Meta-Llama-3-70B
| 0.6273
| 0.2992
| 0.5339
| 0.2942
| 0.718
| 0.8799
| 0.948
| 0.8531
| 0.7315
| 0.6465
| 0.9212
| 0.0745
| 0.7347
| 0.8708
| 13.9748
| 0.9088
| 0.9581
| 18.9863
| 0.8888
| 0.5339
| 0.9013
| 0.5891
| 0.7806
| 0.9473
| 0.6056
| 0.6919
| 0.7079
| 0.7942
| 0.7857
| 0.9212
| 0.8903
| 0.855
| 0.7911
| 0.948
| 0.2992
| 0.4659
| 0.7441
| 0.5991
| 0.0306
| 0.3763
| 0.1416
| 0.0545
| 0.8682
| 0.8584
| 16.1156
| 0.8554
| 0.9138
| 12.5835
| 0.7594
| 0.6697
| 2.0276
| 18.0392
| 7.4331
| 0.0745
| 15.9677
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 70.554
| 859
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Meta-Llama-3-70B
| 0.5012
| 0.2992
| 0.3387
| 0.162
| 0.4276
| 0.6513
| 0.742
| 0.8352
| 0.603
| 0.4888
| 0.8908
| 0.0745
| 0.5411
| 0.8503
| 11.5686
| 0.898
| 0.9544
| 17.117
| 0.884
| 0.3387
| 0.7928
| 0.4626
| 0.6528
| 0.6765
| 0.4114
| 0.3849
| 0.3936
| 0.7475
| 0.7587
| 0.8908
| 0.7909
| 0.8021
| 0.4846
| 0.742
| 0.2992
| 0.4659
| 0.4702
| 0.5138
| 0
| 0.0193
| 0.0177
| 0.0064
| 0.7668
| 0.81
| 9.5953
| 0.8131
| 0.9041
| 10.688
| 0.7458
| 0.6697
| 2.0276
| 18.0392
| 7.4331
| 0.0745
| 15.9677
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 70.554
| 859
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Meta-Llama-3-8B
| 0.4886
| 0.0181
| 0.4059
| 0.2451
| 0.5186
| 0.668
| 0.704
| 0.8244
| 0.6068
| 0.4325
| 0.8889
| 0.0625
| 0.4481
| 0.8511
| 11.7555
| 0.8829
| 0.9494
| 15.7335
| 0.8733
| 0.4059
| 0.6568
| 0.454
| 0.5986
| 0.8293
| 0.4521
| 0.4473
| 0.6085
| 0.7317
| 0.6414
| 0.8889
| 0.7472
| 0.7306
| 0.5178
| 0.704
| 0.0181
| 0.0502
| 0.5899
| 0.3972
| 0.0084
| 0.3696
| 0.0531
| 0.0536
| 0.7409
| 0.8146
| 11.0876
| 0.8053
| 0.9005
| 11.05
| 0.7361
| 0.6283
| 1.7042
| 15.2962
| 6.2471
| 0.0625
| 13.5321
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 8.03
| 6,187
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
meta-llama/Meta-Llama-3-8B
| 0.293
| 0.0181
| 0.0194
| 0.0514
| 0.2166
| 0.3855
| 0.274
| 0.7839
| 0.3773
| 0.2665
| 0.7677
| 0.0625
| 0.265
| 0.8269
| 9.5102
| 0.861
| 0.9423
| 13.0823
| 0.8607
| 0.0194
| 0.5529
| 0.3276
| 0.5
| 0.3342
| 0.2906
| 0.1401
| 0.1619
| 0.6566
| 0.2403
| 0.7677
| 0.346
| 0.3449
| 0.2695
| 0.274
| 0.0181
| 0.0502
| 0.2932
| 0.2438
| 0
| 0.0006
| 0
| 0
| 0.2562
| 0.7569
| 7.1002
| 0.7336
| 0.8746
| 8.1538
| 0.6805
| 0.6283
| 1.7042
| 15.2962
| 6.2471
| 0.0625
| 13.5321
|
LlamaForCausalLM
|
bfloat16
|
llama3
| 8.03
| 6,187
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
llm-jp/llm-jp-3.1-13b-instruct4
| 0.6489
| 0.0281
| 0.7513
| 0.6006
| 0.5413
| 0.9395
| 0.72
| 0.8517
| 0.9625
| 0.6931
| 0.9427
| 0.1075
| 0.8045
| 0.8717
| 14.9427
| 0.9028
| 0.947
| 17.8751
| 0.864
| 0.7513
| 0.9376
| 0.9971
| 1
| 0.9598
| 0.6372
| 0.5267
| 0.9433
| 0.9609
| 0.9111
| 0.9427
| 0.9352
| 0.9071
| 0.9212
| 0.72
| 0.0281
| 0.0643
| 0.556
| 0.6375
| 0.3544
| 0.879
| 0.3274
| 0.4999
| 0.9421
| 0.8683
| 21.9099
| 0.8555
| 0.9249
| 15.3226
| 0.7845
| 0.7001
| 2.6706
| 31.4294
| 10.749
| 0.1075
| 26.3431
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 13.708
| 3
|
main
| 4
|
True
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
llm-jp/llm-jp-3.1-13b-instruct4
| 0.6525
| 0.0281
| 0.7652
| 0.6207
| 0.53
| 0.945
| 0.728
| 0.8501
| 0.965
| 0.6929
| 0.9448
| 0.1075
| 0.8068
| 0.8624
| 15.6982
| 0.8911
| 0.9443
| 18.4501
| 0.8552
| 0.7652
| 0.9384
| 0.9971
| 1
| 0.9705
| 0.6352
| 0.5078
| 0.94
| 0.9621
| 0.9255
| 0.9448
| 0.9375
| 0.9092
| 0.9262
| 0.728
| 0.0281
| 0.0643
| 0.5523
| 0.6368
| 0.4014
| 0.882
| 0.3097
| 0.5677
| 0.9425
| 0.8767
| 24.0066
| 0.864
| 0.9276
| 15.8636
| 0.7902
| 0.7001
| 2.6706
| 31.4294
| 10.749
| 0.1075
| 26.3431
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 13.708
| 3
|
main
| 0
|
True
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
llm-jp/llm-jp-3.1-1.8b-instruct4
| 0.5788
| 0
| 0.6167
| 0.5125
| 0.4345
| 0.8995
| 0.56
| 0.8458
| 0.9207
| 0.5653
| 0.924
| 0.0876
| 0.6161
| 0.8691
| 14.7817
| 0.9002
| 0.9481
| 15.9402
| 0.8705
| 0.6167
| 0.8833
| 0.9368
| 1
| 0.9294
| 0.5416
| 0.4293
| 0.908
| 0.8737
| 0.8851
| 0.924
| 0.9244
| 0.8961
| 0.8858
| 0.56
| 0
| 0
| 0.4398
| 0.5382
| 0.2833
| 0.828
| 0.1416
| 0.4029
| 0.9066
| 0.8568
| 19.6487
| 0.8452
| 0.9167
| 13.916
| 0.7672
| 0.6856
| 2.2083
| 25.6171
| 8.7508
| 0.0876
| 21.5264
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 1.868
| 1
|
main
| 4
|
True
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
llm-jp/llm-jp-3.1-1.8b-instruct4
| 0.5788
| 0
| 0.6494
| 0.5384
| 0.4216
| 0.9041
| 0.506
| 0.8498
| 0.9244
| 0.5596
| 0.9257
| 0.0876
| 0.6106
| 0.8701
| 14.1502
| 0.9003
| 0.9503
| 17.1237
| 0.8734
| 0.6494
| 0.89
| 0.9339
| 1
| 0.9339
| 0.5337
| 0.414
| 0.9092
| 0.8782
| 0.9005
| 0.9257
| 0.9267
| 0.8975
| 0.8885
| 0.506
| 0
| 0
| 0.4293
| 0.5346
| 0.2777
| 0.8565
| 0.2035
| 0.4376
| 0.9166
| 0.8618
| 20.4743
| 0.8499
| 0.9215
| 14.6513
| 0.7756
| 0.6856
| 2.2083
| 25.6171
| 8.7508
| 0.0876
| 21.5264
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 1.868
| 1
|
main
| 0
|
True
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
llm-jp/llm-jp-3.1-13b
| 0.5882
| 0.004
| 0.5203
| 0.4026
| 0.5261
| 0.8905
| 0.724
| 0.8545
| 0.8499
| 0.6827
| 0.9307
| 0.085
| 0.8321
| 0.8723
| 13.6152
| 0.9106
| 0.9576
| 18.3868
| 0.8885
| 0.5203
| 0.8923
| 0.7126
| 0.9806
| 0.9473
| 0.5706
| 0.5117
| 0.8669
| 0.8314
| 0.8581
| 0.9307
| 0.9196
| 0.8911
| 0.832
| 0.724
| 0.004
| 0.006
| 0.5405
| 0.6455
| 0.0299
| 0.7093
| 0.1327
| 0.238
| 0.9029
| 0.8553
| 16.1892
| 0.851
| 0.9162
| 12.7161
| 0.768
| 0.685
| 2.3289
| 21.9176
| 8.4942
| 0.085
| 19.1341
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 13.708
| 0
|
main
| 4
|
True
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
llm-jp/llm-jp-3.1-13b
| 0.4095
| 0.004
| 0.3876
| 0.1542
| 0.2512
| 0.5654
| 0.212
| 0.834
| 0.6639
| 0.4284
| 0.9184
| 0.085
| 0.5277
| 0.8481
| 13.228
| 0.9055
| 0.9437
| 16.8932
| 0.8496
| 0.3876
| 0
| 0.8362
| 0.5403
| 0.9142
| 0.3152
| 0.4222
| 0.8537
| 0.2045
| 0.8847
| 0.9184
| 0.9064
| 0.8793
| 0.782
| 0.212
| 0.004
| 0.006
| 0.0802
| 0.4423
| 0.0025
| 0.0069
| 0.0101
| 0.0931
| 0.6586
| 0.812
| 10.0165
| 0.8353
| 0.9037
| 10.9126
| 0.7455
| 0.685
| 2.3289
| 21.9176
| 8.4942
| 0.085
| 19.1341
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 13.708
| 0
|
main
| 0
|
True
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
llm-jp/llm-jp-3.1-1.8b
| 0.4928
| 0
| 0.4566
| 0.2997
| 0.4041
| 0.7019
| 0.58
| 0.8263
| 0.6552
| 0.5253
| 0.8832
| 0.0889
| 0.6205
| 0.8575
| 11.9047
| 0.8955
| 0.9477
| 15.1667
| 0.8717
| 0.4566
| 0.755
| 0.5029
| 0.6681
| 0.7292
| 0.4734
| 0.4106
| 0.7831
| 0.7664
| 0.5557
| 0.8832
| 0.7827
| 0.5909
| 0.6216
| 0.58
| 0
| 0
| 0.3977
| 0.482
| 0.0139
| 0.5153
| 0.0973
| 0.0851
| 0.7869
| 0.8023
| 8.8342
| 0.8035
| 0.901
| 10.8811
| 0.7344
| 0.6873
| 2.3093
| 23.9284
| 8.8985
| 0.0889
| 20.2442
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 1.868
| 0
|
main
| 4
|
True
|
v1.4.1
|
v0.6.3.post1
|
🟢 : pretrained
|
llm-jp/llm-jp-3.1-1.8b
| 0.3772
| 0
| 0.2045
| 0.1382
| 0.2004
| 0.649
| 0.196
| 0.8171
| 0.7007
| 0.2839
| 0.8704
| 0.0889
| 0.3739
| 0.8387
| 11.0389
| 0.8932
| 0.9453
| 13.6524
| 0.8668
| 0.2045
| 0.5874
| 0.569
| 0.6694
| 0.7105
| 0.1858
| 0.1418
| 0.7732
| 0.6806
| 0.8114
| 0.8704
| 0.7751
| 0.7857
| 0.649
| 0.196
| 0
| 0
| 0.2591
| 0.2919
| 0
| 0.0135
| 0
| 0.005
| 0.6724
| 0.7691
| 7.2238
| 0.7829
| 0.8939
| 9.2184
| 0.7255
| 0.6873
| 2.3093
| 23.9284
| 8.8985
| 0.0889
| 20.2442
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 1.868
| 0
|
main
| 0
|
True
|
v1.4.1
|
v0.6.3.post1
|
🤝 : base merges and moerges
|
DreadPoor/Irix-12B-Model_Stock
| 0.5179
| 0.4839
| 0.3289
| 0.1535
| 0.4789
| 0.7513
| 0.674
| 0.8316
| 0.6508
| 0.3845
| 0.8501
| 0.1095
| 0.4532
| 0.8412
| 10.2281
| 0.8991
| 0.9489
| 14.3765
| 0.8779
| 0.3289
| 0.8001
| 0.5201
| 0.7528
| 0.8543
| 0.3182
| 0.4434
| 0.4334
| 0.762
| 0.7859
| 0.8501
| 0.851
| 0.8087
| 0.5994
| 0.674
| 0.4839
| 0.9578
| 0.5145
| 0.3822
| 0.0108
| 0.003
| 0.0619
| 0.011
| 0.6807
| 0.7897
| 8.064
| 0.8115
| 0.893
| 9.0049
| 0.738
| 0.7016
| 2.9623
| 28.906
| 10.9409
| 0.1095
| 24.9871
|
MistralForCausalLM
|
bfloat16
| 12.248
| 17
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
|
🤝 : base merges and moerges
|
DreadPoor/Irix-12B-Model_Stock
| 0.5967
| 0.4839
| 0.5329
| 0.2627
| 0.6044
| 0.8483
| 0.762
| 0.8435
| 0.6972
| 0.5165
| 0.903
| 0.1095
| 0.5393
| 0.8574
| 12.0996
| 0.9046
| 0.9515
| 15.8011
| 0.8817
| 0.5329
| 0.895
| 0.5115
| 0.7778
| 0.9258
| 0.5083
| 0.5563
| 0.7609
| 0.7683
| 0.6675
| 0.903
| 0.8728
| 0.8468
| 0.724
| 0.762
| 0.4839
| 0.9578
| 0.6524
| 0.5017
| 0.0479
| 0.3445
| 0.0708
| 0.0782
| 0.7722
| 0.8204
| 10.2716
| 0.8343
| 0.9002
| 9.8325
| 0.7535
| 0.7016
| 2.9623
| 28.906
| 10.9409
| 0.1095
| 24.9871
|
MistralForCausalLM
|
bfloat16
| 12.248
| 17
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
|
🤝 : base merges and moerges
|
yamatazen/Twilight-SCE-12B-v2
| 0.2821
| 0.3092
| 0.2124
| 0.2439
| 0.001
| 0.5506
| 0.056
| 0.8301
| 0.2204
| 0.3091
| 0.2396
| 0.1303
| 0.2567
| 0.8572
| 11.3708
| 0.9049
| 0.948
| 15.704
| 0.8714
| 0.2124
| 0
| 0.158
| 0
| 0.9214
| 0.2544
| 0.002
| 0.3381
| 0.2418
| 0.3641
| 0.2396
| 0.8695
| 0.838
| 0.7303
| 0.056
| 0.3092
| 0.5502
| 0.0001
| 0.4163
| 0.0238
| 0.3194
| 0.0531
| 0.0789
| 0.744
| 0.8152
| 10.1501
| 0.8264
| 0.8868
| 9.9912
| 0.7179
| 0.7189
| 2.8479
| 38.0982
| 13.0247
| 0.1303
| 30.8768
|
MistralForCausalLM
|
bfloat16
| 12.248
| 4
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
|
🤝 : base merges and moerges
|
yamatazen/Twilight-SCE-12B-v2
| 0.2902
| 0.3092
| 0.1292
| 0.1461
| 0.0055
| 0.439
| 0.54
| 0.7984
| 0.1402
| 0.2666
| 0.2876
| 0.1303
| 0.3922
| 0.8383
| 10.5277
| 0.8963
| 0.9376
| 14.8491
| 0.8384
| 0.1292
| 0
| 0.2644
| 0
| 0.7632
| 0.0493
| 0.0011
| 0.1582
| 0
| 0.2783
| 0.2876
| 0.8544
| 0.8098
| 0.5539
| 0.54
| 0.3092
| 0.5502
| 0.01
| 0.3583
| 0.0092
| 0.0012
| 0.0265
| 0.0044
| 0.6893
| 0.7828
| 7.6749
| 0.8051
| 0.8638
| 8.4941
| 0.6539
| 0.7189
| 2.8479
| 38.0982
| 13.0247
| 0.1303
| 30.8768
|
MistralForCausalLM
|
bfloat16
| 12.248
| 4
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
|
⭕ : instruction-tuned
|
deep-analysis-research/test-qwen2.5-32b
| 0.6553
| 0.5281
| 0.5894
| 0.2737
| 0.7757
| 0.8966
| 0.944
| 0.8479
| 0.8106
| 0.541
| 0.9047
| 0.097
| 0.553
| 0.8644
| 13.2738
| 0.9081
| 0.9554
| 17.7737
| 0.8859
| 0.5894
| 0.8975
| 0.6724
| 0.8431
| 0.958
| 0.5672
| 0.7515
| 0.8973
| 0.7835
| 0.8569
| 0.9047
| 0.8895
| 0.877
| 0.8343
| 0.944
| 0.5281
| 0.755
| 0.8
| 0.5029
| 0.0543
| 0.3837
| 0
| 0.1104
| 0.8204
| 0.8291
| 10.9975
| 0.8389
| 0.9045
| 11.1213
| 0.7585
| 0.6926
| 2.7959
| 25.855
| 9.7054
| 0.097
| 22.5323
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 0
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
deep-analysis-research/test-qwen2.5-32b
| 0.5443
| 0.5281
| 0.107
| 0.1453
| 0.568
| 0.8739
| 0.79
| 0.8386
| 0.7647
| 0.3873
| 0.8871
| 0.097
| 0.4392
| 0.8489
| 11.3776
| 0.9009
| 0.9511
| 15.766
| 0.8797
| 0.107
| 0.9046
| 0.6494
| 0.7944
| 0.9303
| 0.2681
| 0.5561
| 0.82
| 0.798
| 0.7615
| 0.8871
| 0.8951
| 0.8761
| 0.7869
| 0.79
| 0.5281
| 0.755
| 0.58
| 0.4547
| 0.0281
| 0.0071
| 0.0354
| 0.0058
| 0.6499
| 0.8004
| 8.7234
| 0.8268
| 0.8969
| 9.5439
| 0.7471
| 0.6926
| 2.7959
| 25.855
| 9.7054
| 0.097
| 22.5323
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 0
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/test-qwen2.5-14b-wo-system-v1
| 0.623
| 0
| 0.6726
| 0.5121
| 0.7244
| 0.9311
| 0.856
| 0.7405
| 0.9341
| 0.4865
| 0.9044
| 0.091
| 0.5712
| 0.847
| 15.0786
| 0.838
| 0.9572
| 19.1294
| 0.887
| 0.6726
| 0.9279
| 0.9511
| 0.9986
| 0.9643
| 0.5762
| 0.6865
| 0.9228
| 0.8939
| 0.9042
| 0.9044
| 0.927
| 0.9012
| 0.9013
| 0.856
| 0
| 0.004
| 0.7622
| 0.3121
| 0.2985
| 0.8677
| 0
| 0.4678
| 0.9266
| 0.67
| 20.8196
| 0.5074
| 0.9047
| 14.9092
| 0.7295
| 0.6743
| 7.4941
| 17.3256
| 9.1051
| 0.091
| 15.5768
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 14.77
| 0
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/test-qwen2.5-14b-wo-system-v1
| 0.5436
| 0
| 0.5998
| 0.4853
| 0.6351
| 0.935
| 0.464
| 0.6654
| 0.9426
| 0.3284
| 0.833
| 0.091
| 0.4524
| 0.7093
| 14.3812
| 0.5919
| 0.957
| 18.8974
| 0.8858
| 0.5998
| 0.9331
| 0.9655
| 1
| 0.9643
| 0.4843
| 0.6176
| 0.933
| 0.8971
| 0.9176
| 0.833
| 0.9273
| 0.9028
| 0.9076
| 0.464
| 0
| 0.004
| 0.6525
| 0.0485
| 0.242
| 0.8858
| 0
| 0.379
| 0.9197
| 0.6568
| 21.8939
| 0.4864
| 0.8944
| 15.0852
| 0.6977
| 0.6743
| 7.4941
| 17.3256
| 9.1051
| 0.091
| 15.5768
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 14.77
| 0
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
yellowtown/Japanese-Qwen2.5-14B-Instruct-V1
| 0.6638
| 0.5964
| 0.62
| 0.4027
| 0.704
| 0.8808
| 0.822
| 0.7893
| 0.8198
| 0.5275
| 0.8783
| 0.2605
| 0.5448
| 0.8639
| 12.5691
| 0.9002
| 0.926
| 16.4489
| 0.797
| 0.62
| 0.8875
| 0.6925
| 0.9417
| 0.95
| 0.5597
| 0.6792
| 0.8862
| 0.7809
| 0.7978
| 0.8783
| 0.9125
| 0.8842
| 0.8048
| 0.822
| 0.5964
| 0.8996
| 0.7289
| 0.4779
| 0.1603
| 0.71
| 0.0177
| 0.2513
| 0.8744
| 0.8341
| 13.1918
| 0.8345
| 0.8647
| 11.6461
| 0.6256
| 0.7828
| 7.5932
| 51.3622
| 26.0449
| 0.2605
| 43.3101
|
Qwen2ForCausalLM
|
bfloat16
|
other
| 14.77
| 0
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
yellowtown/Japanese-Qwen2.5-14B-Instruct-V1
| 0.6812
| 0.5964
| 0.6519
| 0.408
| 0.7403
| 0.8899
| 0.868
| 0.7778
| 0.8145
| 0.5723
| 0.914
| 0.2605
| 0.5586
| 0.8647
| 12.9409
| 0.9014
| 0.9535
| 16.9098
| 0.8802
| 0.6519
| 0.8798
| 0.6839
| 0.9014
| 0.9625
| 0.6072
| 0.7083
| 0.8887
| 0.7816
| 0.8171
| 0.914
| 0.9193
| 0.8896
| 0.8275
| 0.868
| 0.5964
| 0.8996
| 0.7723
| 0.551
| 0.1704
| 0.7174
| 0.0088
| 0.2584
| 0.885
| 0.7923
| 14.4908
| 0.7568
| 0.8528
| 11.5903
| 0.5727
| 0.7828
| 7.5932
| 51.3622
| 26.0449
| 0.2605
| 43.3101
|
Qwen2ForCausalLM
|
bfloat16
|
other
| 14.77
| 0
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/D2IL-Japanese-Qwen2.5-32B-Instruct-v0.1
| 0.71
| 0.6084
| 0.6782
| 0.4321
| 0.7902
| 0.9139
| 0.938
| 0.7954
| 0.8793
| 0.5897
| 0.9005
| 0.2843
| 0.5928
| 0.8642
| 14.0819
| 0.8872
| 0.9557
| 18.1003
| 0.8832
| 0.6782
| 0.9123
| 0.7874
| 0.9917
| 0.9651
| 0.6449
| 0.7673
| 0.9104
| 0.8251
| 0.8821
| 0.9005
| 0.9345
| 0.9101
| 0.8643
| 0.938
| 0.6084
| 0.8153
| 0.8131
| 0.5313
| 0.2131
| 0.7041
| 0.0088
| 0.3116
| 0.9228
| 0.8497
| 16.6929
| 0.8414
| 0.8532
| 12.2253
| 0.5697
| 0.7925
| 8.0612
| 53.4088
| 28.4139
| 0.2843
| 45.313
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 0
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/D2IL-Japanese-Qwen2.5-32B-Instruct-v0.1
| 0.5729
| 0.6084
| 0.5899
| 0.4076
| 0.7265
| 0.9047
| 0.032
| 0.8139
| 0.8653
| 0.4293
| 0.6402
| 0.2843
| 0.5512
| 0.8713
| 13.7411
| 0.8999
| 0.957
| 17.9677
| 0.8867
| 0.5899
| 0.9028
| 0.7931
| 0.9833
| 0.9589
| 0.5787
| 0.6696
| 0.8747
| 0.8131
| 0.8624
| 0.6402
| 0.9334
| 0.9095
| 0.8524
| 0.032
| 0.6084
| 0.8153
| 0.7835
| 0.1579
| 0.1409
| 0.7266
| 0.0265
| 0.2361
| 0.9078
| 0.8461
| 16.1777
| 0.8408
| 0.8696
| 12.0218
| 0.6282
| 0.7925
| 8.0612
| 53.4088
| 28.4139
| 0.2843
| 45.313
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 0
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟦 : RL-tuned (Preference optimization)
|
RekaAI/reka-flash-3.1
| 0.3808
| 0
| 0.2905
| 0.1496
| 0.4779
| 0.5621
| 0.15
| 0.7756
| 0.6233
| 0.2309
| 0.8311
| 0.0982
| 0.2234
| 0.7969
| 7.0794
| 0.7958
| 0.9438
| 12.9156
| 0.8699
| 0.2905
| 0.5829
| 0.523
| 0.7125
| 0.6604
| 0.1869
| 0.4462
| 0.5887
| 0.7569
| 0.5352
| 0.8311
| 0.4547
| 0.4738
| 0.4429
| 0.15
| 0
| 0
| 0.5095
| 0.2823
| 0.0083
| 0.0192
| 0.0354
| 0.0019
| 0.6831
| 0.742
| 5.2058
| 0.7097
| 0.8884
| 8.601
| 0.727
| 0.6981
| 2.5179
| 28.2482
| 9.8321
| 0.0982
| 24.2453
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 20.905
| 65
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🟦 : RL-tuned (Preference optimization)
|
RekaAI/reka-flash-3.1
| 0.5453
| 0
| 0.5609
| 0.2911
| 0.6036
| 0.8484
| 0.746
| 0.8352
| 0.6955
| 0.4184
| 0.9013
| 0.0982
| 0.3842
| 0.8577
| 12.0096
| 0.9023
| 0.9501
| 15.3753
| 0.8786
| 0.5609
| 0.8742
| 0.6207
| 0.7514
| 0.9142
| 0.4479
| 0.5755
| 0.749
| 0.7374
| 0.6192
| 0.9013
| 0.8941
| 0.8614
| 0.7568
| 0.746
| 0
| 0
| 0.6317
| 0.4229
| 0.0034
| 0.4425
| 0.0973
| 0.0751
| 0.8372
| 0.8037
| 8.7905
| 0.8103
| 0.8992
| 9.6683
| 0.7496
| 0.6981
| 2.5179
| 28.2482
| 9.8321
| 0.0982
| 24.2453
|
LlamaForCausalLM
|
bfloat16
|
apache-2.0
| 20.905
| 65
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
tokyotech-llm/Gemma-2-Llama-Swallow-27b-it-v0.1
| 0.5305
| 0.0341
| 0.494
| 0.2457
| 0.5808
| 0.6566
| 0.798
| 0.8342
| 0.6334
| 0.5383
| 0.9159
| 0.1044
| 0.72
| 0.8736
| 13.7033
| 0.9095
| 0.9534
| 17.9655
| 0.8792
| 0.494
| 0.6308
| 0.5172
| 0.7069
| 0.8865
| 0.4843
| 0.5442
| 0.5698
| 0.7658
| 0.6071
| 0.9159
| 0.7836
| 0.775
| 0.4525
| 0.798
| 0.0341
| 0.0964
| 0.6174
| 0.4105
| 0.0152
| 0.3275
| 0.0619
| 0.0608
| 0.763
| 0.8213
| 11.6736
| 0.8105
| 0.9026
| 11.1599
| 0.7374
| 0.6967
| 2.2009
| 30.8517
| 10.4405
| 0.1044
| 25.4934
|
Gemma2ForCausalLM
|
bfloat16
|
gemma;llama3.3
| 27.227
| 2
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
tokyotech-llm/Gemma-2-Llama-Swallow-27b-it-v0.1
| 0.2651
| 0.0341
| 0
| 0.1007
| 0.0297
| 0.5259
| 0.156
| 0.6502
| 0.4492
| 0.1522
| 0.7132
| 0.1044
| 0.1826
| 0.7556
| 6.33
| 0.6847
| 0.8815
| 9.9343
| 0.7712
| 0
| 0.7665
| 0.4282
| 0.6417
| 0.4504
| 0.1179
| 0.0573
| 0.2219
| 0.6439
| 0.3103
| 0.7132
| 0.6589
| 0.6906
| 0.3609
| 0.156
| 0.0341
| 0.0964
| 0.0021
| 0.156
| 0
| 0
| 0.0088
| 0
| 0.4947
| 0.6903
| 4.0924
| 0.5484
| 0.8328
| 6.5642
| 0.5965
| 0.6967
| 2.2009
| 30.8517
| 10.4405
| 0.1044
| 25.4934
|
Gemma2ForCausalLM
|
bfloat16
|
gemma;llama3.3
| 27.227
| 2
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
nvidia/AceReason-Nemotron-14B
| 0.208
| 0.1345
| 0.1748
| 0.0516
| 0.0065
| 0.2898
| 0.002
| 0.7397
| 0.1511
| 0.1414
| 0.4902
| 0.1061
| 0.0903
| 0.7816
| 7.0068
| 0.787
| 0.9141
| 10.7782
| 0.8127
| 0.1748
| 0.0296
| 0.2701
| 0
| 0.3485
| 0.1542
| 0.0003
| 0.2301
| 0.0057
| 0.2498
| 0.4902
| 0.6837
| 0.6948
| 0.4914
| 0.002
| 0.1345
| 0.3032
| 0.0127
| 0.1797
| 0.005
| 0.0103
| 0.0206
| 0
| 0.2218
| 0.7303
| 6.2135
| 0.6917
| 0.8605
| 6.917
| 0.6674
| 0.6967
| 2.41
| 30.3184
| 10.6083
| 0.1061
| 22.1073
|
Qwen2ForCausalLM
|
bfloat16
|
other
| 14.77
| 91
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
nvidia/AceReason-Nemotron-14B
| 0.5689
| 0.1345
| 0.5732
| 0.2131
| 0.6711
| 0.8341
| 0.784
| 0.8262
| 0.7619
| 0.4532
| 0.9004
| 0.1061
| 0.4274
| 0.8457
| 10.4264
| 0.8904
| 0.9478
| 15.1999
| 0.8749
| 0.5732
| 0.8712
| 0.6063
| 0.7736
| 0.9285
| 0.5068
| 0.6312
| 0.8225
| 0.7753
| 0.8319
| 0.9004
| 0.8657
| 0.8384
| 0.7027
| 0.784
| 0.1345
| 0.3032
| 0.711
| 0.4253
| 0.0309
| 0.2923
| 0.0885
| 0.0388
| 0.6149
| 0.798
| 8.8732
| 0.8014
| 0.8946
| 9.9112
| 0.7382
| 0.6967
| 2.41
| 30.3184
| 10.6083
| 0.1061
| 22.1073
|
Qwen2ForCausalLM
|
bfloat16
|
other
| 14.77
| 91
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
YOYO-AI/Qwen2.5-14B-YOYO-Average
| 0.4603
| 0.3394
| 0.2351
| 0.1532
| 0.5269
| 0.5837
| 0.528
| 0.838
| 0.6107
| 0.3622
| 0.7855
| 0.1008
| 0.4067
| 0.8462
| 10.8475
| 0.8999
| 0.9511
| 15.6442
| 0.8788
| 0.2351
| 0.0772
| 0.6092
| 0.7944
| 0.9339
| 0.323
| 0.584
| 0.5004
| 0.6566
| 0.493
| 0.7855
| 0.8866
| 0.8597
| 0.7401
| 0.528
| 0.3394
| 0.5301
| 0.4698
| 0.357
| 0.0181
| 0.0106
| 0.0177
| 0.0003
| 0.7195
| 0.7952
| 8.5901
| 0.8229
| 0.8967
| 9.7236
| 0.7504
| 0.6944
| 2.7462
| 26.267
| 10.074
| 0.1008
| 22.7566
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 14.766
| 1
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
YOYO-AI/Qwen2.5-14B-YOYO-Average
| 0.6303
| 0.3394
| 0.5906
| 0.2708
| 0.7558
| 0.8888
| 0.892
| 0.8493
| 0.7851
| 0.5472
| 0.9132
| 0.1008
| 0.5486
| 0.8651
| 12.8167
| 0.9092
| 0.955
| 17.2624
| 0.8861
| 0.5906
| 0.9033
| 0.6466
| 0.8208
| 0.9607
| 0.5514
| 0.73
| 0.8706
| 0.786
| 0.8017
| 0.9132
| 0.9025
| 0.8772
| 0.8025
| 0.892
| 0.3394
| 0.5301
| 0.7817
| 0.5415
| 0.0568
| 0.3967
| 0.0265
| 0.0744
| 0.7994
| 0.8279
| 10.987
| 0.8424
| 0.9053
| 11.0235
| 0.7594
| 0.6944
| 2.7462
| 26.267
| 10.074
| 0.1008
| 22.7566
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 14.766
| 1
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/Flux-Japanese-Qwen2.5-32B-Instruct-V1.0
| 0.735
| 0.761
| 0.7117
| 0.4786
| 0.7829
| 0.9124
| 0.932
| 0.8521
| 0.8861
| 0.5793
| 0.907
| 0.2821
| 0.5754
| 0.8703
| 13.5403
| 0.9027
| 0.9567
| 18.1787
| 0.8864
| 0.7117
| 0.9191
| 0.8534
| 0.9889
| 0.9553
| 0.5735
| 0.7568
| 0.91
| 0.8321
| 0.846
| 0.907
| 0.9349
| 0.9092
| 0.8629
| 0.932
| 0.761
| 0.996
| 0.8089
| 0.589
| 0.2755
| 0.7641
| 0.0442
| 0.3868
| 0.9222
| 0.8546
| 16.7948
| 0.8515
| 0.9111
| 12.4946
| 0.7676
| 0.7928
| 7.4672
| 53.967
| 28.2054
| 0.2821
| 45.6313
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 2
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
🔶 : fine-tuned
|
deep-analysis-research/Flux-Japanese-Qwen2.5-32B-Instruct-V1.0
| 0.742
| 0.761
| 0.6993
| 0.5176
| 0.7995
| 0.9133
| 0.942
| 0.84
| 0.8853
| 0.596
| 0.9262
| 0.2821
| 0.5836
| 0.8717
| 14.1371
| 0.9085
| 0.9556
| 17.9702
| 0.8847
| 0.6993
| 0.9103
| 0.8506
| 0.9972
| 0.9589
| 0.6052
| 0.7741
| 0.9125
| 0.8321
| 0.8342
| 0.9262
| 0.9341
| 0.9093
| 0.8708
| 0.942
| 0.761
| 0.996
| 0.825
| 0.5992
| 0.2879
| 0.7732
| 0.2212
| 0.3916
| 0.9141
| 0.8558
| 17.0122
| 0.8516
| 0.8971
| 12.6149
| 0.715
| 0.7928
| 7.4672
| 53.967
| 28.2054
| 0.2821
| 45.6313
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 2
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
rinna/qwen2.5-bakeneko-32b-instruct-v2
| 0.6635
| 0.6386
| 0.5695
| 0.2974
| 0.763
| 0.8954
| 0.932
| 0.8516
| 0.813
| 0.5261
| 0.905
| 0.107
| 0.5797
| 0.8609
| 12.9022
| 0.9096
| 0.9541
| 17.4196
| 0.885
| 0.5695
| 0.9033
| 0.7069
| 0.8583
| 0.958
| 0.5782
| 0.7374
| 0.8977
| 0.7727
| 0.8295
| 0.905
| 0.8831
| 0.875
| 0.8249
| 0.932
| 0.6386
| 0.99
| 0.7887
| 0.4203
| 0.0449
| 0.4216
| 0.0796
| 0.1065
| 0.8346
| 0.8363
| 11.6836
| 0.8471
| 0.9073
| 11.0314
| 0.7647
| 0.7076
| 1.2956
| 31.5854
| 10.7135
| 0.107
| 22.3902
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 9
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
rinna/qwen2.5-bakeneko-32b-instruct-v2
| 0.556
| 0.6386
| 0.0426
| 0.1404
| 0.7268
| 0.8753
| 0.776
| 0.8443
| 0.7729
| 0.317
| 0.8756
| 0.107
| 0.2923
| 0.8467
| 11.0959
| 0.9061
| 0.9508
| 15.3114
| 0.8804
| 0.0426
| 0.9078
| 0.7011
| 0.7875
| 0.9339
| 0.2257
| 0.7015
| 0.7551
| 0.7727
| 0.8478
| 0.8756
| 0.8556
| 0.8732
| 0.7843
| 0.776
| 0.6386
| 0.99
| 0.7522
| 0.433
| 0.0092
| 0.0203
| 0.0354
| 0.0142
| 0.6231
| 0.8131
| 9.6657
| 0.8382
| 0.8969
| 9.3358
| 0.7525
| 0.7076
| 1.2956
| 31.5854
| 10.7135
| 0.107
| 22.3902
|
Qwen2ForCausalLM
|
bfloat16
|
apache-2.0
| 32.764
| 9
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
abeja/ABEJA-Qwen2.5-32b-Japanese-v1.0
| 0.1918
| 0.1185
| 0.0253
| 0.0473
| 0.0212
| 0.6227
| 0
| 0.5374
| 0.2495
| 0.0972
| 0.3338
| 0.0566
| 0.1238
| 0.6655
| 12.7263
| 0.5585
| 0.8015
| 15.4733
| 0.451
| 0.0253
| 0.1195
| 0.5862
| 0
| 0.9491
| 0.1196
| 0.0056
| 0.2995
| 0
| 0.3619
| 0.3338
| 0.9042
| 0.8757
| 0.7994
| 0
| 0.1185
| 0.2831
| 0.0368
| 0.0483
| 0
| 0.0089
| 0
| 0.0007
| 0.227
| 0.6309
| 9.3728
| 0.5872
| 0.8145
| 8.8783
| 0.5528
| 0.6345
| 2.8075
| 16.2878
| 5.6666
| 0.0566
| 13.1605
|
Qwen2ForCausalLM
|
float16
|
apache-2.0
| 32.764
| 5
|
main
| 0
|
False
|
v1.4.1
|
v0.6.3.post1
|
⭕ : instruction-tuned
|
abeja/ABEJA-Qwen2.5-32b-Japanese-v1.0
| 0.6011
| 0.1185
| 0.5672
| 0.1988
| 0.7881
| 0.917
| 0.94
| 0.6372
| 0.8146
| 0.6679
| 0.9066
| 0.0566
| 0.7407
| 0.7057
| 13.8757
| 0.5763
| 0.8715
| 17.0302
| 0.6529
| 0.5672
| 0.9269
| 0.6925
| 0.8681
| 0.9714
| 0.6536
| 0.7769
| 0.8952
| 0.7841
| 0.8332
| 0.9066
| 0.8909
| 0.8775
| 0.8529
| 0.94
| 0.1185
| 0.2831
| 0.7994
| 0.6093
| 0.0853
| 0.1815
| 0.0885
| 0.1164
| 0.5221
| 0.6934
| 12.3998
| 0.6344
| 0.8687
| 11.0177
| 0.685
| 0.6345
| 2.8075
| 16.2878
| 5.6666
| 0.0566
| 13.1605
|
Qwen2ForCausalLM
|
float16
|
apache-2.0
| 32.764
| 5
|
main
| 4
|
False
|
v1.4.1
|
v0.6.3.post1
|
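Every model appears twice in the table, once for each num_few_shot setting (0 and 4). Below is a minimal sketch of how the two settings could be compared on the macro average (AVG) once the records are in pandas; only a handful of AVG values from the table above are excerpted so the example stays self-contained.

```python
import pandas as pd

# A few (model, num_few_shot, AVG) triples excerpted from the leaderboard above;
# a full frame would carry every column listed in the schema.
df = pd.DataFrame([
    {"model": "meta-llama/Meta-Llama-3-8B-Instruct", "num_few_shot": 0, "AVG": 0.2164},
    {"model": "meta-llama/Meta-Llama-3-8B-Instruct", "num_few_shot": 4, "AVG": 0.4962},
    {"model": "meta-llama/Llama-3.1-70B", "num_few_shot": 0, "AVG": 0.4186},
    {"model": "meta-llama/Llama-3.1-70B", "num_few_shot": 4, "AVG": 0.5941},
])

# Put the 0-shot and 4-shot macro averages side by side, best 4-shot score first.
pivot = df.pivot_table(index="model", columns="num_few_shot", values="AVG")
pivot.columns = [f"AVG_{n}shot" for n in pivot.columns]
print(pivot.sort_values("AVG_4shot", ascending=False))
```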