| Model | Quantization | GPUs | CPUs | RAM (GB) | Context Length | Total Tokens | Duration (sec) | Tokens/sec | Time to Retrieve (sec) | Start Type | GPU Max Power (W) | Server Name | Test Link | Comment |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| llama3.1:8b-instruct | q4_0 | 1x GTX 1050Ti 4GB | 1x Intel i7 | 32 | 4000 | 243 | 37.38 | 6.5 | 105.62 | Cold | - | pasha-lt | - | - |
| llama3.1:8b-instruct | q4_0 | 1x GTX 1050Ti 4GB | 1x Intel i7 | 32 | 4000 | 248 | 38.20 | 6.49 | 38.48 | Warm | - | pasha-lt | - | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 3050 8GB | 2x Xeon 2960v2 | 128 | 4000 | 234 | 6.03 | 38.83 | 9.27 | Cold | - | gpu01 | - | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 3050 8GB | 2x Xeon 2960v2 | 128 | 4000 | 303 | 7.82 | 38.73 | 7.97 | Warm | - | gpu01 | - | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 3050 8GB | 2x Xeon 2960v2 | 128 | 4000 | 500 | 608.09 | 0.82 | 628.27 | Cold | - | gpu01 | - | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 3050 8GB | 2x Xeon 2960v2 | 128 | 4000 | 500 | 608.47 | 0.82 | 609.85 | Warm | - | gpu01 | - | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 4000 | 282 | 1.95 | 144.66 | 44.73 | Cold | - | gpu02 | - | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 4000 | 316 | 2.19 | 144.62 | 2.32 | Warm | - | gpu02 | - | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 4000 | 500 | 308.62 | 1.62 | 412.98 | Cold | - | gpu02 | - | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 4000 | 500 | 297.64 | 1.68 | 305.18 | Cold | - | gpu02 | - | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 4000 | 500 | 297.75 | 1.68 | 298.44 | Warm | - | gpu02 | - | - |
| llama3.1:8b-instruct | fp16 | 1x Tesla P40 24GB | 1x Xeon E5-2680 v4 | 128 | 4000 | 229 | 14.26 | 16.06 | 38.33 | - | 135 | nn01 | https://t.me/evilfreelancer_chat/73576 | - |
| llama3.1:8b-instruct | q4_0 | 1x Tesla P40 24GB | 1x Xeon E5-2680 v4 | 128 | 4000 | 345 | 8.39 | 41.14 | 8.54 | - | 135 | nn01 | https://t.me/evilfreelancer_chat/73622 | - |
| llama3.1:8b-instruct | q4_0 | 1x Tesla P40 24GB | 1x Xeon E5-2680 v4 | 128 | 4000 | 225 | 5.07 | 44.39 | 7.74 | - | 250 | nn01 | https://t.me/evilfreelancer_chat/73668 | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 4060 16GB | 1x Intel i7 | 16 | 4000 | 471 | 10.15 | 46.4 | 15.33 | - | - | sn01 | https://t.me/evilfreelancer_chat/73684 | - |
| aya:35b-23 | q4_K_M | 1x Tesla P40 24GB | 1x Xeon E5-2680 v4 | 128 | 200 | 300 | 30.32 | 9.9 | 39.38 | - | - | nn01 | https://t.me/evilfreelancer_chat/73749 | - |
| aya:35b-23 | q4_K_M | 3x RTX 4060 16GB | 2x Xeon E5-2690 v1 | 128 | 4000 | 327 | 28.23 | 11.59 | 38.81 | Cold | - | sn02 | https://t.me/evilfreelancer_chat/73748 | - |
| aya:35b-23 | q4_K_M | 3x RTX 4060 16GB | 2x Xeon E5-2690 v1 | 128 | 4000 | 324 | 27.93 | 11.6 | 28.22 | Warm | - | sn02 | https://t.me/evilfreelancer_chat/73748 | - |
| aya:35b-23 | q4_K_M | 3x RTX 4060 16GB | 2x Xeon E5-2690 v1 | 128 | 200 | 307 | 26.39 | 11.63 | 45.27 | - | - | sn02 | https://t.me/evilfreelancer_chat/73762 | - |
| aya:35b-23 | q4_K_M | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 200 | 325 | 8.46 | 38.44 | 13.19 | Cold | - | gpu02 | https://t.me/evilfreelancer_chat/73763 | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 8000 48GB | 1x i9-13900KF | - | - | - | - | 13.1 | - | - | - | mn1 | https://t.me/evilfreelancer_chat/74682 | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 8000 48GB | 1x i9-13900KF | - | - | - | - | 90 | - | - | - | mn1 | https://t.me/evilfreelancer_chat/74682 | - |
| mistral-small:22b-instruct-2409 | q4_0 | 1x RTX 4060 16GB | 1x Intel i7 | 16 | 200 | 500 | 24.28 | 20.59 | 29.88 | Cold | - | sn01 | - | - |
| mistral-small:22b-instruct-2409 | q4_0 | 1x RTX 4060 16GB | 1x Intel i7 | 16 | 200 | 500 | 24.22 | 20.63 | 24.42 | Warm | - | sn01 | - | - |
| mistral-small:22b-instruct-2409 | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 200 | 500 | 8.24 | 60.63 | 10.16 | Cold | - | gpu02 | https://t.me/evilfreelancer_chat/75209 | - |
| mistral-small:22b-instruct-2409 | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 200 | 500 | 8.24 | 60.6 | 8.36 | Warm | - | gpu02 | https://t.me/evilfreelancer_chat/75209 | - |
| mixtral:8x7b-instruct-v0.1 | q4_0 | 3x RTX 4060 16GB | 2x Xeon E5-2690 v1 | 128 | 200 | 367 | 11.77 | 31.16 | 119.34 | Cold | - | sn02 | https://t.me/evilfreelancer_chat/75228 | - |
| mixtral:8x7b-instruct-v0.1 | q4_0 | 3x RTX 4060 16GB | 2x Xeon E5-2690 v1 | 128 | 200 | 471 | 15.09 | 31.19 | 15.28 | Warm | - | sn02 | https://t.me/evilfreelancer_chat/75228 | - |
| mixtral:8x7b-instruct-v0.1 | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 200 | 313 | 15.2 | 20.58 | 20.11 | Cold | - | gpu02 | - | - |
| mixtral:8x7b-instruct-v0.1 | q4_0 | 1x RTX 4090 24GB | 1x Ryzen 9 5950x | 128 | 200 | 500 | 24.42 | 20.47 | 24.76 | Warm | - | gpu02 | - | - |
| llama3.1:70b-instruct | q8_0 | 2x RTX 8000 48GB | 1x i9-13900KF | - | - | - | - | 7 | - | - | - | mn1 | https://t.me/evilfreelancer_chat/75552 | - |
| mistral-small:22b-instruct-2409 | q4_0 | 1x RTX 8000 48GB | 1x i9-13900KF | - | - | - | - | 37 | - | - | - | mn1 | https://t.me/evilfreelancer_chat/75557 | - |
| llama3.1:70b-instruct | q8_0 | 4x RTX 4090 24GB | - | - | - | - | - | 11 | - | - | - | mn2 | https://t.me/evilfreelancer_chat/75724 | - |
| llama3.1:8b-instruct | fp16 | 4x RTX 4090 24GB | - | - | - | - | - | 51 | - | - | - | mn2 | https://t.me/evilfreelancer_chat/75741 | - |
| llama3.1:8b-instruct | q4_0 | - | 1x Xeon E5-2680 v4 | - | - | 238 | 38.3 | 6.21 | 38.59 | Warm | - | nn01 | https://t.me/evilfreelancer_chat/75771 | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 4000 | 275 | 2.55 | 107.8 | 5.21 | - | - | tig1 | https://t.me/evilfreelancer_chat/75772 | - |
| llama3.1:8b-instruct | fp16 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 4000 | 229 | 5.05 | 45.31 | 10.17 | - | - | tig1 | https://t.me/evilfreelancer_chat/75772 | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 4000 | 496 | 26.25 | 18.89 | 37.34 | - | - | tig1 | https://t.me/evilfreelancer_chat/75772 | - |
| aya:35b-23 | q4_K_M | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 4000 | 368 | 11.34 | 32.44 | 20.14 | - | - | tig1 | https://t.me/evilfreelancer_chat/75772 | - |
| llama3.1:8b-instruct | q4_0 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 200 | 260 | 2.35 | 110.41 | 4.82 | - | - | tig1 | https://t.me/evilfreelancer_chat/75779 | - |
| llama3.1:8b-instruct | fp16 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 200 | 242 | 5.3 | 45.64 | 10.39 | - | - | tig1 | https://t.me/evilfreelancer_chat/75779 | - |
| llama3.1:70b-instruct | q4_0 | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 200 | 354 | 18.65 | 18.97 | 31.69 | - | - | tig1 | https://t.me/evilfreelancer_chat/75779 | - |
| aya:35b-23 | q4_K_M | 1x RTX 3090 Ti 24GB + 1x RTX 3090 24GB | 1x Threadripper 1920X | - | 200 | 321 | 9.54 | 33.63 | 18.58 | - | - | tig1 | https://t.me/evilfreelancer_chat/75779 | - |
| llama3.1:8b-instruct | q4_0 | 1x RX 7900 XTX 24GB | - | - | 200 | 500 | 5.47 | 91.3 | 11.36 | - | - | fp1 | https://t.me/evilfreelancer_chat/76808 | - |
| llama3.1:8b-instruct | q4_0 | 5x GTX 1060 6GB | 1x i5-4570 | 8 | 200 | 244 | 9.4 | 25.94 | 19.38 | - | - | sh1 | https://t.me/evilfreelancer_chat/81926 | - |
| llama3.1:8b-instruct | q4_0 | 1x Intel Arc A770 16GB | - | - | 200 | 372 | 7.35992 | 50.544 | 23.2908 | - | - | fp2 | https://t.me/c/1564516454/155938 | Windows, driver 6795, ipex-llm (sycl) |
| llama3.1:8b-instruct | q4_0 | 1x Intel Arc A770 16GB | - | - | 4000 | 500 | 9.94434 | 50.2799 | 24.0089 | - | - | fp2 | https://t.me/c/1564516454/155938 | Windows, driver 6795, ipex-llm (sycl) |
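The table does not state which tool produced these numbers; the model tags (e.g. `llama3.1:8b-instruct`, `q4_0`) follow Ollama naming, so the snippet below is a minimal sketch, assuming an Ollama server on `localhost:11434`, of how one row's metrics could be derived. The column mapping (Total Tokens ≈ `eval_count`, Duration ≈ `eval_duration`, Time to Retrieve ≈ `total_duration` including model load) is an assumption, and the prompt and model tag are placeholders rather than the ones used for these runs.

```python
import requests  # assumes the `requests` package is installed

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint (assumption)

def bench(model: str, prompt: str, num_ctx: int = 4000, num_predict: int = 500) -> dict:
    """Run one non-streaming generation and derive the table's metrics."""
    resp = requests.post(OLLAMA_URL, json={
        "model": model,                 # e.g. "llama3.1:8b-instruct-q4_0" (placeholder tag)
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx, "num_predict": num_predict},
    }, timeout=3600)
    resp.raise_for_status()
    data = resp.json()

    total_tokens = data["eval_count"]             # "Total Tokens" column (assumed mapping)
    duration_s = data["eval_duration"] / 1e9      # "Duration (sec)": generation time, ns -> s
    retrieve_s = data["total_duration"] / 1e9     # "Time to Retrieve (sec)": includes model load
    return {
        "total_tokens": total_tokens,
        "duration_sec": round(duration_s, 2),
        "tokens_per_sec": round(total_tokens / duration_s, 2),  # "Tokens/sec" column
        "time_to_retrieve_sec": round(retrieve_s, 2),
    }

if __name__ == "__main__":
    # Single run with a placeholder prompt; repeat the call for a warm-start measurement.
    print(bench("llama3.1:8b-instruct-q4_0", "Explain attention in transformers."))
```

Under this reading, a Cold row is the first request after the model has been evicted from VRAM (so Time to Retrieve includes load time and exceeds Duration), while a Warm row repeats the request with the model already resident, which matches the small Duration/Time-to-Retrieve gap seen in the Warm rows above.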