Active filters: sea
Text Generation • 7B • Updated • 8.18k • 68
mlx-community/SeaLLM-7B-v2-4bit-mlx • Updated • 7 • 3
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • Updated • 86 • 6
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation • Updated • 3
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • Updated • 6
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • Updated • 5
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation • Updated • 7
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • Updated • 5
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • Updated • 5
Text Generation • 8B • Updated • 248 • 28
Text Generation • 4B • Updated • 110 • 6
Text Generation • 2B • Updated • 80 • 8
Text Generation • 0.6B • Updated • 94 • 9
Text Generation • 8B • Updated • 19 • 7
Text Generation • 4B • Updated • 23 • 2
Text Generation • 2B • Updated • 35 • 5
Text Generation • 0.6B • Updated • 33 • 7
sail/Sailor-1.8B-Chat-gguf • 2B • Updated • 300 • 3
sail/Sailor-0.5B-Chat-gguf • 0.6B • Updated • 692 • 4
4B • Updated • 310 • 3
8B • Updated • 550 • 5
Text Generation • 9B • Updated • 11k • 50
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • Updated • 143 • 8
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • Updated • 5 • 2
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • Updated • 11 • 1
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • Updated • 139 • 1
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • Updated • 169 • 1
Image-to-Text • 8B • Updated • 8 • 5
NghiemAbe/SeaLLM-7B-v2.5-AWQ • Text Generation • Updated • 7
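A listing like the one above can also be pulled programmatically with the huggingface_hub Python client. The sketch below is illustrative only: the search string, sort order, and result limit are assumptions, not values taken from this page.

    from huggingface_hub import list_models

    # Query the Hub for model repos matching the search term "sea",
    # sorted by download count, most-downloaded first.
    # (Illustrative parameters; adjust search/sort/limit as needed.)
    for model in list_models(search="sea", sort="downloads", direction=-1, limit=30):
        # Each ModelInfo carries the repo id plus basic stats such as downloads and likes.
        print(f"{model.id} • {model.pipeline_tag} • {model.downloads} downloads • {model.likes} likes")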