- deepseek-ai/DeepSeek-V3-Base
  Updated • 18.1k • 1.68k
- TransMLA: Multi-head Latent Attention Is All You Need
  Paper • 2502.07864 • Published • 69
- Qwen2.5 Bakeneko 32B Instruct AWQ
  Generate AI-powered Japanese assistant replies
- DeepSeek R1 Distill Qwen2.5 Bakeneko 32B AWQ
  Chat with an AI to get detailed text responses
Eduardo Espina (Edespina)
AI & ML interests: None yet
Organizations: None yet
Models (0): None public yet
Datasets (0): None public yet