New hxa07D family of hybrid models, combining improved RWKV recurrent architectures with Transformer-based attention. Designed for efficient long-context …
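The blurb above describes a hybrid stack that interleaves RWKV-style recurrent layers with Transformer attention layers. The snippet below is a minimal PyTorch sketch of what such interleaving could look like; the block designs, dimensions, and the one-attention-layer-per-eight ratio are illustrative assumptions, not the actual hxa07D architecture.

```python
# Minimal sketch (NOT the hxa07D implementation): most layers use an RWKV-style
# linear recurrence with constant per-token state, a few use softmax attention.
# Dimensions, module names, and the interleave ratio are assumptions.
import torch
import torch.nn as nn


class RecurrentBlock(nn.Module):
    """Toy RWKV-style block: per-channel decayed running state (linear recurrence)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.decay = nn.Parameter(torch.zeros(d_model))  # learned per-channel decay
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); explicit sequential scan kept simple for clarity
        w = torch.sigmoid(self.decay)          # decay in (0, 1)
        state = torch.zeros_like(x[:, 0])
        outs = []
        for t in range(x.size(1)):
            state = w * state + (1 - w) * x[:, t]
            outs.append(state)
        return x + self.proj(torch.stack(outs, dim=1))  # residual connection


class AttentionBlock(nn.Module):
    """Standard multi-head self-attention block (Transformer-style)."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        out, _ = self.attn(h, h, h, need_weights=False)
        return x + out  # residual connection


class HybridStack(nn.Module):
    """Interleave recurrent and attention blocks, e.g. one attention layer per eight."""

    def __init__(self, d_model: int = 256, n_layers: int = 8, attn_every: int = 8):
        super().__init__()
        self.layers = nn.ModuleList(
            AttentionBlock(d_model) if (i + 1) % attn_every == 0 else RecurrentBlock(d_model)
            for i in range(n_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x


if __name__ == "__main__":
    model = HybridStack()
    tokens = torch.randn(2, 16, 256)   # (batch, seq, d_model)
    print(model(tokens).shape)         # torch.Size([2, 16, 256])
```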
OpenMOSE
AI & ML interests
Can love be expressed as a tensor?
Recent Activity
published a model (5 minutes ago): OpenMOSE/RWKV-Qwen3-32B-hxa07d-L8-GGUF
updated a model (about 1 hour ago): OpenMOSE/RWKV-Qwen3-32B-hxa07d-L8
published a model (about 2 hours ago): OpenMOSE/RWKV-Qwen3-32B-hxa07d-L8
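The repositories listed above follow the standard Hugging Face Hub layout, so a hedged loading sketch for the safetensors checkpoint might look like the following. Whether the repo ships the custom modeling code required by `trust_remote_code`, and the dtype and generation settings shown, are assumptions.

```python
# Hedged example: loading the published checkpoint with Hugging Face transformers.
# Assumes the repo provides custom modeling code (hence trust_remote_code=True);
# dtype, device placement, and sampling length are arbitrary placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OpenMOSE/RWKV-Qwen3-32B-hxa07d-L8"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,   # 32B weights; adjust dtype/device to available memory
    device_map="auto",
    trust_remote_code=True,
)

inputs = tokenizer("Can love be expressed as a tensor?", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```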
Organizations
None yet