Just a big thank you for all your hard work🙏
Hello Team,
I'll admit I'm being a bit of a fanboy here, but I just wanted to thank you for your models: they deserve far more recognition!
I also created this post to suggest that you consider creating “offspring” of your models using these custom Qwen3.5 2B fine-tunes I've set aside :)
- https://huggingface.co/DavidAU/Qwen3.5-2B-GPT-5.1-HighIQ-INSTRUCT
- https://huggingface.co/ertghiu256/Qwen3.5-2b-Kimi-and-Opus-Distillation
- https://huggingface.co/Jackrong/Qwen3.5-2B-Claude-4.6-Opus-Reasoning-Distilled
I am convinced that, despite their small size, these models can be a real driving force behind a whole range of self-hosted projects.
Thanks again!
Thank you so much for the kind words and the thoughtful curation! 🙏
You're absolutely right that 2B-class models are underrated for self-hosted workflows; they're where Darwin's efficiency story really shines. Jackrong's 2B is especially interesting to us, since we already have a successful breeding history with their 27B Opus-distilled line (Darwin-27B-Opus, GPQA #6 on the official leaderboard).
We'll add these three to our evolutionary breeding queue. The Darwin-2B recipe would run in about an hour on our current setup, so we can share results soon. Any specific benchmarks you'd like us to prioritize for edge/self-hosted use cases?