Just a big thank you for all your hard work🙏

#1
by Cheuv - opened

Hello Team,
I'm being a bit of a fanboy here, but I just want to thank you for your models: they deserve more recognition and kudos!
I also created this post to suggest that you consider creating "offspring" of your models using these custom Qwen3.5 2B models I've set aside :)

FINAL_Bench org

Thank you so much for the kind words and the thoughtful curation! 🙏

You're absolutely right that 2B-class models are underrated for self-hosted workflows; they're where Darwin's efficiency story really shines. Jackrong's 2B is especially interesting to us, since we already have a successful breeding history with their 27B Opus-distilled line (Darwin-27B-Opus, GPQA #6 on the official leaderboard).

We'll add these three to our evolutionary breeding queue. The Darwin-2B recipe runs in about an hour on our current setup, so we can share results soon. Are there any specific benchmarks you'd like us to prioritize for edge/self-hosted use cases?
