# Kizagan-E4B-Turkish-Reasoning-Model-GGUF
This repository provides GGUF quantizations of the original Kizagan-E4B-Turkish-Reasoning-Model.
## Model Details
- Original Model: AlicanKiraz/Kizagan-E4B
- Quantized by: naysflow
- Method: Quantized with llama.cpp
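To try a quant locally, you can fetch one of the GGUF files and run it with llama.cpp's CLI. A minimal sketch, assuming the repository id is `naysflow/Kizagan-E4B-Turkish-Reasoning-Model-GGUF`; the `.gguf` filename below is hypothetical, so check the repository's file list for the exact names:

```shell
# Download one quantization (filename is a placeholder; use the actual .gguf name from the repo)
huggingface-cli download naysflow/Kizagan-E4B-Turkish-Reasoning-Model-GGUF \
  kizagan-e4b-q4_k_m.gguf --local-dir .

# Run it with llama.cpp's CLI (llama-cli must be built or installed separately)
./llama-cli -m kizagan-e4b-q4_k_m.gguf -p "Soru: ..." -n 256
```

Lower-bit quants trade some quality for smaller memory footprints, so pick the largest variant that fits your hardware.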
## Acknowledgements
I would like to thank the creators of the original Kizagan model for their amazing work on Turkish reasoning capabilities.
## Available Quantizations
- 4-bit
- 5-bit
- 6-bit