Kizagan-E4B-Turkish-Reasoning-Model-GGUF

This is a GGUF quantization of the original Kizagan-E4B-Turkish-Reasoning-Model.

Model Details

Acknowledgements

I would like to thank the creators of the original Kizagan model for their amazing work on Turkish reasoning capabilities.

Format: GGUF
Model size: 8B params
Architecture: gemma4

Available quantizations: 4-bit, 5-bit, 6-bit
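As a minimal sketch of how a GGUF quantization like this can be run locally with llama.cpp (the repository ID and `.gguf` file name below are placeholders, not confirmed by this card; use the actual file names listed in the repository's Files tab):

```shell
# Download one quantization from the repository.
# <repo-id> and the .gguf file name are assumptions; substitute the real ones.
huggingface-cli download <repo-id> Kizagan-E4B-Turkish-Reasoning-Model-Q4_K_M.gguf --local-dir .

# Start an interactive chat session with llama.cpp's CLI.
llama-cli -m Kizagan-E4B-Turkish-Reasoning-Model-Q4_K_M.gguf -cnv
```

Lower-bit quantizations (4-bit) trade some output quality for a smaller memory footprint, while 6-bit stays closer to the original model at a higher RAM cost.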
