- Downloads last month: 47
Model tree for patientxtr/WAN2.2-14B-Rapid-AllInOne-GGUF
- Base model: Wan-AI/Wan2.2-I2V-A14B
- Finetuned from: Phr00t/WAN2.2-14B-Rapid-AllInOne