QuantStack / Wan2.2-T2V-A14B-GGUF

huggingface.co
Total runs: 65.9K
24-hour runs: -1.0K
7-day runs: -1.6K
30-day runs: -31.2K
Last updated: July 29, 2025
text-to-video

Introduction to Wan2.2-T2V-A14B-GGUF

Model Details of Wan2.2-T2V-A14B-GGUF

This GGUF file is a direct conversion of Wan-AI/Wan2.2-T2V-A14B.

Since this is a quantized model, all original licensing terms and usage restrictions remain in effect.

Usage

The model can be used with the ComfyUI custom node ComfyUI-GGUF by city96.

Place the model files in ComfyUI/models/unet; see the GitHub readme for further installation instructions.
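As a minimal sketch of the placement step above: the helper below builds the path where ComfyUI-GGUF expects UNet-style GGUF files. The quantization filename shown is an assumed placeholder — check the repository's file list for the actual names — and the commented-out `hf_hub_download` call shows one way to fetch a file (requires network and the `huggingface_hub` package).

```python
from pathlib import Path

REPO_ID = "QuantStack/Wan2.2-T2V-A14B-GGUF"
# Assumed placeholder filename -- check the repo's file list for real names.
QUANT_FILE = "wan2.2_t2v_a14b_Q4_K_M.gguf"

def target_path(comfyui_root: str, filename: str) -> Path:
    """Return where ComfyUI-GGUF expects the model file to be placed."""
    return Path(comfyui_root) / "models" / "unet" / filename

# To actually download (needs network and `pip install huggingface_hub`):
#   from huggingface_hub import hf_hub_download
#   hf_hub_download(repo_id=REPO_ID, filename=QUANT_FILE,
#                   local_dir="ComfyUI/models/unet")

print(target_path("ComfyUI", QUANT_FILE))
```

After the file is in place, the GGUF loader node provided by ComfyUI-GGUF can select it from the unet folder inside ComfyUI.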

⚠️ Important:

These quantizations were made using the latest version of the tools from ComfyUI-GGUF by city96.
Please help us test each one to ensure there are no errors.

More Information About the Wan2.2-T2V-A14B-GGUF Model on huggingface.co

For the Wan2.2-T2V-A14B-GGUF license, visit:

https://choosealicense.com/licenses/apache-2.0

Wan2.2-T2V-A14B-GGUF huggingface.co

Wan2.2-T2V-A14B-GGUF is an AI model hosted on huggingface.co by QuantStack and can be used directly from there. huggingface.co supports a free trial of the Wan2.2-T2V-A14B-GGUF model and also offers paid usage. The model can additionally be called through an API from Node.js, Python, or plain HTTP.
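To illustrate the HTTP route mentioned above, here is a minimal sketch of building a request against the Hugging Face Inference API endpoint for this repository. Whether serverless inference is actually enabled for this GGUF repo is an assumption — check the model page's deploy options before relying on it — and `hf_xxx` is a placeholder token.

```python
import json

# Assumed endpoint pattern for the Hugging Face Inference API; serverless
# availability for this specific GGUF repo is not guaranteed.
API_URL = "https://api-inference.huggingface.co/models/QuantStack/Wan2.2-T2V-A14B-GGUF"

def build_request(prompt: str, token: str) -> tuple[dict, bytes]:
    """Return (headers, body) for a text-to-video inference request."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"inputs": prompt}).encode("utf-8")
    return headers, body

headers, body = build_request("a red fox running through snow", "hf_xxx")
print(json.loads(body)["inputs"])
```

The same payload shape (`{"inputs": ...}`) applies whether the request is sent from Python, Node.js, or a raw HTTP client such as curl.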

Wan2.2-T2V-A14B-GGUF huggingface.co URL

https://huggingface.co/QuantStack/Wan2.2-T2V-A14B-GGUF

QuantStack Wan2.2-T2V-A14B-GGUF online free

huggingface.co is an online trial and API platform that integrates Wan2.2-T2V-A14B-GGUF's modeling capabilities, including API services, and provides a free online trial of Wan2.2-T2V-A14B-GGUF. You can try Wan2.2-T2V-A14B-GGUF online for free by clicking the link below.

QuantStack Wan2.2-T2V-A14B-GGUF free online URL on huggingface.co:

https://huggingface.co/QuantStack/Wan2.2-T2V-A14B-GGUF

Wan2.2-T2V-A14B-GGUF install

Wan2.2-T2V-A14B-GGUF is an openly available model: the GGUF files are hosted free of charge on huggingface.co, while the ComfyUI-GGUF loader needed to run them is available on GitHub. huggingface.co also lets users work with the installed Wan2.2-T2V-A14B-GGUF model directly for debugging and trial, and supports free API access.

Wan2.2-T2V-A14B-GGUF install URL on huggingface.co:

https://huggingface.co/QuantStack/Wan2.2-T2V-A14B-GGUF

URL of Wan2.2-T2V-A14B-GGUF

https://huggingface.co/QuantStack/Wan2.2-T2V-A14B-GGUF

Provider of Wan2.2-T2V-A14B-GGUF huggingface.co

QuantStack (organization)
