Ollama Explorer (Beta)

qwen2.5

Alibaba Cloud · Qwen

Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset of up to 18 trillion tokens. The models support context lengths of up to 128K tokens (the tags listed below default to a 32K context window) and offer multilingual support.

22.0M pulls · Updated Feb 26, 2025 · 133 tags · 32K context

Quick start

ollama run qwen2.5
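Beyond the interactive CLI, a running Ollama instance also serves a local REST API. A minimal sketch of the chat request body, assuming the default local endpoint (`http://localhost:11434/api/chat`) and that the `qwen2.5` tag has been pulled; the prompt text is a placeholder:

```python
import json

# Sketch of the request body for Ollama's local chat endpoint
# (POST http://localhost:11434/api/chat). "stream": False asks
# for a single JSON response instead of streamed chunks.
payload = {
    "model": "qwen2.5",
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"}
    ],
    "stream": False,
}

# Serialize to the JSON body you would send with any HTTP client.
body = json.dumps(payload)
print(body)
```

The same body can be sent with `curl -d @- http://localhost:11434/api/chat` or any HTTP library; only the `model` field changes when targeting a specific tag such as `qwen2.5:14b`.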

Available sizes

Tag             Size    Quantization  Context  Min RAM
qwen2.5:3b      1.9 GB  q4_k_m        32K      2.4 GB
qwen2.5:latest  4.7 GB  q4_k_m        32K      5.9 GB
qwen2.5:14b     9.0 GB  q4_k_m        32K      11.2 GB
qwen2.5:32b     20 GB   q4_k_m        32K      25 GB
qwen2.5:72b     47 GB   q4_k_m        32K      58.8 GB

Run with

Claude Code
ollama launch claude --model qwen2.5
Codex
ollama launch codex --model qwen2.5
OpenCode
ollama launch opencode --model qwen2.5
OpenClaw
ollama launch openclaw --model qwen2.5

Strengths & Limitations

Strengths

  • Large-scale pretraining
  • Long context window (128K tokens)
  • Multilingual support

Related models