
mixtral

Mistral AI

A family of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.

1.9M pulls · Updated Feb 26, 2025 · 70 tags · 64K context

Quick start

ollama run mixtral
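
Ollama also serves a local REST API (on port 11434 by default), so once the model is downloaded you can query it directly from the shell; a minimal example, with an illustrative prompt:

curl http://localhost:11434/api/generate -d '{"model": "mixtral", "prompt": "Why is the sky blue?"}'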

Available sizes

Tag              Size   Quantization   Context   Min RAM
mixtral:latest   26GB   q4_k_m         32K       32.5 GB
mixtral:8x22b    80GB   q4_k_m         64K       100 GB
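
To download a specific variant ahead of time, pull it by tag, for example:

ollama pull mixtral:8x22b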

Run with

  • Claude Code: ollama launch claude --model mixtral
  • Codex: ollama launch codex --model mixtral
  • OpenCode: ollama launch opencode --model mixtral
  • OpenClaw: ollama launch openclaw --model mixtral
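
Ollama additionally exposes an OpenAI-compatible endpoint on the same local server, so OpenAI-style clients beyond the tools listed above can also target mixtral; a minimal sketch, assuming the default port and an illustrative message:

curl http://localhost:11434/v1/chat/completions -H "Content-Type: application/json" -d '{"model": "mixtral", "messages": [{"role": "user", "content": "Hello"}]}'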

Strengths & Limitations

Strengths

  • Open weights for customization
  • Strong performance relative to compute cost, since the MoE architecture routes each token through only a subset of the experts
  • Available in multiple parameter sizes

Limitations

  • Large memory footprint: even at q4_k_m quantization, mixtral:latest needs at least 32.5 GB of RAM and mixtral:8x22b needs 100 GB
