Capabilities: Reasoning (intermediate) · Tools · Thinking

magistral

Mistral AI

Magistral is a small, efficient reasoning model with 24B parameters.

1.1M pulls · Updated Jun 26, 2025 · 5 tags · 39K context

Quick start

ollama run magistral
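Beyond the CLI, a running Ollama server exposes a local REST API (by default at `localhost:11434`), so the model can also be queried programmatically. A minimal sketch using only the standard library, assuming the server is running and `magistral` has been pulled; the `ask` helper name is illustrative, not part of Ollama:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint


def build_chat_payload(prompt: str, model: str = "magistral") -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete JSON response, not a stream
    }


def ask(prompt: str) -> str:
    """Send a chat request to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the reply under message.content
        return json.loads(resp.read())["message"]["content"]


if __name__ == "__main__":
    print(ask("How many primes are there below 100?"))
```

With `stream` left at its default of true, the endpoint instead returns one JSON object per line as tokens are generated.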

Available sizes

Tag               Size   Quantization   Context   Min RAM
magistral:latest  14GB   q4_k_m         39K       17.5 GB
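The on-disk size follows roughly from parameter count times bits per weight: q4_k_m mixes 4-bit and 6-bit blocks and averages somewhere near 4.5–5 bits per weight. A back-of-the-envelope sketch (the 4.65 bits/weight figure is an assumption for illustration, not a published spec):

```python
# Rough size estimate for a q4_k_m quantized 24B model (illustrative only).
PARAMS = 24e9            # magistral's parameter count
BITS_PER_WEIGHT = 4.65   # assumed average for q4_k_m's mixed 4/6-bit blocks

size_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9   # bits -> bytes -> GB
print(f"approx. model file size: {size_gb:.1f} GB")   # close to the listed 14GB
```

The listed 17.5 GB minimum RAM exceeds the file size because the runtime also needs room for the KV cache and other inference overhead, which grows with context length.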

Run with

Claude Code
ollama launch claude --model magistral
Codex
ollama launch codex --model magistral
OpenCode
ollama launch opencode --model magistral
OpenClaw
ollama launch openclaw --model magistral

Strengths & Limitations

Strengths

  • Efficient reasoning
  • Compact size (24B parameters)
