
granite3-moe

IBM Granite

The IBM Granite 1B and 3B models are IBM's first mixture-of-experts (MoE) Granite models, designed for low-latency use.

363K pulls · Updated Feb 26, 2025 · 33 tags · 4K context

Quick start

ollama run granite3-moe
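Beyond the CLI, a locally running Ollama server also exposes an HTTP API on port 11434. A minimal sketch of calling granite3-moe through the documented /api/generate endpoint (this assumes the Ollama server is running locally with the model pulled; the helper names here are illustrative, not part of Ollama):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "granite3-moe") -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str) -> str:
    # POST the payload and return the model's text completion.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server up, `generate("Why is the sky blue?")` returns the completion as a single string; set `"stream": True` instead to receive incremental JSON chunks.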

Available sizes

Tag              Size    Quantization  Context  Min RAM
granite3-moe:3b  2.1 GB  q4_k_m        4K       2.6 GB

Run with

Claude Code:  ollama launch claude --model granite3-moe
Codex:        ollama launch codex --model granite3-moe
OpenCode:     ollama launch opencode --model granite3-moe
OpenClaw:     ollama launch openclaw --model granite3-moe

Strengths & Limitations

Strengths

  • Low latency
  • Mixture of Experts (MoE)
  • Designed for efficient usage
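The low-latency strength comes from the MoE design: a router sends each token to only a few experts, so most of the network's weights sit idle per token. A toy sketch of top-k expert routing (illustrative only; the gating function, expert count, and names here are assumptions, not IBM's implementation):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    # Toy MoE layer: for each token, run only the k highest-scoring
    # experts and combine their outputs with softmax routing weights.
    logits = x @ gate_w                           # (tokens, n_experts) router scores
    topk = np.argsort(logits, axis=-1)[:, -k:]    # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, topk[t]]
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                  # softmax over the selected experts only
        for w, e in zip(weights, topk[t]):
            out[t] += w * experts[e](x[t])        # the other experts are never evaluated
    return out
```

Because only `k` of the experts run per token, compute per token scales with `k` rather than with the total number of experts, which is what makes a 1B/3B MoE cheap to serve.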

Related models