
granite3.1-moe

IBM Granite

The IBM Granite 1B and 3B models are long-context mixture of experts (MoE) models designed for low-latency use.

2.2M pulls · Updated Feb 26, 2025 · 33 tags · 128K context

Quick start

ollama run granite3.1-moe
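Beyond the CLI, a locally running Ollama server exposes a REST API on port 11434. A minimal Python sketch of a request body for the documented /api/generate endpoint (the prompt text here is illustrative):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a token stream
    }

payload = build_generate_request(
    "granite3.1-moe",
    "Summarize what a mixture-of-experts model is.",  # illustrative prompt
)
print(json.dumps(payload))

# To actually send it (requires `ollama serve` running locally):
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

The network call is left commented out so the sketch runs without a server; uncomment it once `ollama serve` is up.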

Available sizes

Tag                    Size    Quantization  Context  Min RAM
granite3.1-moe:1b      1.4 GB  q4_k_m        128K     1.8 GB
granite3.1-moe:latest  2.0 GB  q4_k_m        128K     2.5 GB

Run with

Claude Code
ollama launch claude --model granite3.1-moe
Codex
ollama launch codex --model granite3.1-moe
OpenCode
ollama launch opencode --model granite3.1-moe
OpenClaw
ollama launch openclaw --model granite3.1-moe

Strengths & Limitations

Strengths

  • Long context processing
  • Low latency
  • Mixture of Experts architecture
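To use the long-context strength above, note that Ollama loads models with a default context window smaller than the model's maximum; the documented `num_ctx` option raises it per request. A sketch of a chat request body asking for the full 128K (131072-token) window — the message content is illustrative:

```python
import json

def build_chat_request(model: str, messages: list[dict], num_ctx: int) -> dict:
    """Build a JSON body for Ollama's /api/chat with an explicit context size."""
    return {
        "model": model,
        "messages": messages,
        "options": {"num_ctx": num_ctx},  # documented Ollama option: context window in tokens
        "stream": False,
    }

payload = build_chat_request(
    "granite3.1-moe",
    [{"role": "user", "content": "Summarize this long document ..."}],  # illustrative message
    num_ctx=131072,  # 128K tokens, the maximum listed for this model
)
print(json.dumps(payload)[:60])
```

A larger `num_ctx` increases RAM use well beyond the minimums in the sizes table, so raise it only when a request actually needs the extra window.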

Related models