Tags: General · intermediate · Vision · Cloud

gemma3

Google DeepMind · Gemma

The current, most capable model that runs on a single GPU.

32.1M pulls · Updated Dec 26, 2025 · 29 tags · 128K context

Quick start

ollama run gemma3
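The `ollama run` command above starts an interactive session. The same model can also be queried programmatically through Ollama's local REST API (`POST /api/generate`, port 11434 by default). The sketch below only builds the JSON request body; actually sending it assumes a local `ollama serve` instance is running:

```python
import json

def generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = generate_request("gemma3", "Why is the sky blue?")
print(body)
# POST this body to a running Ollama server, e.g.:
#   curl http://localhost:11434/api/generate -d '{"model":"gemma3","prompt":"Why is the sky blue?","stream":false}'
```

With `stream` set to false the server returns a single JSON object instead of a stream of partial responses.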

Available sizes

Tag              Size    Quantization   Context   Min RAM
gemma3:latest    3.3GB   q4_k_m         128K      4.1 GB
gemma3:12b       8.1GB   q4_k_m         128K      10.1 GB
gemma3:27b       17GB    q4_k_m         128K      21.2 GB
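The Min RAM column tracks the quantized download size closely; across all three tags it matches a ~1.25x multiplier, so the page likely derives it that way. This is an observed pattern in the table above, not a documented formula, and real requirements grow with context length:

```python
def min_ram_gb(file_size_gb: float, overhead: float = 1.25) -> float:
    # Assumption: Min RAM ~= 1.25x the quantized file size, inferred from
    # the table above; actual usage varies with context length and runtime.
    return round(file_size_gb * overhead, 1)

# Matches the table: 3.3GB -> 4.1 GB, 8.1GB -> 10.1 GB, 17GB -> 21.2 GB
```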

Strengths

  • Runs on a single GPU
  • Most capable model in its class
  • Efficient performance

Benchmarks

[Benchmark chart: HellaSwag, PIQA, ARC-c, ARC-e, WinoGrande — scores not recoverable from the page extraction]
