
llama3-gradient

Meta · Llama

This model extends Llama-3 8B's context length from 8K to over 1M tokens.

391K pulls · Updated Feb 26, 2025 · 35 tags · 1M context

Quick start

ollama run llama3-gradient

Available sizes

| Tag | Size | Quantization | Context | Min RAM |
| --- | --- | --- | --- | --- |
| llama3-gradient:latest | 4.7 GB | q4_k_m | 1M | 5.9 GB |
| llama3-gradient:70b | 40 GB | q4_k_m | 1M | 50 GB |
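Ollama loads models with a relatively small default context window, so to actually use the extended context you generally need to raise the `num_ctx` option when calling the model. A minimal sketch of building a request body for Ollama's `/api/generate` endpoint; the endpoint and the `num_ctx` option are part of Ollama's standard REST API, but the model name and the 256K value here are illustrative — very large contexts require proportionally more RAM than the minimums listed above.

```python
import json

def build_generate_payload(prompt: str, num_ctx: int = 256000) -> str:
    # Ollama's /api/generate accepts an "options" object; setting
    # "num_ctx" raises the context window beyond the default.
    payload = {
        "model": "llama3-gradient",
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    }
    return json.dumps(payload)

# Send the payload with, e.g.:
#   curl http://localhost:11434/api/generate -d "$(python build_payload.py)"
```

From the interactive CLI, the equivalent is `/set parameter num_ctx <value>` after `ollama run llama3-gradient`.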

Strengths & Limitations

Strengths

  • Extended context length (up to 1M tokens)
  • Leverages Llama-3 8B
  • Handles long-form content
