
deepseek-v3

DeepSeek

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

3.5M pulls · Updated Feb 26, 2025 · 5 tags · 160K context

Quick start

ollama run deepseek-v3
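
Once the model has been pulled, it can also be driven programmatically. Below is a minimal sketch that calls Ollama's local REST API, assuming the Ollama server is running on its default port (11434); the prompt text is only illustrative.

import json
import urllib.request

# Build a non-streaming generate request against the local Ollama server.
payload = {
    "model": "deepseek-v3",
    "prompt": "Explain Mixture-of-Experts routing in two sentences.",
    "stream": False,  # ask for a single JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# The response JSON carries the generated text in its "response" field.
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])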

Available sizes

Tag                  Size    Quantization  Context  Min RAM
deepseek-v3:latest   404 GB  q4_k_m        160K     505 GB
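
Note that Ollama typically serves requests with a context window much smaller than the model's 160K maximum unless a larger one is requested, and bigger windows raise memory use beyond the listed minimum. The sketch below uses the official ollama Python client (an assumption that the package is installed, e.g. via pip install ollama); the num_ctx value is illustrative, not the maximum.

import ollama

# Request a larger context window for this call; larger values need
# correspondingly more memory.
response = ollama.chat(
    model="deepseek-v3:latest",
    messages=[{"role": "user", "content": "Summarize this long transcript: ..."}],
    options={"num_ctx": 32768},  # illustrative, well below the 160K maximum
)

print(response["message"]["content"])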

Strengths & Limitations

Strengths

  • Large total parameter count (671B) supports complex reasoning.
  • Mixture-of-Experts architecture improves efficiency by activating only 37B parameters per token.
  • Strong performance across a broad range of language tasks.

Related models