Ollama Explorer (Beta)

vicuna

A general-purpose chat model based on Llama and Llama 2, available with 2K to 16K context sizes.

482K pulls · Updated Feb 26, 2024 · 111 tags · 4K context

Quick start

ollama run vicuna

Available sizes

Tag             Size    Quantization  Context  Min RAM
vicuna:latest   3.8 GB  q4_k_m        4K       4.8 GB
vicuna:13b      7.4 GB  q4_k_m        4K       9.2 GB
vicuna:33b      18 GB   q4_k_m        2K       22.5 GB

Strengths & Limitations

Strengths

  • General Chat
  • Long Context Handling
  • Based on Llama 2
