Quick start

```shell
ollama run nomic-embed-text-v2-moe
```

Available sizes
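Beyond the CLI, embeddings are typically fetched from the local Ollama server's REST API and compared with cosine similarity. The sketch below assumes the default endpoint `http://localhost:11434/api/embed` and an `"embeddings"` field in the JSON response; the `search_query:`/`search_document:` task prefixes follow the convention used by the nomic-embed family and may need adjusting for your version.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embed"  # default local Ollama endpoint (assumption)

def embed(texts, model="nomic-embed-text-v2-moe"):
    """Request embeddings for a list of strings from a running Ollama server."""
    payload = json.dumps({"model": model, "input": texts}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embeddings"]

def cosine(a, b):
    """Cosine similarity between two vectors, used to rank retrieval candidates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

# Example usage (requires a running Ollama server with the model pulled):
#   vecs = embed(["search_query: what is a mixture of experts?",
#                 "search_document: MoE models route tokens to expert subnetworks."])
#   print(cosine(vecs[0], vecs[1]))
```

Higher cosine scores indicate documents more relevant to the query, which is the basis of embedding-based retrieval.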
| Tag | Size | Quantization | Context | Min RAM |
|---|---|---|---|---|
Strengths & Limitations
Strengths
- Multilingual retrieval
- MoE architecture
- Text embedding
Related models
- nomic-embed-text (Embedding): A high-performing open embedding model with a large token context window. 54.8M pulls
- mxbai-embed-large (Embedding): State-of-the-art large embedding model from mixedbread.ai. 7.6M pulls
- bge-m3 (Embedding): BGE-M3 is a new model from BAAI distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity. 3.3M pulls
- all-minilm (Embedding): Embedding models trained on very large sentence-level datasets. 2.5M pulls