StarCoder2 3B
by BigCode · starcoder family
3B parameters
Tags: code-generation, text-generation
StarCoder2 3B is the smallest model in the BigCode StarCoder2 family, trained on The Stack v2 dataset covering over 600 programming languages. Despite its compact size, it delivers surprisingly capable code generation and completion, making it ideal for resource-constrained environments. The model supports 16K context and fill-in-the-middle capabilities for IDE-style code completion. Its small footprint means it can run on virtually any modern GPU, providing fast inference for everyday coding tasks without requiring significant hardware investment.
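The fill-in-the-middle capability mentioned above relies on the StarCoder family's FIM sentinel tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`). A minimal sketch of assembling such a prompt for IDE-style completion — the helper function name is illustrative, not part of any API:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt in the StarCoder FIM format.

    The model is asked to generate the code that belongs *between*
    the given prefix and suffix (e.g. the body at the cursor position).
    """
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Example: complete the expression after `return ` in a function body.
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    return ",
    suffix="\n",
)
print(prompt)
```

The resulting string can be sent as a raw prompt (with templating disabled) to any runtime serving the model, such as Ollama's generate endpoint.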
Quick Start with Ollama
ollama run starcoder2:3b

| Spec | Value |
|---|---|
| Creator | BigCode |
| Parameters | 3B |
| Architecture | transformer-decoder |
| Context | 16K tokens |
| Released | Feb 28, 2024 |
| License | BigCode Open RAIL-M v1 |
| Ollama | starcoder2:3b |
Quantization Options
| Format | File Size | VRAM Required | Ollama Tag |
|---|---|---|---|
| Q4_K_M (recommended) | 1.8 GB | 3.5 GB | 3b-q4_K_M |
| Q8_0 | 3.2 GB | 5 GB | 3b-q8_0 |
| F16 | 6 GB | 8 GB | 3b-f16 |
Compatible Hardware
The recommended Q4_K_M quantization requires at least 3.5 GB of VRAM, so it fits on virtually any modern GPU.
Benchmark Scores
| Benchmark | Score |
|---|---|
| MMLU | 35.0 |