StarCoder2 7B
by BigCode · starcoder family
7B parameters
Tags: code-generation, text-generation
StarCoder2 7B is the mid-sized model in the BigCode StarCoder2 family, offering a strong balance between code generation quality and resource efficiency. Trained on The Stack v2 dataset with over 600 programming languages, it provides noticeably better code quality than the 3B variant while remaining accessible on consumer GPUs. With 16K context support and fill-in-the-middle capabilities, StarCoder2 7B is well-suited for local coding assistance, code completion, and generation tasks. Its permissive license makes it a practical choice for both personal and commercial use.
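The fill-in-the-middle capability mentioned above works by arranging the prompt with the StarCoder family's special tokens, so the model generates the code that belongs between a given prefix and suffix. As a minimal sketch (the token names `<fim_prefix>`, `<fim_suffix>`, `<fim_middle>` are the StarCoder-family convention; how you pass the raw prompt depends on your runtime):

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    # StarCoder-family FIM layout: the model is asked to produce the code
    # that fits between the prefix and suffix, emitted after <fim_middle>.
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = build_fim_prompt(
    prefix="def fahrenheit_to_celsius(f):\n    return ",
    suffix="\n\nprint(fahrenheit_to_celsius(212))\n",
)
print(prompt)
```

Send this string as a raw prompt (i.e. without a chat template) and stop generation at the end-of-text token; the completion is the middle section only.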
Quick Start with Ollama
ollama run starcoder2:7b

| Spec | Value |
|---|---|
| Creator | BigCode |
| Parameters | 7B |
| Architecture | transformer-decoder |
| Context | 16K tokens |
| Released | Feb 28, 2024 |
| License | BigCode Open RAIL-M v1 |
| Ollama | starcoder2:7b |
Quantization Options
| Format | File Size | VRAM Required | Ollama Tag |
|---|---|---|---|
| Q4_K_M (recommended) | 4.4 GB | 5.5 GB | starcoder2:7b-q4_K_M |
| Q8_0 | 7.4 GB | 9 GB | starcoder2:7b-q8_0 |
| F16 | 14 GB | 16 GB | starcoder2:7b-f16 |
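The file sizes above follow roughly from parameter count times bits per weight. A back-of-the-envelope sketch (the ~7.2B parameter count and the effective bits-per-weight figures for each GGUF format are approximations, and the estimate ignores metadata and tensors kept at higher precision):

```python
def estimate_gguf_size_gb(n_params: float, bits_per_weight: float) -> float:
    # Rough model file size: parameters * bits / 8 bytes, in decimal GB.
    return n_params * bits_per_weight / 8 / 1e9

# Approximate effective bits per weight for common GGUF quantizations.
for name, bits in [("Q4_K_M", 4.85), ("Q8_0", 8.5), ("F16", 16.0)]:
    print(f"{name}: ~{estimate_gguf_size_gb(7.2e9, bits):.1f} GB")
```

This lands close to the table (about 4.4, 7.6, and 14.4 GB); actual VRAM use is higher because the KV cache and activations are allocated on top of the weights.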
Compatible Hardware
The recommended Q4_K_M quantization requires about 5.5 GB of VRAM, so it fits on any GPU with 6 GB or more.
Benchmark Scores
| Benchmark | Score |
|---|---|
| MMLU | 35.0 |