SmolLM

by Hugging Face · Website

Hugging Face's SmolLM family of compact language models is designed for on-device and edge deployment. Available in sizes from 135M to 1.7B parameters, SmolLM2 models are trained on trillions of tokens and deliver strong performance on reasoning, knowledge, and instruction-following tasks despite their small footprint.

Variants (1)

SmolLM2 1.7B

1.7B

Hugging Face

Min 1.9 GB
text-generation · code-generation · summarization