Pieces for Developers | LLMs

Pieces for Developers software products provide access to 54 large language models (LLMs) from a wide variety of providers, and we're adding more all the time!



Ready for the next level?

Unlock the latest, most powerful AI models from OpenAI, Anthropic, and more with Pieces Pro.

Compatible AI Models with Pieces

Pieces utilizes cloud-hosted LLMs from providers like OpenAI, Anthropic, and Google. All local models are served through Ollama, a core dependency of PiecesOS and the rest of the Pieces Suite.

Cloud-Only LLMs | Providers

Browse the list of cloud-hosted AI models available for use with the Pieces Desktop App and several of our plugins and extensions.

| Provider | Model Name |
| --- | --- |
| OpenAI | GPT-X |
| Anthropic | Claude / Sonnet / Opus / Haiku |
| Google | Gemini / Pro / Flash / Chat |


Local-Only LLMs | Providers

Read through the list of local AI models available for use within the Pieces Desktop App and the rest of the Pieces Suite.


| Provider | Model Name |
| --- | --- |
| Google | Gemma / Code Gemma |
| IBM | Granite / Code / Dense / MoE |
| Meta | LLaMA / CodeLLaMA |
| Mistral | Mistral / Mixtral |
| Microsoft | Phi |
| Qwen | QwQ / Coder |
| StarCoder | StarCoder |


Using Ollama with Pieces

Ollama is required to use Pieces' local generative AI features.

It's a lightweight runtime that provides a plug-and-play experience with local models, which lets Pieces add support for newly compatible models quickly.

To read more about installing and using Ollama with Pieces, see the Ollama section of our Core Dependencies documentation.
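As a rough sketch of that plug-and-play workflow: once Ollama is installed, local models are fetched and managed from the command line. The model name below is only an illustrative example; check the local model list above for what Pieces actually serves.

```shell
# Download a local model's weights (model name is an example; availability varies)
ollama pull llama3

# List the models currently downloaded on this machine
ollama list

# Run a one-off prompt against the local model to verify it works
ollama run llama3 "Summarize what a large language model is in one sentence."
```

Once a model has been pulled, PiecesOS can serve it for offline use without any further setup.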
