# Compatible AI Models with Pieces
Pieces supports cloud-hosted LLMs from providers such as OpenAI, Anthropic, and Google. Local models can optionally be downloaded and served on-device through PiecesOS for use across the rest of the Pieces Suite.
## Cloud-Only LLMs | Providers
Browse the list of cloud-hosted AI models available for use with the Pieces Desktop App and several of our plugins and extensions.
| Provider | Model Name |
|---|---|
| OpenAI | GPT-X |
| Anthropic | Claude / Sonnet / Opus / Haiku |
| Google | Gemini / Pro / Flash / Chat |
## Local-Only LLMs | Providers
Read through the list of local AI models available for use within the Pieces Desktop App and the rest of the Pieces Suite.
| Provider | Model Name |
|---|---|
| Google | Gemma / Code Gemma |
| IBM | Granite / Code / Dense / MoE |
| Meta | LLaMA / CodeLLaMA |
| Mistral | Mistral / Mixtral |
| Microsoft | Phi |
| Qwen | QwQ / Coder |
| BigCode | StarCoder |
## Using Local Models with Pieces
Local models can optionally be downloaded to power on-device generative AI features. PiecesOS serves these models locally, allowing you to work entirely offline with complete privacy.
Once PiecesOS is installed, local models become available automatically. Download and manage them through the Pieces Copilot LLM selector in any Pieces product.
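Because local models are served by PiecesOS on your machine, a quick way to confirm the offline setup is to check whether PiecesOS is reachable before opening the LLM selector. The sketch below is a minimal example, assuming PiecesOS listens on its default local port `1000` and exposes a `/.well-known/health` route; adjust both if your installation differs.

```python
# Minimal reachability check for a locally running PiecesOS instance.
# The port (1000) and health route are assumptions about the default
# PiecesOS configuration, not a guaranteed API contract.
from urllib.request import urlopen
from urllib.error import URLError


def pieces_os_running(host: str = "localhost",
                      port: int = 1000,
                      timeout: float = 2.0) -> bool:
    """Return True if a PiecesOS health endpoint responds locally."""
    try:
        with urlopen(f"http://{host}:{port}/.well-known/health",
                     timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused or timed out: PiecesOS is not reachable.
        return False


if __name__ == "__main__":
    print("PiecesOS reachable:", pieces_os_running())
```

If the check returns `False`, start PiecesOS first; local models in the Copilot LLM selector depend on it running.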