Get Started

Integrating the Pieces MCP server with Google Gemini CLI brings your workflow context directly into the Gemini command-line agent. Gemini CLI supports MCP servers over the stdio, SSE, and Streamable HTTP transports.

**Key distinction:** For HTTP servers, use `httpUrl`; for SSE servers, use `url`—they are different fields in the Gemini config.

Prerequisites

There are two prerequisites for integrating Pieces with Gemini CLI as an MCP server: an active instance of PiecesOS and a fully enabled Long-Term Memory engine.

Make sure that PiecesOS is installed and running. This is *required* for the MCP server to communicate with your workflow data. Enable the Long-Term Memory Engine (LTM-2.7) through the Pieces Desktop App or the [PiecesOS Quick Menu](/products/core-dependencies/pieces-os/quick-menu) in your toolbar.

Installing PiecesOS & Configuring Permissions

Follow the instructions below for a detailed guide on setting up and configuring PiecesOS to correctly pass captured workflow context to the Pieces MCP server.

Setting Up Google Gemini CLI

Gemini CLI stores its MCP configuration in `~/.gemini/settings.json`. Workspace settings override user settings.

Config File Location

| Scope | Path |
| --- | --- |
| User (global) | `~/.gemini/settings.json` |
| Workspace | `YOUR_PROJECT/.gemini/settings.json` |

Local Setup (Streamable HTTP — recommended)

Edit `~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "pieces": {
      "httpUrl": "http://localhost:39300/model_context_protocol/2025-03-26/mcp",
      "timeout": 30000
    }
  }
}
```
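If the file doesn't exist yet, you can create it from the shell. This is a minimal sketch for the user-level location; it overwrites any existing config, so it copies the old file aside first:

```shell
# Create the user-level Gemini CLI settings file with the Pieces
# MCP server entry. An existing file is backed up before overwriting.
mkdir -p ~/.gemini
[ -f ~/.gemini/settings.json ] && cp ~/.gemini/settings.json ~/.gemini/settings.json.bak
cat > ~/.gemini/settings.json <<'EOF'
{
  "mcpServers": {
    "pieces": {
      "httpUrl": "http://localhost:39300/model_context_protocol/2025-03-26/mcp",
      "timeout": 30000
    }
  }
}
EOF
```

If you already have other entries under `mcpServers`, edit the file by hand instead so they aren't lost.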

Local Setup (SSE)

For SSE, use `url` instead of `httpUrl`:

```json
{
  "mcpServers": {
    "pieces": {
      "url": "http://localhost:39300/model_context_protocol/2024-11-05/sse",
      "timeout": 30000
    }
  }
}
```

Using Pieces MCP Server in Gemini CLI

Once integrated, you can use Pieces LTM directly in Gemini CLI.

Run `gemini` in your terminal; Gemini discovers and registers tools from configured MCP servers on startup. Ask: *"What Pieces tools are available?"* to confirm the Pieces tools were registered (recent Gemini CLI versions also offer the `/mcp` slash command to inspect server status). Then ask context-rich questions about your workflow. Check out this [MCP-specific prompting guide](/products/mcp/prompting) to effectively use the Long-Term Memory Engine (LTM-2.7) with your new Pieces MCP server.

Updating

Edit `~/.gemini/settings.json`, update the URL, and start a new Gemini CLI session.
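A stray comma or quote in `settings.json` will silently break server registration, so it helps to validate the file after editing. A quick check using Python's standard library (assumes `python3` is on your PATH):

```shell
# Report whether the Gemini CLI settings file parses as valid JSON.
python3 -m json.tool ~/.gemini/settings.json > /dev/null \
  && echo "settings.json: valid JSON" \
  || echo "settings.json: invalid or missing"
```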

Troubleshooting

If you're experiencing issues integrating Pieces MCP with Gemini CLI:

  1. Verify PiecesOS Status: Ensure PiecesOS is actively running on your system.

  2. HTTP vs SSE Key: Use `httpUrl` for Streamable HTTP, `url` for SSE; they are different fields.

  3. Settings File Not Found: Create `~/.gemini/settings.json` manually.

  4. Tools Not Discovered: Restart Gemini CLI after editing settings.

  5. OAuth Prompt: Pieces does not require OAuth. Dismiss if prompted.
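To check item 1 from the shell, you can probe the PiecesOS port directly. A minimal sketch using `curl` (the endpoint path matches the SSE URL above): any three-digit status other than `000` means something is listening on port 39300.

```shell
# Print the HTTP status code from the PiecesOS MCP endpoint.
# "000" indicates nothing is listening on port 39300.
# --max-time stops curl from waiting on the open SSE stream;
# "|| true" keeps the check from aborting scripts when the server is down.
curl -s -o /dev/null --max-time 3 -w "%{http_code}\n" \
  "http://localhost:39300/model_context_protocol/2024-11-05/sse" || true
```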


You're now set to enhance your Gemini CLI workflow with powerful context retrieval through Pieces MCP. Happy coding!