Prerequisites
To complete this Quick Guide, you’ll need:
The Pieces Desktop App installed and actively running on your device.
Long-Term Memory enabled in the Pieces Desktop App.
Optional—Pieces connected in a Python IDE via MCP, such as Visual Studio Code or JetBrains IDEs (including PyCharm).
In This Quick Guide
This Quick Guide shows how to combine Long-Term Memory with Conversational Search—including scoping a chat to a specific captured memory using Chat on a Timeline summary—so you can get AI help implementing a feature in a Python app.
As a developer, a common daily task is reviewing a ticket in a tool like GitHub Issues or Jira and then implementing it in a codebase.
This often means switching back and forth between the code and the ticket, and that constant context switching can hurt your productivity.
With Pieces, Long-Term Memory captures the ticket as you read it. As you work in your project, Pieces captures those memories too, and you can open Conversational Search scoped to the right Timeline summary using the Chat action in the three-dots menu.
Review a GitHub Issue
The first step is to review the issue so that Pieces captures it, then ask Conversational Search about it.
```plaintext
Summarize the "create a sign up page" issue I was just reading
```
Pieces will respond with a summary of the issue:
Clone the Project
This issue belongs to the sci-fi store project, a small web application written in Python and Flask for an upcoming retail store that sells themed sci-fi toys. Clone the project to a local folder on your device. If you use an IDE such as VS Code or JetBrains PyCharm, open that folder in your IDE.
Tie the Project to Conversational Search (Timeline Chat)
You cannot attach a project folder or individual files directly to Conversational Search. Instead, let Long-Term Memory capture your work in scifi_store, then scope a chat to the summary or event that contains that context.
For more detail, see [Chat from a summary](/products/desktop/timeline/event-actions#chat-from-a-summary).
Prompt Conversational Search
With the GitHub issue in LTM and a scoped chat tied to a summary that includes your project activity, you can ask Pieces how to implement the issue.
```plaintext
How can I implement this issue in this project?
```
The response may include concrete suggestions such as code for an endpoint, a new page using existing templates, and so on.
Review these code changes along with the original codebase.
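As a reference point, here is a minimal sketch of the kind of change such a response might propose for the "create a sign up page" issue. The route name, form fields, and inline template below are illustrative assumptions, not the actual scifi_store code or the exact output Pieces will produce.

```python
# Hypothetical sketch of a sign-up endpoint a response might suggest.
# Route, field names, and template are assumptions for illustration only.
from flask import Flask, request, render_template_string

app = Flask(__name__)

# In the real project this would live in a templates/ file, not a string.
SIGNUP_TEMPLATE = """
<form method="post">
  <input name="email" type="email" required>
  <input name="password" type="password" required>
  <button type="submit">Sign up</button>
</form>
"""

@app.route("/signup", methods=["GET", "POST"])
def signup():
    if request.method == "POST":
        email = request.form.get("email", "")
        # A real implementation would validate input and persist the user.
        return f"Signed up {email}", 201
    return render_template_string(SIGNUP_TEMPLATE)
```

Treat any generated code like this as a starting point: compare it against the project's existing templates and routing conventions before merging.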
<Image src="https://storage.googleapis.com/hashnode_product_documentation_assets/quick_guides/using_pieces_copilot_with_memory_context/new_media/asking_to_fix.png" alt="Conversational Search reasoning over the project and GitHub issue" align="center" fullwidth="true" />
Bonus—Try One Prompt
This Quick Guide used two prompts: one to get the details of the issue and another to learn how to implement it.
The two-stage approach was chosen to show the information captured by Pieces Long-Term Memory, but it isn't required; you can combine both steps into a single prompt.
```plaintext
How can I implement the "create a sign up page" issue I was just reading in this Python project?
```
<Image src="https://storage.googleapis.com/hashnode_product_documentation_assets/quick_guides/using_pieces_copilot_with_memory_context/new_media/extra_example.png" alt="Conversational Search implementing an issue with a single prompt" align="center" fullwidth="true" />