Long-Term Memory Prompting Guide

Learn some top tricks and prompt suggestions to get the most out of the Pieces Long-Term Memory and Workstream Activities view.


Why This Matters

Over the years, LLM users have learned how to prompt LLMs effectively, using tricks like prompt chaining.

With Pieces Long-Term Memory, you can add more dimensions to your prompts, querying across captured memories using keywords related to your activities, applications, or time periods.

These guides show how to query the stored LTM context using the Pieces Copilot in the Pieces Desktop App or any app with a Pieces plugin or extension.

When using these prompts, make sure LTM-2.5 is turned on, with both the Long-Term Memory engine enabled and the LTM context source active in the copilot chat.


Click one of the cards below to jump to that guide.

General Long-Term Memory Prompting Tips

Some general prompting tips to help you get the most out of Pieces.
