LTM Context Toggle
Control whether Conversational Search includes Long-Term Memory (LTM) context in your conversations. When enabled, your chats automatically draw from up to 9 months of captured workflow context.
Enabling or Disabling LTM Context
User profile menu showing LTM-2.7 hover menu with pause and turn off options
You can also set whether LTM context is on by default for new chats via the gear icon in the LLM runtime settings.
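Conceptually, the toggle has three states (the hover menu shown above offers both pause and turn off), and the runtime-level default is inherited by each new chat. The sketch below is purely illustrative; names like `LtmState`, `RuntimeSettings`, and `new_chat` are hypothetical and do not reflect a documented API.

```python
from dataclasses import dataclass
from enum import Enum

class LtmState(Enum):
    ON = "on"          # chats draw from captured workflow context
    PAUSED = "paused"  # temporarily stop using LTM context in chats
    OFF = "off"        # LTM context is turned off entirely

@dataclass
class RuntimeSettings:
    # Default LTM state applied to every new chat
    # (set via the LLM runtime settings gear).
    ltm_default: LtmState = LtmState.ON

@dataclass
class Chat:
    ltm_state: LtmState

def new_chat(settings: RuntimeSettings) -> Chat:
    # A new chat inherits the runtime-wide default; the user can
    # still toggle LTM context per chat afterwards.
    return Chat(ltm_state=settings.ltm_default)
```

For example, pausing LTM at the runtime level would mean every subsequently created chat starts with LTM context paused until you change it.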
Starting a Conversation from a Timeline Event
Start context-specific chats directly from any Timeline Event. When you start a conversation with a Timeline Event, it opens in Conversational Search with that event's full context pre-loaded and displayed as an information card.
Timeline Event detail showing Start Related Chat opening Conversational Search with pre-loaded context
You can also use the three-dots menu (⋮) on any event and choose Chat to scope a conversation to that item. See Chat from a summary.
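The flow above can be modeled as a simple data handoff: the selected event's context is pre-loaded into the new conversation and surfaced as an information card. This is only a sketch of that behavior; `TimelineEvent`, `Chat`, and `start_related_chat` are assumed names, not a real product API.

```python
from dataclasses import dataclass

@dataclass
class TimelineEvent:
    event_id: str
    summary: str   # shown as the information card at the top of the chat
    context: str   # full captured workflow context for this event

@dataclass
class Chat:
    context_card: str
    scoped_context: str

def start_related_chat(event: TimelineEvent) -> Chat:
    # Opening a chat from a Timeline Event pre-loads that event's
    # full context and displays its summary as an information card.
    return Chat(context_card=event.summary, scoped_context=event.context)
```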
Chat Pipelines
When you start a New Chat, you can pick a chat pipeline that shapes how the model uses context:
| Pipeline | Type | Use case |
|---|---|---|
| Generally discuss technical topics | Multipurpose | General technical discussion and mixed modalities. |
| Ask questions about a local code base | Project-oriented comprehension | Optimized when LTM has captured relevant IDE or repo activity. |
| Generate code for a local project | Project-oriented generation | Optimized when recent workflow memories include the project. |
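One way to picture the table above is as a mapping from the chosen pipeline to a context-retrieval scope: the multipurpose pipeline recalls broadly, while the two project-oriented pipelines favor LTM memories tied to a local project. The mapping and the scope labels below are illustrative assumptions, not documented behavior.

```python
# Hypothetical mapping from pipeline choice to how LTM context is scoped.
PIPELINES = {
    "Generally discuss technical topics": "broad",       # general recall, mixed modalities
    "Ask questions about a local code base": "project",  # favor captured IDE/repo activity
    "Generate code for a local project": "project",      # favor recent project memories
}

def context_scope(pipeline: str) -> str:
    # Unrecognized pipeline names fall back to broad retrieval.
    return PIPELINES.get(pipeline, "broad")
```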
Viewing the Relevant Summaries Sidebar
After you receive a response, the Relevant Summaries sidebar shows exactly which Timeline Events were used to generate it.
Relevant Summaries sidebar showing Timeline Events used to generate the response
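Under the hood, this amounts to a response carrying references to the Timeline Events that grounded it, which the sidebar then resolves into readable summaries. The sketch below assumes that shape; `Response`, `source_event_ids`, and `relevant_summaries` are hypothetical names for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Response:
    text: str
    # Timeline Event ids that grounded this answer.
    source_event_ids: list = field(default_factory=list)

def relevant_summaries(response: Response, events_by_id: dict) -> list:
    # Resolve the ids attached to a response into the event summaries
    # displayed in the Relevant Summaries sidebar; ids with no known
    # event are skipped.
    return [events_by_id[i] for i in response.source_event_ids if i in events_by_id]
```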
Learn how to choose and manage AI models in Models.