# Skill: Save Memory
The save_memory skill teaches Claude or Codex how to store content from the current conversation into Context Link as a Memory. Memories are living documents saved under any /slash route — like /brand-voice, /roadmap, or /meeting-notes — that any AI session can retrieve and build on later.
Instead of losing a great output when you close the chat, you tell your AI to save it. Next time — in a new conversation, a different project, or even a different AI tool — you (or your AI) can fetch it back.
## What are Memories?
Memories are the WRITE side of Context Link. While the get_context skill handles READ (searching your connected sources), Memories give your AI a separate workspace to save and maintain its own documents.
Key things to know:
- Saved under any `/slash` route — `/brand-voice`, `/keyword-tracker`, `/support-faq`, `/roadmap`, whatever makes sense
- Separate from your synced sources — Memories never modify your Notion pages, Google Docs, or websites. They live alongside them but are completely independent.
- Shared across sessions — save something in Claude, retrieve it tomorrow in ChatGPT, Copilot, or any other AI tool that can reach your Context Link
- Versioned — saving to the same slug creates a new version. The latest always wins on retrieval, but old versions are preserved.
## When to use this skill
- You've had a productive conversation and want to keep the key takeaways for future sessions
- You want to build up reusable assets over time — brand voice docs, style guides, FAQ answers, project specs
- You're summarising research and want it available for future AI sessions across any tool
- You want to save something under a specific topic name so it's easy to find later
For updating content that's already been saved, see the Update Memory skill.
## How it works
When triggered, the skill:
- Picks a slug — the name you asked for, `chat-session-{datetime}` for generic "save this conversation" requests, or one inferred from the conversation topic
- Distils the conversation into a clean, reusable markdown document (stripping conversational back-and-forth, keeping decisions, specs, and key details)
- POSTs it to your Context Link under that slug
The content is then chunked, embedded, and available for retrieval immediately — via the get_context skill, a direct link, the ChatGPT connector, or the API.
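The save flow above can be sketched as a small shell helper. This is an illustrative sketch only — the `save_memory` function name and the `CONTEXT_LINK` value are placeholders, not part of the skill itself:

```shell
#!/bin/sh
# Sketch of the save flow: announce the slug, then POST the markdown body.
# CONTEXT_LINK is a placeholder; use your own Context Link URL.
CONTEXT_LINK="https://yourname.context-link.ai"

save_memory() {
  # $1 = slug (already lowercased and dashed), $2 = markdown body
  echo "🔗 Saving memory to Context Link → $1"
  curl -s -X POST "$CONTEXT_LINK/$1" \
    -H "Content-Type: text/plain" \
    -d "$2"
}
```

Usage would look like `save_memory brand-voice "$(cat brand-voice.md)"` — one POST, raw markdown body, slug in the URL path.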
## The skill file
Below is the full skill definition. Replace YOUR_CONTEXT_LINK with your personal Context Link URL (e.g. yourname.context-link.ai). If you use a PIN, append ?p=YOUR_PIN to the URL.
You can download a pre-populated version from the Installation page in your dashboard — no manual editing needed.
**What is Context Link?** Context Link is an external service that indexes connected sources (websites, Google Drive, Notion) and memories into a searchable knowledge base. It provides semantic search and memory storage via a simple URL: `subdomain.context-link.ai/query?p=optional_pincode`. If you don't know the user's Context Link URL, ask them for it.
---
### Save to Context Link
Save content from the current conversation to Context Link. One request, no fuss.
**Workflow:**
1. **Pick the slug.** If the user said "save {name}", use that name as the slug (lowercased, dashed). If the user said "save this conversation", "save this chat", or similar without a specific topic name, use `chat-session-{YYYY-MM-DD-HHMM}` as the slug. Otherwise, infer a short descriptive slug from the topic. Only ask if truly ambiguous.
2. **Summarize the content.** Distill the conversation into a concise, reusable reference document in markdown. Focus on what's useful to retrieve later — strip conversational back-and-forth, keep decisions, specs, and key details.
3. **Print this message:** `🔗 Saving memory to Context Link → {SLUG}` — Never print the actual Context Link URL, as it contains a private 'pin' or 'p' URL param.
4. **POST it.**
```bash
curl -s -X POST "YOUR_CONTEXT_LINK/{SLUG}" \
-H "Content-Type: text/plain" \
-d 'Your markdown content here'
```
The body is raw text/markdown — not JSON. The server's `LlmPoweredParser` handles chunking and structuring internally. Just send clean markdown.
The slug goes in the URL path. Replace `{SLUG}` with the chosen name (lowercase, use-dashes-for-spaces).
**Success response:** `{"message": "Saved", "namespace": "{SLUG}"}` with HTTP 201.
**Rules:**
- **Keep the body under 100KB.** If content is longer, summarize or condense it before sending. Do NOT split into multiple requests — distill into one concise document.
- If the user says "save X" — X is the slug. Always. No questions asked.
- Do NOT test the endpoint first. It works. Just POST.
- Do NOT verify by fetching it back. Trust the 201.
- Do NOT send multiple requests. One POST, done.
- Saving to the same slug creates a new version (latest wins on GET).
- After saving, confirm briefly: "Saved to Context Link as `{SLUG}`."
- If the request is blocked, ask the user to add `*.context-link.ai` to Claude's **Settings → Capabilities → Domain Allowlist** (or select "All domains"), then retry.
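The 100KB body rule above is easy to check before posting. A sketch using standard tools (`memory.md` is a hypothetical draft file):

```shell
# Verify a draft memory stays under the 100KB body limit before POSTing.
printf '%s\n' "# Q2 Roadmap" "- Ship billing v2" > memory.md   # sample draft

size=$(wc -c < memory.md)
if [ "$size" -gt 102400 ]; then
  echo "Too large ($size bytes); condense into one concise document first." >&2
else
  echo "OK to save: $size bytes"
fi
```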
## Installation
### Claude (Chat, Cowork, and Code)
Skills work with Claude Chat, Claude Cowork, and Claude Code. (If you're using Claude Cowork, you can also use the Context Link Plugin to install all skills at once.)
To install this skill:
- In Claude Chat or Claude Cowork, click Customize in the top right
- Click Skills, then +, then Upload a skill
- Upload the skill file (download from Installation)
- Allow network access (see below)
For Claude Code, place the skill file in your project's .claude/skills/ directory.
#### Allow network access
For Claude to reach Context Link, you need to allow the domain in Claude's settings:
- In Claude, go to Settings → Capabilities → Domain Allowlist
- Add `*.context-link.ai` to the allowlist (or select "All domains")

Once installed, tell Claude to "save this as [topic]" and it handles the rest.
### OpenAI Codex
- Download the skill files from Installation
- Unzip and place the `save_memory` folder into `~/.agents/skills/` (or your repo's `.agents/skills/` directory)
- Codex auto-detects the skill
See the Codex skills documentation for more on skill file locations and priority.
## Example usage
Saving a brand voice doc after refining it in conversation:
You: Save this as brand-voice
AI: Summarises the brand voice guidelines from the conversation and POSTs them to your Context Link under `/brand-voice`. "Saved to Context Link as `brand-voice`."
Saving the whole conversation for later:
You: Save this conversation to Context Link.
AI: Uses `chat-session-2025-01-15-1430` as the slug, distils the conversation into clean markdown, and saves it. "Saved to Context Link as `chat-session-2025-01-15-1430`."
Saving research for later use across tools:
You: Save this to Context Link — I want to reference this competitor analysis tomorrow.
AI: Infers the slug `competitor-analysis`, distils the key findings into clean markdown, and saves it. Now retrievable from any AI tool via your Context Link.
Building up a knowledge base over time:
You: We just nailed down our Q2 roadmap priorities. Save this as q2-roadmap.
AI: Extracts the roadmap decisions and saves them under `/q2-roadmap`. Next week, any team member can ask their AI to "get context on q2-roadmap" and get the latest version.
## Slugs and versioning
- Slugs are case-insensitive and normalised (spaces become dashes)
- Saving to the same slug creates a new version — the old content is preserved, but retrieval always returns the latest
- Use descriptive slugs: `/brand-voice`, `/q2-roadmap`, `/onboarding-checklist`, `/keyword-tracker`
- Memories are searchable alongside your synced sources — so when you "get context on brand voice", both your saved Memory and any relevant synced docs come back
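The normalisation rule (case-insensitive, spaces become dashes) can be reproduced with standard tools — a sketch with an illustrative topic name:

```shell
# Slugs are lowercased, with spaces turned into dashes.
slug=$(printf '%s' "Q2 Roadmap" | tr '[:upper:]' '[:lower:]' | tr ' ' '-')
echo "$slug"   # q2-roadmap
```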
Don't have a Context Link account yet? Sign up and connect your first source — start saving and retrieving your AI's best outputs in minutes.