ASK QUESTION
Direct answers from your Context Link, one paragraph, cited
A Claude skill that asks your Context Link a direct question and returns a single concise paragraph with numbered citations, grounded only in your connected sources.
When does this skill activate?
Claude will use this skill when you mention phrases like "ask my context", "what does my knowledge base say about…", or "answer this from my docs", or when you ask a direct question whose answer should be grounded in your own content.
How it works
Ask a direct question
Invoke the skill (/ask-question [question]) or use natural language ("ask Context Link what our refund policy is").
Semantic retrieval + grounded synthesis
Context Link runs the same vector search as Get Context, then passes the matching snippets to a small fast LLM that composes one concise paragraph using only that material.
Cited answer returned
Claude receives one paragraph with inline [1] / [2] citation markers plus a sources list. Every claim traces back to a snippet in your own content.
Requirements
A Context Link on the Pro plan. The ask-question endpoint returns `402` on other plans.
Add This Skill
Copy each field into Claude's skill editor to add this skill, or add the plugin marketplace to get all skills at once.
ask-question
---
name: ask-question
description: >
Ask a question against the user's Context Link and receive a concise, cited
answer drawn from their connected sources. Use when the user says "ask my
context", "what does my knowledge base say about…", "answer this from my
docs", or asks a direct question whose answer should be grounded in the
user's own content. Pro-plan feature.
version: 0.1.0
---
**What is Context Link?** Context Link is an external service that indexes connected sources (websites, Google Drive, Notion, files, email) and memories into a searchable knowledge base. The ask-question endpoint returns a single concise answer (≤ 1 paragraph) with citations, grounded in the user's own content. If you don't know the user's Context Link URL, ask them for it.
> **Pro plan:** the ask-question endpoint is a Pro-plan feature. A `402` response means the user needs to upgrade.
---
## Ask a Question via Context Link
Ask a direct question and receive a short, cited answer from the user's Context Link.
**When to use:** the user wants a direct answer grounded in their own knowledge base, e.g. "ask my docs…", "ask context link what …", "use context link to answer…", or invokes `/ask-question` in Claude Code. Prefer this over `get-context` when the user wants the answer itself rather than raw source material.
**Workflow:**
1. **Print this message:** `🔗 Asking Context Link about {TOPIC}`. Never print the full Context Link URL, as it contains a private pin.
2. **Form the question slug.** Lowercase the user's question and replace spaces with dashes.
3. **Send a GET request.** Take the URL below and replace the path placeholder with `q/{QUESTION_SLUG}`.
```bash
curl -s "~~context link url~~"
```
- Optionally append `&mode=MODE_NAME` (or `?mode=MODE_NAME` if no pin is set) to weight results toward a specific mode configured by the user.
4. **Handle the response.** The response is JSON:
```json
{
"result": "ok",
"answer": "Concise paragraph with inline [1] citation markers.",
"citations": [{ "n": 1, "title": "Refund policy", "url": "https://…" }],
"format": "markdown"
}
```
- Present the `answer` field to the user verbatim (or lightly reformatted). **Keep the inline `[1]`, `[2]` citation markers.**
- After the answer, list the `citations` as a small "Sources" footer (title + url).
- Never fabricate citations. Only surface those returned by the endpoint.
5. **Handle the other result states:**
- `result: "no_context"`: tell the user their Context Link had no relevant material and ask whether to broaden the question.
- `result: "llm_unavailable"` (HTTP 503): the upstream LLM is momentarily down; retry once, then stop and tell the user.
6. **If the request fails or is blocked**, report the failure explicitly. Do not silently skip.
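The slug and response-handling steps above can be sketched in Python (the language the API docs also use). `slugify` and `render` are illustrative helper names, not part of Context Link itself; the actual GET request is omitted because the real URL carries a private pin:

```python
import re

def slugify(question: str) -> str:
    """Step 2: lowercase the question and replace runs of whitespace with dashes."""
    return re.sub(r"\s+", "-", question.strip().lower())

def render(data: dict) -> str:
    """Step 4: present the answer verbatim, keeping the inline [n] citation
    markers, then list the returned citations as a Sources footer."""
    if data.get("result") == "no_context":
        return "Context Link had no relevant material. Broaden the question?"
    lines = [data["answer"], "", "Sources:"]
    for c in data.get("citations", []):
        lines.append(f"[{c['n']}] {c['title']} ({c['url']})")
    return "\n".join(lines)
```

The request itself is then a single GET to `{base_url}/q/{slugify(question)}`, with the response JSON passed to `render`.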
**Rules:**
- One GET request per question. Do not retry unless the user asks.
- Do not supplement with outside information unless the user explicitly asks. The value is the grounded, cited answer.
- If the response is `402`, tell the user the ask-question feature requires the Pro plan and link them to `www.context-link.ai`.
- If the response is `429` with an "allowance" message, tell the user they have exceeded their monthly Context Link question allowance.
- If the request is blocked, ask the user to add `*.context-link.ai` to Claude's **Settings → Capabilities → Domain Allowlist** (or select "All domains"), then retry.
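The status-code rules above can be collapsed into one small dispatch. A rough sketch; only the status codes come from the skill text, and the message wording is illustrative:

```python
def explain_failure(status: int) -> str:
    """Map the HTTP statuses named in the rules to user-facing messages."""
    if status == 402:
        return "Ask Question requires the Pro plan; see www.context-link.ai."
    if status == 429:
        return "Monthly Context Link question allowance exceeded."
    if status == 503:
        return "Upstream LLM momentarily unavailable; retry once, then stop."
    return f"Request failed with HTTP {status}. Not retrying."
```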
Related Skills
/get-context
Retrieve internal knowledge via Context Link when the user references company knowledge or says "...
/save-memory
Save conversation content or specified text to Context Link for later retrieval. Use when the use...
/update-memory
Retrieve saved content from Context Link, update it with current context, and save it back. Use w...
Frequently Asked Questions
How is Ask Question different from Get Context?
Get Context returns the raw relevant snippets from your sources, ideal when the AI will reason over the material itself. Ask Question runs the same retrieval, then passes those snippets to a small fast LLM that composes one concise paragraph with numbered citations, ideal when you want a direct answer, not source chunks. Use Get Context for generative tasks. Use Ask Question when you want the answer itself.
Will the answer be made up?
No. The LLM is instructed to use only the numbered context blocks from your sources. Every [1], [2] citation marker in the answer resolves to a real snippet in the citations array, with the title and URL of the source. Claims that aren't supported by your connected sources are not generated. If retrieval returns nothing relevant, the skill returns result: "no_context" and says so explicitly, rather than fabricating.
Why is this a Pro feature?
Ask Question makes a real LLM call per request, so it incurs per-query cost. Pro plans include a monthly allowance of 1,000 LLM requests, shared across all LLM-powered features (including Get Context's reranking). Starter plans see a 402 Payment Required response with an upgrade prompt.
What happens if the upstream LLM fails?
The skill returns result: "llm_unavailable" with HTTP 503. This is distinct from no_context (nothing relevant found), so Claude knows to retry the request rather than telling the user "nothing was found." Retry once; if it fails again, surface the failure to the user.
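The distinction above can be expressed directly: retry applies only to llm_unavailable, never to no_context. A minimal sketch, where `fetch` is a stand-in for the actual GET request:

```python
def ask_with_retry(fetch) -> dict:
    """Retry exactly once on llm_unavailable; treat no_context as final."""
    data = fetch()
    if data.get("result") == "llm_unavailable":
        data = fetch()  # single retry, then surface whatever comes back
    return data
```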
Does it work in Claude.ai chat?
Yes. This skill works in Claude.ai, ChatGPT, Gemini, Copilot, and any AI tool that can make HTTP requests. It calls a URL on your Context Link subdomain; no code execution is required.
Where is the full technical reference?
See the Question Endpoint API docs for the complete request/response schema, status codes, caching behaviour, and cross-language examples (Python, JavaScript, Ruby).
Start asking grounded questions
Upgrade to Pro and invoke the Ask Question skill from inside Claude, ChatGPT, Gemini, or Copilot.
Upgrade to Pro