MCP sampling/createMessage response includes system prompt text in content #2467
Description
Describe the bug
When an MCP server sends a sampling/createMessage request, the Copilot CLI correctly routes it to the LLM and returns a response. However, the response content includes the entire system prompt text prepended to the LLM's actual answer.
Affected version
GitHub Copilot CLI 1.0.16-0
Steps to reproduce the behavior
- Configure an MCP server that uses `sampling/createMessage` with a system prompt
- Have the server send a sampling request where it expects a specific response format
- Approve the sampling request when prompted
- Observe that the response content includes the system prompt prepended to the LLM's answer
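For context, a minimal sketch of the sampling request an MCP server would send in this scenario. Field names follow the MCP `sampling/createMessage` specification; the prompt text, question, and token limit are made-up illustrations, not taken from the affected server:

```python
# Hypothetical sampling/createMessage request (JSON-RPC 2.0), as an MCP
# server might send it. The systemPrompt instructs the model to reply in
# a strict format -- which is what makes the echoed prompt easy to spot.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {
                "role": "user",
                "content": {
                    "type": "text",
                    "text": "Does this change break the public API?",  # made-up question
                },
            }
        ],
        "systemPrompt": "Answer with '=== TRUE' or '=== FALSE' only.",  # made-up prompt
        "maxTokens": 64,
    },
}
```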
Expected behavior
The `CreateMessageResult.content` should contain only the LLM's response text, e.g.:
```
=== FALSE
```
Actual behavior
The response content contains the system prompt echoed back, followed by the LLM's answer:
```
<entire system prompt text>... === FALSE
```
Additional context
Note: This works correctly in VS Code's MCP sampling implementation; the system prompt is not included in the response content.
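The difference between the two behaviors can be sketched as follows. The result shape follows the MCP `CreateMessageResult` (role, content, model, stopReason); the prompt and model strings are hypothetical, and `content_is_clean` is an illustrative helper, not part of any SDK:

```python
# Hypothetical result shapes illustrating the bug report.
system_prompt = "Answer with '=== TRUE' or '=== FALSE' only."  # made-up prompt

# What the server expects back: only the model's answer.
expected_result = {
    "role": "assistant",
    "content": {"type": "text", "text": "=== FALSE"},
    "model": "example-model",
    "stopReason": "endTurn",
}

# What Copilot CLI 1.0.16-0 reportedly returns: the system prompt echoed
# in front of the answer, breaking any server that parses a strict format.
buggy_result = {
    "role": "assistant",
    "content": {"type": "text", "text": system_prompt + "... === FALSE"},
    "model": "example-model",
    "stopReason": "endTurn",
}

def content_is_clean(result: dict, system_prompt: str) -> bool:
    """True if the result text does not start with the echoed system prompt."""
    return not result["content"]["text"].startswith(system_prompt)
```

A server expecting `=== FALSE` as the literal reply fails to match when the echoed prompt is prepended, which is why the strict response format in the repro makes the bug visible.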