MCP sampling/createMessage response includes system prompt text in content #2467

@will-j-wright

Description

Describe the bug

When an MCP server sends a sampling/createMessage request, the Copilot CLI correctly routes it to the LLM and returns a response. However, the returned content has the entire system prompt text prepended to the LLM's actual answer.

Affected version

GitHub Copilot CLI 1.0.16-0

Steps to reproduce the behavior

  1. Configure an MCP server that uses sampling/createMessage with a system prompt
  2. Have the server send a sampling request where it expects a specific response format
  3. Approve the sampling request when prompted
  4. Observe the response content includes the system prompt prepended to the LLM's answer
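The exchange in the steps above can be sketched as follows. This is a minimal illustration, not the CLI's actual code: the field names follow the MCP sampling/createMessage schema, but the prompt strings, model name, and the `echoes_system_prompt` helper are invented for this example.

```python
# Hypothetical sampling/createMessage exchange (prompt text is made up).
system_prompt = "Respond with '=== TRUE' or '=== FALSE' only."

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {"role": "user",
             "content": {"type": "text", "text": "Is 2 + 2 equal to 5?"}},
        ],
        "systemPrompt": system_prompt,
        "maxTokens": 16,
    },
}

# Expected CreateMessageResult: content holds only the model's answer.
expected = {
    "role": "assistant",
    "content": {"type": "text", "text": "=== FALSE"},
}

# Result observed from Copilot CLI: the system prompt is echoed first.
buggy = {
    "role": "assistant",
    "content": {"type": "text", "text": system_prompt + "... === FALSE"},
}

def echoes_system_prompt(result, prompt):
    """Detect the bug: the response text starts with the system prompt."""
    return result["content"]["text"].startswith(prompt)

print(echoes_system_prompt(expected, system_prompt))  # False
print(echoes_system_prompt(buggy, system_prompt))     # True
```

A server expecting the strict `=== TRUE` / `=== FALSE` format would fail to parse the buggy response, since the answer is buried after the echoed prompt.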

Expected behavior

The CreateMessageResult.Content should contain only the LLM's response text, e.g.:

=== FALSE

Additional context

Actual behavior

The response content contains the system prompt echoed back, followed by the LLM's answer:

<entire system prompt text>... === FALSE

Note

This works correctly in VS Code's MCP sampling implementation — the system prompt is not included in the response content.
