VS Code Copilot Chat can export traces, metrics, and events over OpenTelemetry (OTel). Because LangSmith ingests OTLP directly, you can point Copilot Chat at LangSmith and inspect agent turns, model metadata, tool calls, and token usage alongside the rest of your LLM traces. This guide is based on Copilot's Monitor agent usage with OpenTelemetry reference.
## Prerequisites
Before setting up tracing, ensure you have:

- A recent version of Visual Studio Code with GitHub Copilot Chat installed and signed in.
- A LangSmith API key.
## Configure tracing
Copilot Chat enables OTel emission when any of `COPILOT_OTEL_ENABLED`, `OTEL_EXPORTER_OTLP_ENDPOINT`, or the `github.copilot.chat.otel.enabled` setting is set. The simplest way to send Copilot Chat traces to LangSmith is to export the following environment variables before launching VS Code:
| Variable | Description |
|---|---|
| `COPILOT_OTEL_ENABLED` | Set to `true` to enable Copilot Chat OTel export. |
| `COPILOT_OTEL_PROTOCOL` | OTLP protocol. Use `http` to target LangSmith's HTTP OTLP ingestion endpoint. |
| `COPILOT_OTEL_ENDPOINT` | LangSmith OTLP endpoint. Takes precedence over `OTEL_EXPORTER_OTLP_ENDPOINT`. |
| `COPILOT_OTEL_CAPTURE_CONTENT` | Set to `true` to capture full prompts, responses, tool arguments, and tool results on spans. Off by default. |
| `OTEL_EXPORTER_OTLP_HEADERS` | Authentication headers for the OTLP exporter. Use `x-api-key=<your_langsmith_api_key>` and optionally `Langsmith-Project=<project>` to route traces to a specific LangSmith project. |
Export these variables in the environment VS Code inherits (for example, from `~/.zshrc`, `~/.bashrc`, or another shell profile) before starting the editor.
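For example, a minimal sketch for macOS or Linux, assuming the US SaaS OTLP endpoint (`https://api.smith.langchain.com/otel`); replace the placeholders and adjust the endpoint per the note below:

```bash
# Enable Copilot Chat's OTel export and point it at LangSmith.
export COPILOT_OTEL_ENABLED=true
export COPILOT_OTEL_PROTOCOL=http
export COPILOT_OTEL_ENDPOINT="https://api.smith.langchain.com/otel"

# Authenticate, and optionally route traces to a named LangSmith project.
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your_langsmith_api_key>,Langsmith-Project=<your_project_name>"

# Opt in to capturing prompts, responses, and tool payloads on spans.
export COPILOT_OTEL_CAPTURE_CONTENT=true
```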
Update the LangSmith endpoint for self-hosted installations or regional SaaS: GCP EU uses `eu.api.smith.langchain.com`; GCP APAC uses `apac.api.smith.langchain.com`; AWS US uses `aws.api.smith.langchain.com`. For self-hosted LangSmith, append `/api/v1/otel` to your LangSmith API URL, for example `https://ai-company.com/api/v1/otel`.

### Alternative: VS Code settings
If you prefer not to set environment variables, you can enable OTel from VS Code settings instead. Open Settings (⌘, / Ctrl+,), search for `copilot otel`, and enable the `github.copilot.chat.otel.enabled` setting (plus `github.copilot.chat.otel.captureContent` if you want content capture). Authentication still requires the `OTEL_EXPORTER_OTLP_HEADERS` environment variable; VS Code settings do not expose a header field. Environment variables also take precedence over VS Code settings when both are set.
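A minimal `settings.json` sketch covering only the settings this guide names (your VS Code version may expose additional `copilot otel` settings):

```json
{
  "github.copilot.chat.otel.enabled": true,
  "github.copilot.chat.otel.captureContent": true
}
```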
## View traces in LangSmith
Start a Copilot Chat session and send a request. Open your LangSmith project to view the resulting traces. Each agent interaction produces a hierarchical span tree following the OTel GenAI Semantic Conventions:

- `invoke_agent` spans wrap the full agent orchestration, including agent name, conversation ID, turn count, and total token usage.
- `chat` spans capture individual LLM API calls with model, token counts, response time, and finish reason.
- `execute_tool` spans capture tool invocations with tool name, type, duration, and success status.
When an agent delegates to a subagent, the subagent's `invoke_agent` span appears as a child of the parent's `execute_tool` span in LangSmith.
## Add custom resource attributes
Use `OTEL_RESOURCE_ATTRIBUTES` to tag every trace with organizational metadata such as team or environment:
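For example (the keys shown are illustrative; `OTEL_RESOURCE_ATTRIBUTES` accepts any comma-separated `key=value` pairs):

```bash
# Attach resource attributes to every span the exporter emits.
export OTEL_RESOURCE_ATTRIBUTES="team=developer-experience,deployment.environment=production"
```

These attributes appear on every emitted span, so you can filter traces by team or environment in LangSmith.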
## Troubleshooting
- **No traces appear in LangSmith.** Confirm `COPILOT_OTEL_ENABLED=true` and that VS Code was launched from the shell where the variables are exported (see the check after this list). Restart VS Code after changing environment variables.
- **401 / 403 errors.** Verify `OTEL_EXPORTER_OTLP_HEADERS` includes `x-api-key=<your_langsmith_api_key>` and that the API key belongs to the workspace you want to trace into.
- **Traces land in the wrong project.** Set `Langsmith-Project=<your_project_name>` in `OTEL_EXPORTER_OTLP_HEADERS`. If unset, traces go to the workspace's `default` project.
- **Prompts and responses are missing.** Content capture is opt-in. Set `COPILOT_OTEL_CAPTURE_CONTENT=true` (or enable the `github.copilot.chat.otel.captureContent` setting).
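A quick way to confirm VS Code will inherit the variables (macOS/Linux, assuming the `code` CLI is on your `PATH`):

```bash
# Verify the relevant variables are set in the current shell...
env | grep -E 'COPILOT_OTEL|OTEL_EXPORTER'

# ...then launch VS Code from this same shell so it inherits them.
code .
```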
## Related resources
- VS Code Copilot: Monitor agent usage with OpenTelemetry
- Trace with OpenTelemetry
- Log traces to a project