πŸ”Prompt Capture

Prompt Capture allows you to view the complete context of your AI interactions, including system prompts, input messages, and AI-generated responses. This feature is essential for debugging, auditing, and optimizing your AI workflows.

## What is Prompt Capture?

Prompt Capture stores and displays the full content of AI completions:

  • System Prompt: The initial instructions given to the AI model

  • Input Messages: User messages and conversation history sent to the model

  • Output Response: The AI-generated response

This visibility helps you:

  • Debug unexpected AI behavior

  • Audit AI interactions for compliance

  • Optimize prompts for better results

  • Understand token usage and costs

## How to Enable Prompt Capture

### Step 1: Enable in Team Settings


Required Role: Only users with the Tenant Administrator role can enable or disable Prompt Capture.

  1. Navigate to Management β†’ Teams

  2. Select your team

  3. Find the AI Prompt Capture section

  4. Toggle Enable Prompt Capture to ON

  5. Click Save Changes

### Step 2: Configure Your SDK

Prompt capture must also be enabled in your SDK integration. The setting name depends on the SDK: capture_prompts=True for Python, capturePrompts: true for Node.js, or the REVENIUM_CAPTURE_PROMPTS=true environment variable for Go/MCP.

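As a minimal Python-side sketch (the environment-variable form is the one documented for the Go/MCP integration; the client class in the comment is a placeholder, not a confirmed SDK name):

```python
import os

# Setting the environment variable before the SDK initializes enables capture
# for integrations that read REVENIUM_CAPTURE_PROMPTS (documented for Go/MCP).
os.environ["REVENIUM_CAPTURE_PROMPTS"] = "true"

# For the Python SDK, pass the capture_prompts flag when configuring the
# client. The class name below is a placeholder -- check your SDK's reference
# for the actual constructor:
#
#   client = ReveniumClient(capture_prompts=True)  # hypothetical name
```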

## Viewing Prompt Data

Once enabled, you can view prompt data in two places:

### AI Transaction Log

  1. Go to Logs β†’ AI Transaction Log

  2. Click the expand icon (↔) next to the delete button to open the transaction details modal

  3. Look for the Prompt Data section

  4. Click View Prompt Data to open the full viewer

### Traces Page

  1. Go to Traces

  2. Select a trace from the table

  3. Click on a transaction in the Transaction Table

  4. In the drawer, click View Prompt Data

## Understanding the Prompt Viewer

The Prompt Viewer modal displays your AI interaction in three tabs:

| Tab | Description |
| --- | --- |
| System Prompt | The initial instructions that define the AI's behavior and context |
| Input Messages | The conversation history, including user messages and any previous assistant responses |
| Output Response | The AI-generated response for this completion |

### Context Summary

At the top of the viewer, you'll see key metrics:

  • Model: The AI model used (e.g., claude-3-5-sonnet, gpt-4o)

  • Provider: The AI provider (e.g., Anthropic, OpenAI)

  • Cost: Total cost of this completion

  • Tokens: Input, output, and cached token counts

  • Duration: Request processing time

## Prompt Truncation

To manage storage efficiently, prompts are truncated when they exceed the limit described below.

### How to Identify Truncated Data

  • A warning banner appears at the top of the Prompt Viewer

  • The promptsTruncated field is set to true in the API response

### Truncation Limits

Prompts exceeding 50,000 characters are automatically truncated; this limit is system-wide.
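
If you want a rough idea up front of whether a prompt will be stored in full, a simple client-side length check against the 50,000-character limit works; whether the limit applies to the combined text or to each part separately is an assumption in this sketch. After the fact, the promptsTruncated flag on the API response is the authoritative signal.

```python
PROMPT_CAPTURE_LIMIT = 50_000  # system-wide character limit noted above

def likely_truncated(system_prompt: str, messages: list[str], response: str) -> bool:
    """Rough pre-flight check; assumes the limit applies to the combined text."""
    total_chars = len(system_prompt) + sum(len(m) for m in messages) + len(response)
    return total_chars > PROMPT_CAPTURE_LIMIT
```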

## Privacy and Security


Data Security: All prompt data is encrypted at rest and in transit. Prompts are stored securely and are accessible only to authorized team members.

## Troubleshooting

### Prompts Not Appearing?

  1. Check Team Settings: Ensure Prompt Capture is enabled for your team

  2. Check SDK Config: Verify the prompt capture setting is enabled in your SDK:

    β€’ Python: capture_prompts=True

    β€’ Node.js: capturePrompts: true

    β€’ Go/MCP: REVENIUM_CAPTURE_PROMPTS=true environment variable

  3. Check hasPromptData: The transaction must have hasPromptData: true (a programmatic check is sketched after this list)

  4. Recent Transactions: Only transactions created after capture was enabled will contain prompt data
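
If you pull transactions through the API, the two documented fields, hasPromptData and promptsTruncated, are enough to triage them programmatically. The sketch below assumes you already have the transaction records as dictionaries, since the query endpoint itself isn't covered on this page.

```python
def triage_prompt_capture(transactions: list[dict]) -> None:
    """Report prompt-capture status using the hasPromptData / promptsTruncated
    fields named on this page; the rest of the record shape is assumed."""
    for tx in transactions:
        tx_id = tx.get("id", "<unknown>")
        if not tx.get("hasPromptData"):
            print(f"{tx_id}: no prompt data captured")
        elif tx.get("promptsTruncated"):
            print(f"{tx_id}: prompt data captured, but truncated")
        else:
            print(f"{tx_id}: prompt data captured in full")
```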

"No Prompt Data Available" Message

This message appears when:

  β€’ Prompt capture was not enabled in team settings when the transaction occurred

  β€’ The SDK did not send prompt data with the transaction

  β€’ The transaction predates enabling prompt capture
