# For AI Agents

## Context7 (Recommended)

[Context7](https://context7.com) provides automatic documentation retrieval via MCP. Your AI agent pulls relevant Revenium docs on demand, and it works with any MCP-compatible tool (Claude Code, Cursor, Windsurf, Continue, Cline, GitHub Copilot).

Add the Context7 MCP server to your agent's configuration:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```

Refer to your tool's MCP documentation for the configuration file location.

### Revenium Libraries on Context7

**Documentation:**

| Library            | Content                                                   |
| ------------------ | --------------------------------------------------------- |
| **API Reference**  | Complete REST API reference with request/response schemas |
| **Knowledge Base** | Guides, quickstarts, and platform concepts                |

**SDKs and Tools:**

| Library             | Content                                                          |
| ------------------- | ---------------------------------------------------------------- |
| **Python SDK**      | Unified Python middleware for OpenAI, Anthropic, Google, LiteLLM |
| **Node Middleware** | Unified TypeScript middleware for all supported providers        |
| **MCP Server**      | Zero-code AI metering via Model Context Protocol                 |
| **Claude Code SDK** | Claude Code telemetry export and usage tracking                  |
| **CLI**             | Command-line interface for the Revenium platform                 |

When querying Context7, use `revenium` as the search term to find these libraries.

***

## llms.txt

For browser-based AI tools (ChatGPT, Claude.ai, Gemini) or when MCP is unavailable, paste this URL into your conversation:

[**https://revenium.readme.io/llms.txt**](https://revenium.readme.io/llms.txt)

***

## OpenAPI Specs

Full OpenAPI (OAS) documents are available for download at:

[**Revenium OpenAPI Spec Downloads**](https://revenium.readme.io/reference/llm-agent-friendly-api-specs-for-revenium)

{% hint style="warning" %}
These are large files intended for code generation tools and schema validation, not for pasting into AI chat. For AI agent access to Revenium APIs, use Context7 or llms.txt instead.
{% endhint %}

***

## Related

* [Integration Options for AI Metering](/integration-options-for-ai-metering.md) — supported SDKs and providers
* [OpenTelemetry Integration](/opentelemetry-integration.md) — OTLP endpoints and GenAI attribute mapping

### Metering API Reference

| Modality    | API Reference                                                                   |
| ----------- | ------------------------------------------------------------------------------- |
| Completions | [Meter AI Completion](https://revenium.readme.io/reference/meter_ai_completion) |
| Images      | [Meter AI Images](https://revenium.readme.io/reference/meter_ai_images)         |
| Video       | [Meter AI Video](https://revenium.readme.io/reference/meter_ai_video)           |
| Audio       | [Meter AI Audio](https://revenium.readme.io/reference/meter_ai_audio)           |


***

# Agent Instructions: Querying This Documentation

If you need information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.revenium.io/for-ai-agents.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
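As a minimal sketch, the request URL can be constructed like this (the question text below is only an illustrative placeholder, and the `build_ask_url` helper is not part of any Revenium SDK):

```python
from urllib.parse import urlencode

# Base URL of the current documentation page, per the GET pattern above.
BASE_URL = "https://docs.revenium.io/for-ai-agents.md"

def build_ask_url(question: str) -> str:
    """Build the ?ask= query URL, URL-encoding the natural-language question."""
    return f"{BASE_URL}?{urlencode({'ask': question})}"

# Example: a specific, self-contained question.
url = build_ask_url("How do I meter AI completions?")
print(url)
```

Any HTTP client can then issue a GET request against the resulting URL; the key point is that the question must be URL-encoded before being passed as the `ask` parameter.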
