This integration works in the Node.js, Cloudflare Workers, and Vercel Edge Functions runtimes.

_Import name: `Sentry.openAIIntegration`_

The `openAIIntegration` adds instrumentation for the `openai` API by wrapping OpenAI client calls, capturing spans for LLM interactions with configurable recording of inputs and outputs.

<PlatformSection notSupported={["javascript.cloudflare", "javascript.nextjs", "javascript"]}>
It is enabled by default and will automatically capture spans for OpenAI API method calls. You can opt-in to capture inputs and outputs by setting `recordInputs` and `recordOutputs` in the integration config:
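A minimal sketch of enabling input and output recording via the integration config (assuming the standard `Sentry.init` setup from `@sentry/node`; the DSN value is a placeholder):

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "__PUBLIC_DSN__",
  tracesSampleRate: 1.0,
  integrations: [
    // Opt in to recording prompts and completions on LLM spans
    Sentry.openAIIntegration({
      recordInputs: true,
      recordOutputs: true,
    }),
  ],
});
```

With this config, subsequent OpenAI client calls are captured as spans without further changes to application code.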

</PlatformSection>

<PlatformSection supported={["javascript.nextjs"]}>
For Next.js applications, you need to manually instrument the OpenAI client using the `instrumentOpenAiClient` helper:

```javascript
import * as Sentry from "@sentry/nextjs";
import OpenAI from "openai";

const openai = new OpenAI();
const client = Sentry.instrumentOpenAiClient(openai, {
recordInputs: true,
recordOutputs: true,
});

// Use the wrapped client instead of the original openai instance
const response = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Hello!" }],
});
```

</PlatformSection>

<PlatformSection supported={["javascript"]}>
For browser applications, you need to manually instrument the OpenAI client using the `instrumentOpenAiClient` helper:

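A sketch of that manual instrumentation, mirroring the Next.js example (the `dangerouslyAllowBrowser` flag is an `openai` client option required for browser use and is shown only for illustration, since it exposes your API key to the client):

```javascript
import * as Sentry from "@sentry/browser";
import OpenAI from "openai";

// dangerouslyAllowBrowser is required by the openai package in browsers;
// in production, proxy requests through your own backend instead.
const openai = new OpenAI({ dangerouslyAllowBrowser: true });
const client = Sentry.instrumentOpenAiClient(openai, {
  recordInputs: true,
  recordOutputs: true,
});

// Use the wrapped client instead of the original openai instance
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
```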
Expand Down Expand Up @@ -138,31 +160,6 @@ By default this integration adds tracing support to OpenAI API method calls incl

The integration will automatically detect streaming vs non-streaming requests and handle them appropriately.
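As an illustration of the streaming case, a call with `stream: true` returns an async iterable of chunks rather than a single response; this sketch assumes an already-instrumented client:

```javascript
import OpenAI from "openai";

const client = new OpenAI();

// With stream: true the SDK returns an async iterable of chunks;
// the integration detects the streaming response and handles it appropriately.
const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```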


## Supported Versions

- `openai`: `>=4.0.0 <7`