Optional logger: LDLogger

Private _client
Private _model
Private _parameters
Protected Optional Readonly logger

Static Private _ensure
Automatically patches the ESM openai module for OpenTelemetry tracing when a TracerProvider is active and @traceloop/instrumentation-openai is installed.
OpenTelemetry instrumentations auto-patch CJS require() calls, but this provider loads openai via ESM import, which bypasses those hooks. This method bridges that gap by calling manuallyInstrument() on the ESM module.
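The gap can be sketched with a stand-in instrumentation class (FakeOpenAIInstrumentation and the module shape below are illustrative, not the real @traceloop/instrumentation-openai or openai APIs): an instrumentation that cannot rely on require() hooks must be handed the already-imported module and wrap it itself.

```typescript
// Illustrative shape of the openai module's chat-completions entry point.
type OpenAIModule = {
  chat: { completions: { create: (prompt: string) => string } };
};

// Stand-in for an OpenTelemetry instrumentation. Real instrumentations
// patch modules as they pass through the CJS require() hook; an ESM
// import bypasses that hook, so manuallyInstrument() patches in place.
class FakeOpenAIInstrumentation {
  public patched = false;

  manuallyInstrument(mod: OpenAIModule): void {
    const original = mod.chat.completions.create;
    mod.chat.completions.create = (prompt) => {
      this.patched = true; // a real instrumentation would start a span here
      return original(prompt);
    };
  }
}

// The ESM-imported module arrives unpatched and is bridged explicitly.
const openai: OpenAIModule = {
  chat: { completions: { create: (p) => `completion for: ${p}` } },
};
const instrumentation = new FakeOpenAIInstrumentation();
instrumentation.manuallyInstrument(openai);

const reply = openai.chat.completions.create('hello');
```

Callers see the same API before and after patching; only the wrapper's side effects (here, the `patched` flag; in the real instrumentation, tracing spans) differ.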
Static create
Static factory method to create an OpenAI AIProvider from an AI configuration.
Static createAIMetrics
Create AI metrics information from an OpenAI response. This method extracts token usage information and success status from OpenAI responses and returns a LaunchDarkly AIMetrics object.
The response from the OpenAI chat completions API.
Returns LDAIMetrics with success status and token usage.
Deprecated: use getAIMetricsFromResponse() instead.
Static getAIMetricsFromResponse
Get AI metrics from an OpenAI response. This method extracts token usage information and success status from OpenAI responses and returns a LaunchDarkly AIMetrics object.
The response from the OpenAI chat completions API.
Returns LDAIMetrics with success status and token usage.
const tracker = aiConfig.createTracker();
const response = await tracker.trackMetricsOf(
OpenAIProvider.getAIMetricsFromResponse,
() => client.chat.completions.create(config)
);
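The extraction step can be sketched as a pure mapping (OpenAI's snake_case usage fields are the real API; the metrics shape and the function name `extractAIMetrics` here are illustrative, not the SDK's actual LDAIMetrics definition):

```typescript
// Illustrative subset of an OpenAI chat completion response.
interface ChatCompletionLike {
  usage?: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
}

// Illustrative metrics shape: success flag plus optional token usage.
interface AIMetricsLike {
  success: boolean;
  usage?: { total: number; input: number; output: number };
}

function extractAIMetrics(response: ChatCompletionLike): AIMetricsLike {
  // A response that arrived without throwing counts as a success; token
  // counts are copied over only when the API reported usage.
  const metrics: AIMetricsLike = { success: true };
  if (response.usage) {
    metrics.usage = {
      total: response.usage.total_tokens,
      input: response.usage.prompt_tokens,
      output: response.usage.completion_tokens,
    };
  }
  return metrics;
}

const metrics = extractAIMetrics({
  usage: { prompt_tokens: 12, completion_tokens: 30, total_tokens: 42 },
});
```

Because the mapping is pure, trackMetricsOf can apply it to whatever the wrapped call resolves to, without the extractor needing access to the tracker.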
Generated using TypeDoc
OpenAI implementation of AIProvider. This provider integrates OpenAI's chat completions API with LaunchDarkly's tracking capabilities.