OpenAI implementation of AIProvider. This provider integrates OpenAI's chat completions API with LaunchDarkly's tracking capabilities.

Hierarchy

  • AIProvider
    • OpenAIProvider

Constructors

Properties

_client: OpenAI
_modelName: string
_parameters: Record<string, unknown>
logger?: LDLogger

Methods

  • Invoke the OpenAI model with structured output support.

    Parameters

    • messages: LDMessage[]
    • responseStructure: Record<string, unknown>

    Returns Promise<StructuredResponse>
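
    The call above takes an LDMessage array plus a JSON-schema-style response structure. A minimal sketch of those two arguments (the method name `invokeStructuredModel`, the provider construction, and the `data` field on the result are assumptions, not confirmed by this reference):

    ```typescript
    // Hypothetical sketch; the invoke method name and provider construction
    // below are assumptions and may differ from the actual API.
    type LDMessage = { role: 'system' | 'user' | 'assistant'; content: string };

    const messages: LDMessage[] = [
      { role: 'system', content: 'Extract the city and country from the text.' },
      { role: 'user', content: 'The Eiffel Tower is in Paris, France.' },
    ];

    // A JSON-schema-style object passed as `responseStructure`.
    const responseStructure: Record<string, unknown> = {
      type: 'object',
      properties: {
        city: { type: 'string' },
        country: { type: 'string' },
      },
      required: ['city', 'country'],
    };

    // With a constructed provider (assumed shapes):
    // const provider = new OpenAIProvider(client, 'gpt-4o-mini', {});
    // const result = await provider.invokeStructuredModel(messages, responseStructure);
    ```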

  • Automatically patches the ESM openai module for OpenTelemetry tracing when a TracerProvider is active and @traceloop/instrumentation-openai is installed.

    OpenTelemetry instrumentations auto-patch CJS require() calls, but this provider loads openai via ESM import, which bypasses those hooks. This method bridges that gap by calling manuallyInstrument() on the ESM module.

    Parameters

    • Optional logger: LDLogger

    Returns Promise<void>
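
    The bridging pattern described above can be sketched as follows. The instrumentation interface here is a self-contained stand-in; in practice the object would be an OpenAIInstrumentation instance from @traceloop/instrumentation-openai, and the module would be the ESM-imported openai package:

    ```typescript
    // Stand-in types illustrating the manual-instrumentation bridge.
    // In real code: `new OpenAIInstrumentation().manuallyInstrument(openaiModule)`.
    type EsmModule = Record<string, unknown>;

    interface ManualInstrumentation {
      manuallyInstrument(mod: EsmModule): void;
    }

    function patchEsmModule(instr: ManualInstrumentation, mod: EsmModule): void {
      // CJS require() hooks never see an ESM `import`, so the loaded
      // module object must be handed to the instrumentation directly.
      instr.manuallyInstrument(mod);
    }
    ```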

  • Create AI metrics information from an OpenAI response. This method extracts token usage information and success status from OpenAI responses and returns a LaunchDarkly AIMetrics object.

    Parameters

    • openaiResponse: any

      The response from the OpenAI chat completions API

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Deprecated

    Use getAIMetricsFromResponse() instead.

  • Get AI metrics from an OpenAI response. This method extracts token usage information and success status from OpenAI responses and returns a LaunchDarkly AIMetrics object.

    Parameters

    • response: any

      The response from the OpenAI chat completions API

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Example

    const tracker = aiConfig.createTracker();
    const response = await tracker.trackMetricsOf(
      OpenAIProvider.getAIMetricsFromResponse,
      () => client.chat.completions.create(config)
    );

Generated using TypeDoc