Vercel AI implementation of AIProvider. This provider integrates the Vercel AI SDK with LaunchDarkly's tracking capabilities.

Hierarchy

  • AIProvider
    • VercelProvider

Constructors

Properties

_model: LanguageModel
logger?: LDLogger

Methods

  • Invoke the Vercel AI model with structured output support.

    Parameters

    • messages: LDMessage[]
    • responseStructure: Record<string, unknown>

    Returns Promise<StructuredResponse>
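
    Example

    A minimal sketch of a structured-output call. The method name invokeStructuredModel, the import paths, and the JSON-Schema-style responseStructure shape are assumptions for illustration, not confirmed by this reference.

    import type { LDMessage } from '@launchdarkly/server-sdk-ai'; // assumed import path
    import type { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name

    async function extractLocation(provider: VercelProvider) {
      const messages: LDMessage[] = [
        { role: 'system', content: 'Extract the city and country from the user message.' },
        { role: 'user', content: 'I live in Lyon, France.' },
      ];
      // responseStructure is a plain Record<string, unknown>; a JSON-Schema-like
      // object is used here purely for illustration.
      return provider.invokeStructuredModel(messages, {
        type: 'object',
        properties: { city: { type: 'string' }, country: { type: 'string' } },
      }); // resolves to a StructuredResponse
    }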

  • Static factory method to create a Vercel AIProvider from an AI configuration. This method auto-detects the provider and creates the model. Note: Messages from the AI config are not included in the provider - messages should be passed at invocation time via invokeModel().

    Parameters

    • aiConfig: LDAIConfig

      The LaunchDarkly AI configuration

    • Optional logger: LDLogger

      Optional logger

    Returns Promise<VercelProvider>

    A Promise that resolves to a configured VercelProvider
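
    Example

    A minimal sketch, assuming the factory is exposed as a static create() method (its name is not shown above) and using invokeModel() as described; the import paths are also assumptions.

    import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name
    import type { LDAIConfig } from '@launchdarkly/server-sdk-ai'; // assumed import path

    async function summarize(aiConfig: LDAIConfig) {
      // Assumed static factory name; the provider and model are auto-detected from aiConfig.
      const provider = await VercelProvider.create(aiConfig);
      // Messages are supplied at invocation time, not taken from the AI config.
      return provider.invokeModel([
        { role: 'user', content: 'Summarize the release notes in one sentence.' },
      ]);
    }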

  • Create AI metrics information from a Vercel AI response. This method extracts token usage information and success status from Vercel AI responses and returns a LaunchDarkly AIMetrics object. Supports both v4 and v5 field names for backward compatibility.

    Parameters

    • vercelResponse: TextResponse

      The response from generateText() or similar non-streaming operations

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Deprecated

    Use getAIMetricsFromResponse() instead.

  • Create a Vercel AI model from an AI configuration. This method auto-detects the provider and creates the model instance.

    Parameters

    • aiConfig: LDAIConfig

      The LaunchDarkly AI configuration

    Returns Promise<LanguageModel>

    A Promise that resolves to a configured Vercel AI model
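
    Example

    A sketch that pairs the created model with the Vercel AI SDK's generateText(); the static method name createVercelModel and the import paths are assumptions.

    import { generateText } from 'ai';
    import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name
    import type { LDAIConfig } from '@launchdarkly/server-sdk-ai'; // assumed import path

    async function haiku(aiConfig: LDAIConfig) {
      // Assumed method name; provider auto-detection happens inside.
      const model = await VercelProvider.createVercelModel(aiConfig);
      return generateText({ model, prompt: 'Write a haiku about feature flags.' });
    }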

  • Get AI metrics from a Vercel AI SDK text response. This method extracts token usage information and success status from Vercel AI responses and returns a LaunchDarkly AIMetrics object. Supports both v4 and v5 field names for backward compatibility.

    Parameters

    • response: TextResponse

      The response from generateText() or similar non-streaming operations

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Example

    const response = await aiConfig.tracker.trackMetricsOf(
      VercelProvider.getAIMetricsFromResponse,
      () => generateText(vercelConfig)
    );

  • Get AI metrics from a Vercel AI SDK streaming result.

    This method waits for the stream to complete, then extracts metrics using totalUsage (preferred for cumulative usage across all steps) or usage if totalUsage is unavailable.

    Parameters

    • stream: StreamResponse

      The stream result from streamText()

    Returns Promise<LDAIMetrics>

    A Promise that resolves to LDAIMetrics

    Example

    const stream = aiConfig.tracker.trackStreamMetricsOf(
      () => streamText(vercelConfig),
      VercelProvider.getAIMetricsFromStream
    );

  • Map LaunchDarkly model parameters to Vercel AI SDK parameters.

    Parameter mappings:

    • max_tokens → maxTokens
    • max_completion_tokens → maxOutputTokens
    • temperature → temperature
    • top_p → topP
    • top_k → topK
    • presence_penalty → presencePenalty
    • frequency_penalty → frequencyPenalty
    • stop → stopSequences
    • seed → seed

    Parameters

    • Optional parameters: { [index: string]: unknown }

      The LaunchDarkly model parameters to map

    Returns VercelAIModelParameters

    An object containing mapped Vercel AI SDK parameters
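
    Example

    An illustrative sketch of the mapping table above; the static method name mapParameters and the import path are assumptions.

    import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name

    // LaunchDarkly model parameters (for example, from the AI config's model settings).
    const ldParameters = { max_tokens: 256, temperature: 0.2, top_p: 0.9, stop: ['\n\n'] };

    // Assumed method name; per the table above this would yield something like
    // { maxTokens: 256, temperature: 0.2, topP: 0.9, stopSequences: ['\n\n'] }.
    const vercelParameters = VercelProvider.mapParameters(ldParameters);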

  • Map LaunchDarkly provider names to Vercel AI SDK provider names. This method enables seamless integration between LaunchDarkly's standardized provider naming and the Vercel AI SDK's naming conventions.

    Parameters

    • ldProviderName: string

    Returns string
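
    Example

    A brief sketch; the static method name mapProvider and the import path are assumptions, and the concrete mapping values depend on the implementation.

    import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name

    // Returns the provider identifier the Vercel AI SDK expects for a
    // LaunchDarkly-configured provider name such as 'openai'.
    const vercelProviderName = VercelProvider.mapProvider('openai');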

  • Map Vercel AI SDK usage data to LaunchDarkly token usage.

    Parameters

    • usageData: ModelUsageTokens

      Usage data from Vercel AI SDK

    Returns LDTokenUsage

    The mapped LDTokenUsage

  • Convert an AI configuration to Vercel AI SDK parameters. This static method allows converting an LDAIConfig to VercelAISDKConfig without requiring an instance of VercelProvider.

    Type Parameters

    • TMod

    Parameters

    Returns VercelAISDKConfig<TMod>

    A configuration directly usable in Vercel AI SDK generateText() and streamText()

    Throws

    if a Vercel AI SDK model cannot be determined from the given provider parameter
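
    Example

    A sketch assuming the method is named toVercelAISDKConfig and that a provider module (here @ai-sdk/openai's openai) is passed alongside the AI configuration; the parameter list is not shown above, so treat the call shape as an assumption.

    import { generateText } from 'ai';
    import { openai } from '@ai-sdk/openai';
    import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel'; // assumed package name
    import type { LDAIConfig } from '@launchdarkly/server-sdk-ai'; // assumed import path

    async function run(aiConfig: LDAIConfig) {
      // Assumed call shape (see the Throws note above regarding the provider parameter).
      const vercelConfig = VercelProvider.toVercelAISDKConfig(aiConfig, openai);
      return generateText(vercelConfig);
    }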

Generated using TypeDoc