LangChain implementation of AIProvider. This provider integrates LangChain models with LaunchDarkly's tracking capabilities.

Hierarchy

  • AIProvider
    • LangChainProvider

Constructors

Properties

_llm: BaseChatModel<BaseChatModelCallOptions, AIMessageChunk>
logger?: LDLogger

Methods

  • Invoke the LangChain model with structured output support.

    Parameters

    • messages: LDMessage[]
    • responseStructure: Record<string, unknown>

    Returns Promise<StructuredResponse>
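
    Example

    A hedged sketch; the method name invokeStructuredModel and the fields on StructuredResponse are assumptions for illustration, not confirmed API.

    // provider: an instance of LangChainProvider (hypothetical variable)
    // Ask the model to answer in the JSON shape described by responseStructure
    const structured = await provider.invokeStructuredModel(
      [{ role: 'user', content: 'Extract the city from: "I just moved to Osaka."' }],
      {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    );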

  • Convert LaunchDarkly messages to LangChain messages. This helper method enables developers to work directly with LangChain message types while maintaining compatibility with LaunchDarkly's standardized message format.

    Parameters

    • messages: LDMessage[]

    Returns (HumanMessage | SystemMessage | AIMessage)[]
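
    Example

    A hedged sketch; the static method name convertMessagesToLangChain is an assumption based on the description above.

    // Convert LaunchDarkly-formatted messages into native LangChain message objects
    const lcMessages = LangChainProvider.convertMessagesToLangChain([
      { role: 'system', content: 'You are a concise assistant.' },
      { role: 'user', content: 'Summarize this changelog entry.' },
    ]);
    // The converted messages can be passed directly to a LangChain chat model
    const reply = await llm.invoke(lcMessages);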

  • Create AI metrics information from a LangChain provider response. This method extracts token usage information and success status from LangChain responses and returns a LaunchDarkly AIMetrics object.

    Parameters

    • langChainResponse: AIMessage

      The response from the LangChain model

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Deprecated

    Use getAIMetricsFromResponse() instead.

  • Create a LangChain model from an AI configuration. This public helper method enables developers to initialize their own LangChain models using LaunchDarkly AI configurations.

    Parameters

    • aiConfig: LDAIConfig

      The LaunchDarkly AI configuration

    Returns Promise<BaseChatModel<BaseChatModelCallOptions, AIMessageChunk>>

    A Promise that resolves to a configured LangChain BaseChatModel
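
    Example

    A hedged sketch; aiConfig is assumed to be an LDAIConfig obtained elsewhere from the LaunchDarkly AI SDK, and the method name follows the description above.

    // aiConfig: an LDAIConfig retrieved via the LaunchDarkly AI SDK (assumed)
    const llm = await LangChainProvider.createLangChainModel(aiConfig);
    const completion = await llm.invoke('Write a haiku about feature flags.');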

  • Get AI metrics from a LangChain provider response. This method extracts token usage information and success status from LangChain responses and returns a LaunchDarkly AIMetrics object.

    Parameters

    • response: AIMessage

      The response from the LangChain model

    Returns LDAIMetrics

    LDAIMetrics with success status and token usage

    Example

    // Use with tracker.trackMetricsOf for automatic tracking
    const response = await tracker.trackMetricsOf(
      LangChainProvider.getAIMetricsFromResponse,
      () => llm.invoke(messages)
    );

  • Map LaunchDarkly provider names to LangChain provider names. This method enables seamless integration between LaunchDarkly's standardized provider naming and LangChain's naming conventions.

    Parameters

    • ldProviderName: string

    Returns string
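
    Example

    A hedged sketch; the static method name mapProvider is a placeholder for the mapping method described above, and the exact mappings depend on the SDK version.

    // Translate a LaunchDarkly provider name into the identifier LangChain expects
    const langChainName = LangChainProvider.mapProvider('openai');
    // langChainName can then be used when initializing a LangChain chat model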
