Interface for performing AI operations using LaunchDarkly.

Hierarchy

  • LDAIClient

Methods

  • agent(key, context, defaultValue, variables?): Promise<LDAIAgent>

    Retrieves and processes a single AI Config agent based on the provided key, LaunchDarkly context, and variables. This includes the model configuration and the customized instructions.

    Parameters

    • key: string

      The key of the AI Config agent.

    • context: LDContext

      The LaunchDarkly context object that contains relevant information about the current environment, user, or session. This context may influence how the configuration is processed or personalized.

    • defaultValue: LDAIAgentDefaults

      A fallback value containing model configuration and instructions.

    • Optional variables: Record<string, unknown>

      A map of key-value pairs representing dynamic variables to be injected into the instructions. The keys correspond to placeholders within the template, and the values are the corresponding replacements.

    Returns Promise<LDAIAgent>

    An AI agent with customized instructions and a tracker. If the configuration cannot be accessed from LaunchDarkly, then the return value will include information from the defaultValue. The returned tracker can be used to track AI operation metrics (latency, token usage, etc.).

    Example

    const key = "research_agent";
    const context = {...};
    const variables = { topic: 'climate change' };
    const agent = await client.agent(key, context, {
    enabled: true,
    instructions: 'You are a research assistant.',
    }, variables);

    const researchResult = agent.instructions; // Interpolated instructions
    agent.tracker.trackSuccess();
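
    For illustration only, here is one way the agent returned above might be wired into an actual model call, continuing the example. The callModel helper is hypothetical (declared only for this sketch) and is not part of the LaunchDarkly SDK; the enabled check assumes the returned agent exposes the same enabled flag used in the defaults above.

    // Hypothetical LLM helper, e.g. a thin wrapper around your provider's SDK.
    declare function callModel(systemPrompt: string, userPrompt: string): Promise<string>;

    async function runResearchAgent(userQuestion: string): Promise<string | undefined> {
      const agent = await client.agent(
        'research_agent',
        context,
        { enabled: true, instructions: 'You are a research assistant.' },
        { topic: 'climate change' },
      );

      // Skip the model call entirely when the AI Config is disabled.
      if (!agent.enabled) {
        return undefined;
      }

      // The interpolated instructions become the system prompt.
      const answer = await callModel(agent.instructions, userQuestion);
      agent.tracker.trackSuccess();
      return answer;
    }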
  • agents<T>(agentConfigs, context): Promise<Record<T[number]["key"], LDAIAgent>>

    Retrieves and processes multiple AI Config agents based on the provided agent configurations and LaunchDarkly context. This includes the model configuration and the customized instructions.

    Type Parameters

    • T

      The tuple type of the agent configurations, inferred from agentConfigs. Declare the array as const (as in the example below) so the literal agent keys are preserved in the return type.

    Parameters

    • agentConfigs: T

      An array of agent configurations, each containing the agent key, default configuration, and variables for instructions interpolation.

    • context: LDContext

      The LaunchDarkly context object that contains relevant information about the current environment, user, or session. This context may influence how the configuration is processed or personalized.

    Returns Promise<Record<T[number]["key"], LDAIAgent>>

    A map of agent keys to their respective AI agents with customized instructions and tracker. If a configuration cannot be accessed from LaunchDarkly, then the return value will include information from the respective defaultValue. The returned tracker can be used to track AI operation metrics (latency, token usage, etc.).

    Example

    const agentConfigs = [
      {
        key: 'research_agent',
        defaultValue: { enabled: true, instructions: 'You are a research assistant.' },
        variables: { topic: 'climate change' }
      },
      {
        key: 'writing_agent',
        defaultValue: { enabled: true, instructions: 'You are a writing assistant.' },
        variables: { style: 'academic' }
      }
    ] as const;
    const context = {...};

    const agents = await client.agents(agentConfigs, context);
    const researchResult = agents["research_agent"].instructions; // Interpolated instructions
    agents["research_agent"].tracker.trackSuccess();
  • config(key, context, defaultValue, variables?): Promise<LDAIConfig>

    Retrieves and processes an AI Config based on the provided key, LaunchDarkly context, and variables. This includes the model configuration and the customized messages.

    Parameters

    • key: string

      The key of the AI Config.

    • context: LDContext

      The LaunchDarkly context object that contains relevant information about the current environment, user, or session. This context may influence how the configuration is processed or personalized.

    • defaultValue: LDAIDefaults

      A fallback value containing model configuration and messages. This will be used if the configuration is not available from LaunchDarkly.

    • Optional variables: Record<string, unknown>

      A map of key-value pairs representing dynamic variables to be injected into the message content. The keys correspond to placeholders within the template, and the values are the corresponding replacements.

    Returns Promise<LDAIConfig>

    The AI config, customized messages, and a tracker. If the configuration cannot be accessed from LaunchDarkly, then the return value will include information from the defaultValue. The returned tracker can be used to track AI operation metrics (latency, token usage, etc.).

    Example

    const key = "welcome_prompt";
    const context = {...};
    const variables = {username: 'john'};
    const defaultValue = {
    enabled: false,
    };

    const result = config(key, context, defaultValue, variables);
    // Output:
    {
    enabled: true,
    config: {
    modelId: "gpt-4o",
    temperature: 0.2,
    maxTokens: 4096,
    userDefinedKey: "myValue",
    },
    messages: [
    {
    role: "system",
    content: "You are an amazing GPT."
    },
    {
    role: "user",
    content: "Explain how you're an amazing GPT."
    }
    ],
    tracker: ...
    }
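
    As a rough sketch, the resolved value above could feed a completion call directly. The openai import and chat.completions.create call come from the official openai package rather than the LaunchDarkly SDK, and the field names (config.modelId, config.temperature, config.maxTokens, messages) simply follow the example output shown; depending on your SDK version and compiler strictness you may need to narrow or cast them.

    import OpenAI from 'openai';

    const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

    const aiConfig = await client.config(key, context, defaultValue, variables);

    if (aiConfig.enabled) {
      const completion = await openai.chat.completions.create({
        model: aiConfig.config.modelId,        // "gpt-4o" in the output above
        temperature: aiConfig.config.temperature,
        max_tokens: aiConfig.config.maxTokens,
        messages: aiConfig.messages,           // role/content pairs as shown above
      });
      aiConfig.tracker.trackSuccess();
      console.log(completion.choices[0].message.content);
    }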
  • initChat(key, context, defaultValue, variables?, defaultAiProvider?): Promise<undefined | TrackedChat>

    Initializes and returns a new TrackedChat instance for chat interactions. This method serves as the primary entry point for creating TrackedChat instances from configuration.

    Parameters

    • key: string

      The key identifying the AI chat configuration to use.

    • context: LDContext

      The standard LDContext used when evaluating flags.

    • defaultValue: LDAIDefaults

      A default value representing a standard AI chat config result.

    • Optional variables: Record<string, unknown>

      Dictionary of values for instruction interpolation.

    • Optional defaultAiProvider: "openai" | "langchain" | "vercel"

    Returns Promise<undefined | TrackedChat>

    A promise that resolves to the TrackedChat instance, or undefined if the configuration is disabled.

    Example

    const key = "customer_support_chat";
    const context = {...};
    const defaultValue = {
    config: {
    enabled: false,
    model: { name: "gpt-4" },
    messages: [
    { role: "system", content: "You are a helpful customer support agent." }
    ]
    }
    };
    const variables = { customerName: 'John' };

    const chat = await client.initChat(key, context, defaultValue, variables);
    if (chat) {
    const response = await chat.invoke("I need help with my order");
    console.log(response.message.content);

    // Access configuration and tracker if needed
    console.log('Model:', chat.getConfig().model?.name);
    chat.getTracker().trackSuccess();
    }
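
    invoke can be called repeatedly on the same TrackedChat instance. The sketch below assumes, though the text above does not state it explicitly, that the instance retains the conversation history between calls, so a follow-up message continues the same conversation:

    const supportChat = await client.initChat(key, context, defaultValue, variables);

    if (!supportChat) {
      // Configuration is disabled for this context; fall back to a non-AI flow.
      console.log('AI chat is unavailable.');
    } else {
      const first = await supportChat.invoke('I need help with my order');
      console.log(first.message.content);

      // Assumption: TrackedChat keeps earlier turns, so this follow-up is
      // answered in the context of the first question.
      const followUp = await supportChat.invoke("It still hasn't shipped.");
      console.log(followUp.message.content);
    }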
