A callback handler that extends the BaseTracer class from the langchain.callbacks.tracers.base module. It logs the execution of runs and emits RunLog instances to a RunLogStream.
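
A minimal usage sketch, assuming the handler is exported as LogStreamCallbackHandler, accepts the fields listed under Properties as a constructor options object, can be attached through a runnable's callbacks option, and is async-iterable over the emitted log patches (import paths vary by LangChain version and are assumptions here):

```typescript
// Import paths are assumptions; adjust to your LangChain version.
import { LogStreamCallbackHandler } from "@langchain/core/tracers/log_stream";
import { RunnableLambda } from "@langchain/core/runnables";

// A trivial runnable standing in for a real chain.
const chain = RunnableLambda.from(async (text: string) => text.toUpperCase());

// autoClose: true closes the underlying stream once the root run ends.
const handler = new LogStreamCallbackHandler({ autoClose: true });

// Start the run with the handler attached...
const resultPromise = chain.invoke("hello", { callbacks: [handler] });

// ...and consume the emitted run log patches as they arrive.
for await (const patch of handler) {
  console.log(JSON.stringify(patch));
}

console.log(await resultPromise); // "HELLO"
```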

Hierarchy

Constructors

Properties

awaitHandlers: boolean = ...
ignoreAgent: boolean = false
ignoreChain: boolean = false
ignoreLLM: boolean = false
ignoreRetriever: boolean = false
name: string = "log_stream_tracer"
writer: WritableStreamDefaultWriter<any>
autoClose: boolean = true
runMap: Map<string, Run> = ...
transformStream: TransformStream<any, any>
excludeNames?: string[]
excludeTags?: string[]
excludeTypes?: string[]
includeNames?: string[]
includeTags?: string[]
includeTypes?: string[]
rootId?: string
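
The include*/exclude* properties above filter which runs are written to the stream, matching on run name, tag, or run type. Continuing the sketch above, a hedged example of setting them through the constructor (the option names are taken from the property list; that the constructor accepts them as a single options object is an assumption):

```typescript
const filteredHandler = new LogStreamCallbackHandler({
  // Only log LLM runs...
  includeTypes: ["llm"],
  // ...except those tagged "internal" (a hypothetical tag).
  excludeTags: ["internal"],
  // Close the stream once the root run finishes.
  autoClose: true,
});
```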

Methods

  • handleChainEnd: Called at the end of a Chain run, with the outputs and the run ID.

    Parameters

    • outputs: ChainValues
    • runId: string
    • Optional _parentRunId: string
    • Optional _tags: string[]
    • Optional kwargs: {
          inputs?: Record<string, unknown>;
      }
      • Optional inputs?: Record<string, unknown>

    Returns Promise<Run>

  • handleChainError: Called if a Chain run encounters an error.

    Parameters

    • error: unknown
    • runId: string
    • Optional _parentRunId: string
    • Optional _tags: string[]
    • Optional kwargs: {
          inputs?: Record<string, unknown>;
      }
      • Optional inputs?: Record<string, unknown>

    Returns Promise<Run>

  • handleChainStart: Called at the start of a Chain run, with the chain name and inputs and the run ID.

    Parameters

    • chain: Serialized
    • inputs: ChainValues
    • runId: string
    • Optional parentRunId: string
    • Optional tags: string[]
    • Optional metadata: KVMap
    • Optional runType: string
    • Optional name: string

    Returns Promise<Run>

  • handleChatModelStart: Called at the start of a Chat Model run, with the prompt(s) and the run ID.

    Parameters

    • llm: Serialized
    • messages: BaseMessage[][]
    • runId: string
    • Optional parentRunId: string
    • Optional extraParams: KVMap
    • Optional tags: string[]
    • Optional metadata: KVMap
    • Optional name: string

    Returns Promise<Run>

  • handleLLMNewToken: Called when an LLM/ChatModel in streaming mode produces a new token.

    Parameters

    • token: string
    • idx: NewTokenIndices

      idx.prompt is the index of the prompt that produced the token (if there are multiple prompts); idx.completion is the index of the completion that produced the token (if multiple completions per prompt are requested).

    • runId: string
    • Optional _parentRunId: string
    • Optional _tags: string[]
    • Optional fields: HandleLLMNewTokenCallbackFields

    Returns Promise<Run>
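
For example, if two completions are requested per prompt, a token from the second completion of the first prompt arrives with idx shaped as below (a sketch; NewTokenIndices is assumed to carry the numeric prompt and completion fields described above):

```typescript
// First prompt (index 0), second completion (index 1).
const idx = { prompt: 0, completion: 1 };
```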

  • handleLLMStart: Called at the start of an LLM or Chat Model run, with the prompt(s) and the run ID.

    Parameters

    • llm: Serialized
    • prompts: string[]
    • runId: string
    • Optional parentRunId: string
    • Optional extraParams: KVMap
    • Optional tags: string[]
    • Optional metadata: KVMap
    • Optional name: string

    Returns Promise<Run>

  • handleToolStart: Called at the start of a Tool run, with the tool name and input and the run ID.

    Parameters

    • tool: Serialized
    • input: string
    • runId: string
    • Optional parentRunId: string
    • Optional tags: string[]
    • Optional metadata: KVMap
    • Optional name: string

    Returns Promise<Run>

  • tapOutputIterable

    Type Parameters

    • T

    Parameters

    • runId: string
    • output: AsyncGenerator<T, any, unknown>

    Returns AsyncGenerator<T, any, unknown>
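
This last method taps a run's output iterator: given a runId and an AsyncGenerator of output chunks, it returns a generator that yields the same chunks so the handler can observe them for that run while the caller still consumes them. A minimal, generic sketch of that pattern (not the library's actual implementation; the runId bookkeeping is omitted):

```typescript
// Generic "tap" over an async generator: observe each chunk, then pass it through.
async function* tap<T>(
  source: AsyncGenerator<T>,
  onChunk: (chunk: T) => void
): AsyncGenerator<T> {
  for await (const chunk of source) {
    onChunk(chunk); // e.g. append the chunk to the run's streamed output log
    yield chunk; // the caller still receives every chunk unchanged
  }
}
```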

Generated using TypeDoc