Chain that combines documents by stuffing them into the prompt context.

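Example (a minimal sketch, not taken from the library docs: it assumes a 0.1.x-style package layout with langchain, @langchain/core, and @langchain/openai installed, and an async context such as an ES module with top-level await; the prompt text, model choice, and document contents are illustrative):

  import { OpenAI } from "@langchain/openai";
  import { LLMChain, StuffDocumentsChain } from "langchain/chains";
  import { PromptTemplate } from "@langchain/core/prompts";
  import { Document } from "@langchain/core/documents";

  // The inner LLMChain's prompt must expose the variable named by
  // documentVariableName ("context" by default).
  const prompt = PromptTemplate.fromTemplate(
    "Summarize the following text:\n\n{context}"
  );
  const llmChain = new LLMChain({ llm: new OpenAI({ temperature: 0 }), prompt });

  const chain = new StuffDocumentsChain({ llmChain });

  // Documents are passed under inputKey ("input_documents" by default)
  // and their page contents are stuffed into the prompt's {context} slot.
  const docs = [
    new Document({ pageContent: "LangChain provides document combining chains." }),
    new Document({ pageContent: "StuffDocumentsChain concatenates all documents." }),
  ];
  const result = await chain.invoke({ input_documents: docs });
  console.log(result.text); // output produced by the inner LLM chain
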
Hierarchy

Implements

Constructors

Properties

documentVariableName: string = "context"

Name of the variable in the LLM chain's prompt into which the documents are inserted.

inputKey: string = "input_documents"
llmChain: LLMChain<string, any>

LLM Wrapper to use after formatting documents

memory?: any

Accessors

Methods

  • apply

    Calls the chain on all inputs in the list.

    Deprecated: Use .batch() instead. Will be removed in 0.2.0.

    Parameters

    • inputs: ChainValues[]
    • Optional config: any[]

    Returns Promise<ChainValues[]>
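    As a replacement for the deprecated method above, a short sketch using .batch(), reusing chain and the Document import from the construction example at the top of the page (the batch inputs are illustrative):

      // Each array element is one complete chain input; results come back
      // in the same order, as Promise<ChainValues[]>.
      const batchResults = await chain.batch([
        { input_documents: [new Document({ pageContent: "First report." })] },
        { input_documents: [new Document({ pageContent: "Second report." })] },
      ]);
      console.log(batchResults.map((r) => r.text));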

  • call

    Runs the core logic of this chain and adds the result to the output if desired. Wraps _call and handles memory.

    Deprecated: Use .invoke() instead. Will be removed in 0.2.0.

    Parameters

    • values: any
    • Optional config: any
    • Optional tags: string[]

    Returns Promise<ChainValues>

  • invoke

    Invokes the chain with the provided input and returns the output.

    Parameters

    • input: ChainValues

      Input values for the chain run.

    • Optional options: any

    Returns Promise<ChainValues>

    Promise that resolves with the output of the chain run.
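    A short usage sketch of invoke with an optional run config, reusing chain and docs from the construction example at the top of the page (the tag value is illustrative; the options object is a standard runnable config, not specific to this chain):

      const output = await chain.invoke(
        { input_documents: docs },
        { tags: ["stuff-documents-demo"] } // attaches metadata to this run only
      );
      console.log(output.text);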

  • prepOutputs

    Parameters

    • inputs: Record<string, unknown>
    • outputs: Record<string, unknown>
    • returnOnlyOutputs: boolean = false

    Returns Promise<Record<string, unknown>>

  • run

    Deprecated: Use .invoke() instead. Will be removed in 0.2.0.

    Parameters

    • input: any
    • Optional config: any

    Returns Promise<string>

""