
Variable: default

default: object

Defined in: services/AIGateway.js:195

Type Declaration

callLLM()

callLLM: (options) => Promise<any>

The central gateway for all external Large Language Model (LLM) calls within ChainAlign. This function orchestrates prompt sanitization, interaction with the chosen LLM, and comprehensive logging for auditability and cost tracking.
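For illustration only, the flow described above (sanitize, call the model, log, return) could be sketched as follows. Every helper here (redactPrompt, dispatchToModel, logLLMCall) is a hypothetical stand-in, not the actual ChainAlign implementation.

```js
// Minimal sketch of the gateway flow; all helpers are hypothetical stubs.

// Stub: strip obvious sensitive tokens before the prompt leaves the system.
async function redactPrompt(prompt) {
  return prompt.replace(/\b[\w.+-]+@[\w-]+\.[\w.]+\b/g, '[REDACTED_EMAIL]');
}

// Stub: send the sanitized prompt to the selected model and return its text.
async function dispatchToModel(llmModel, prompt) {
  return `response from ${llmModel} for: ${prompt}`;
}

// Stub: persist an audit/cost record for the call.
async function logLLMCall(record) {
  console.log('LLM audit record', record);
}

async function callLLM({ prompt, queryContext, tenantId, user, llmModel = 'gemini-pro', jsonResponse = false }) {
  const safePrompt = await redactPrompt(prompt);                   // 1. sanitize the raw prompt
  const rawResponse = await dispatchToModel(llmModel, safePrompt); // 2. call the chosen LLM
  await logLLMCall({ tenantId, user, queryContext, llmModel });    // 3. log for audit and cost tracking
  return jsonResponse ? JSON.parse(rawResponse) : rawResponse;     // 4. optionally parse as JSON
}
```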

Parameters

options

Configuration options for the LLM call.

jsonResponse?

boolean = false

If true, the gateway attempts to parse the LLM's response as JSON; otherwise it returns the raw text.

llmModel?

string = 'gemini-pro'

The specific LLM model to use (e.g., 'gemini-pro', 'gemini-flash-latest', 'text-embedding-004'). Defaults to 'gemini-pro'.

prompt

string

The original, unredacted prompt text to be sent to the LLM.

queryContext

string

A descriptive string indicating the purpose/context of the LLM query (e.g., 'scenario_generation', 'chart_recommendation').

tenantId

string

The unique identifier of the tenant initiating the request.

user

any

An object containing user details (e.g., { id, email, role }) for logging.

Returns

Promise<any>

A promise that resolves to the LLM's response. The resolved type depends on jsonResponse and llmModel: a parsed object when jsonResponse is true, raw text otherwise, and number[] for embedding models such as 'text-embedding-004'.
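A caller-side sketch, using only the parameters documented above (the import path, tenant/user values, and prompts are illustrative assumptions, not taken from the ChainAlign codebase):

```js
import AIGateway from './services/AIGateway.js';

// Structured (JSON) response, e.g. for a chart recommendation.
const chartSpec = await AIGateway.callLLM({
  prompt: 'Recommend a chart type for monthly revenue by region.',
  queryContext: 'chart_recommendation',
  tenantId: 'tenant-123',
  user: { id: 'u-42', email: 'analyst@example.com', role: 'analyst' },
  llmModel: 'gemini-pro',
  jsonResponse: true, // parsed object instead of raw text
});

// Embedding request: resolves to number[] when an embedding model is selected.
const embedding = await AIGateway.callLLM({
  prompt: 'Quarterly supply chain risk summary',
  queryContext: 'scenario_generation',
  tenantId: 'tenant-123',
  user: { id: 'u-42', email: 'analyst@example.com', role: 'analyst' },
  llmModel: 'text-embedding-004',
});
```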

Throws

Throws an error if prompt redaction fails or the underlying LLM call fails.
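Since both redaction failures and upstream LLM errors propagate as ordinary exceptions, a caller would typically wrap the call; again a hedged sketch rather than prescribed usage:

```js
import AIGateway from './services/AIGateway.js';

try {
  const scenario = await AIGateway.callLLM({
    prompt: 'Generate a demand shock scenario for Q3.',
    queryContext: 'scenario_generation',
    tenantId: 'tenant-123',
    user: { id: 'u-42', email: 'planner@example.com', role: 'planner' },
    jsonResponse: true,
  });
  // ...use the parsed scenario object
} catch (err) {
  // Redaction failures and LLM call failures both surface here.
  console.error('LLM call failed:', err.message);
  throw err; // rethrow or fall back, depending on the calling service's policy
}
```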