Namespace: Llm
Interfaces
- CallToolResponse
- ChatLlm
- LlmAnswerQuestionParams
- LlmCallToolParams
- MakeLangchainChatLlmProps
- MakeOpenAiChatLlmParams
- Tool
- ToolCallParams
Type Aliases
OpenAIChatCompletionWithoutUsage
Ƭ OpenAIChatCompletionWithoutUsage: Omit<OpenAI.ChatCompletion, "usage">
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:79
OpenAiAwaitedResponse
Ƭ OpenAiAwaitedResponse: Partial<OpenAI.ChatCompletionMessage> & Pick<OpenAI.ChatCompletionMessage, "content" | "role">
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:87
OpenAiChatMessage
Ƭ OpenAiChatMessage: OpenAI.ChatCompletionMessageParam & { content: string | null; embedding?: number[]; role: OpenAiMessageRole }
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:15
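A minimal sketch of a value conforming to OpenAiChatMessage. The optional embedding field is shown holding a truncated, made-up vector for the message content; the import path and all field values are illustrative, not taken from this reference.

```ts
import type { OpenAiChatMessage } from "mongodb-rag-core";

// Hypothetical user message; the embedding values are placeholders.
const userMessage: OpenAiChatMessage = {
  role: "user",
  content: "How do I create a vector search index in MongoDB Atlas?",
  embedding: [0.12, -0.034, 0.87],
};
```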
OpenAiMessageRole
Ƭ OpenAiMessageRole: "system" | "assistant" | "user" | "function"
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:13
OpenAiStreamingResponse
Ƭ OpenAiStreamingResponse: AsyncIterable<Omit<OpenAI.ChatCompletionChunk, "model" | "object">>
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:84
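Because OpenAiStreamingResponse is an AsyncIterable of chat-completion chunks, a caller can drain it with for await...of. The helper below is a sketch under that assumption, not library code; the chunk shape follows OpenAI.ChatCompletionChunk, and the import path is illustrative.

```ts
import type { OpenAiStreamingResponse } from "mongodb-rag-core";

// Concatenate the streamed deltas into a single answer string.
async function collectStream(stream: OpenAiStreamingResponse): Promise<string> {
  let answer = "";
  for await (const chunk of stream) {
    answer += chunk.choices[0]?.delta?.content ?? "";
  }
  return answer;
}
```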
SystemPrompt
Ƭ SystemPrompt: OpenAiChatMessage & { content: string; role: "system" }
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:28
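Since SystemPrompt narrows OpenAiChatMessage to a literal "system" role with non-null content, a value is just an ordinary system message. The prompt text below is a placeholder, not the library's default.

```ts
import type { SystemPrompt } from "mongodb-rag-core";

// Illustrative system prompt; adjust the wording for your application.
const systemPrompt: SystemPrompt = {
  role: "system",
  content: "You are a helpful assistant that answers questions about MongoDB.",
};
```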
ToolCallDirective
Ƭ ToolCallDirective: OpenAI.ChatCompletionFunctionCallOption
Defined in
packages/mongodb-rag-core/src/llm/ChatLlm.ts:109
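ToolCallDirective resolves to OpenAI's ChatCompletionFunctionCallOption, i.e. an object naming the function the model should call. The tool name below is hypothetical.

```ts
import type { ToolCallDirective } from "mongodb-rag-core";

// Force the model to call a specific tool; "searchDocs" is a placeholder name.
const directive: ToolCallDirective = { name: "searchDocs" };
```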
Functions
makeLangchainChatLlm
▸ makeLangchainChatLlm(«destructured»): ChatLlm
Use any Langchain JS ChatModel to talk to an LLM.
Note: This ChatLlm does not currently support tool calling.
Parameters
Name | Type |
---|---|
«destructured» | MakeLangchainChatLlmProps |
Returns
ChatLlm
Defined in
packages/mongodb-rag-core/src/llm/LangchainChatLlm.ts:23
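A sketch of constructing a ChatLlm from a Langchain chat model. The chatModel prop name and the ChatOpenAI options are assumptions based on MakeLangchainChatLlmProps and @langchain/openai; import paths are illustrative.

```ts
import { ChatOpenAI } from "@langchain/openai";
import { makeLangchainChatLlm } from "mongodb-rag-core";

// Assumes MakeLangchainChatLlmProps accepts a Langchain chat model as `chatModel`.
const langchainChatLlm = makeLangchainChatLlm({
  chatModel: new ChatOpenAI({
    modelName: "gpt-4o-mini",
    temperature: 0,
  }),
});
```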
makeOpenAiChatLlm
▸ makeOpenAiChatLlm(«destructured»): Required<ChatLlm>
Construct the ChatLlm service using the OpenAI client.
Parameters
Name | Type |
---|---|
«destructured» | MakeOpenAiChatLlmParams |
Returns
Required<ChatLlm>
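For completeness, a sketch of constructing the OpenAI-backed ChatLlm. The field names on MakeOpenAiChatLlmParams used here (openAiClient, deployment, openAiLmmConfigOptions) are assumptions inferred from typical usage of this package; verify them against the MakeOpenAiChatLlmParams interface before relying on them.

```ts
import { OpenAI } from "openai";
import { makeOpenAiChatLlm } from "mongodb-rag-core";

// Field names below are assumptions; see MakeOpenAiChatLlmParams.
const openAiChatLlm = makeOpenAiChatLlm({
  openAiClient: new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
  deployment: "gpt-4o-mini",
  openAiLmmConfigOptions: {
    temperature: 0,
    max_tokens: 500,
  },
});
```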