Tool Calling
You can give your chatbot access to external tools. For example, you could add a tool that interfaces with an API and uses the data returned from the API in the chatbot's responses.
This is a powerful feature that allows you to extend the capabilities of your chatbot beyond retrieval augmented generation and LLM conversations.
Tool Calling Flow
When a user sends a message to a chatbot with tool calling implemented, the following happens:

1. The server sends the user message to the LLM, along with the definitions of the available tools.
2. The LLM decides whether to invoke a tool. If it does, it generates arguments that correspond to the tool's JSON schema.
3. The server executes the tool's call() method with the LLM-generated arguments.
4. The chatbot uses the tool call message and references returned by the tool to respond to the user.
Define a Tool
To make a tool, define an object that implements the Tool interface.

A tool must include a definition property, which is a function definition for the LLM to invoke. This must be a JSON schema that corresponds to the OpenAI function definition format. When the LLM invokes the tool, it generates arguments that correspond to this JSON schema.
You must also include the call() method, which accepts the LLM-generated arguments plus additional metadata, and returns a tool call message, references, and whether the user query should be rejected.
// weatherTool.ts
import { Tool } from "mongodb-chatbot-server";

export const weatherTool: Tool = {
  definition: {
    name: "getCurrentWeather",
    description: "Get the weather in location",
    parameters: {
      type: "object",
      properties: {
        location: {
          type: "string",
          description: "The city and state, e.g. San Francisco, CA",
        },
        unit: { type: "string", enum: ["c", "f"] },
      },
      required: ["location"],
    },
  },
  async call({ functionArgs: { location, unit } }) {
    // fetchWeatherFromApi is a helper (not shown here) that calls a weather API.
    const weatherReport = await fetchWeatherFromApi(location, unit);
    return {
      toolCallMessage: {
        role: "function",
        name: "getCurrentWeather",
        content: `The weather in ${location} is: ${weatherReport.content}`,
      },
      references: [{ title: `${location} weather`, url: weatherReport.url }],
    };
  },
};
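The example above assumes a fetchWeatherFromApi helper and does not show query rejection. The following is a minimal, self-contained sketch of what both might look like; the helper name, its canned response, and the exact return shape are illustrative assumptions, not part of the framework's API.

```typescript
// Illustrative shape of what a tool's call() method returns (see the
// Tool interface in mongodb-chatbot-server for the authoritative types).
interface WeatherReport {
  content: string;
  url: string;
}

interface ToolCallResult {
  toolCallMessage: { role: "function"; name: string; content: string };
  references?: { title: string; url: string }[];
  rejectUserQuery?: boolean;
}

// Hypothetical helper: a real implementation would make an HTTP request
// to a weather API. Here it returns canned data so the sketch is runnable.
async function fetchWeatherFromApi(
  location: string,
  unit: "c" | "f" = "c"
): Promise<WeatherReport> {
  const temperature = unit === "c" ? "18°C" : "64°F";
  return {
    content: `${temperature}, partly cloudy`,
    url: `https://example.com/weather?location=${encodeURIComponent(location)}`,
  };
}

// A call() implementation can also signal that the user query should be
// rejected by returning rejectUserQuery: true.
async function callWeatherTool(functionArgs: {
  location: string;
  unit?: "c" | "f";
}): Promise<ToolCallResult> {
  if (!functionArgs.location) {
    return {
      toolCallMessage: {
        role: "function",
        name: "getCurrentWeather",
        content: "Cannot look up the weather without a location.",
      },
      rejectUserQuery: true,
    };
  }
  const weatherReport = await fetchWeatherFromApi(
    functionArgs.location,
    functionArgs.unit
  );
  return {
    toolCallMessage: {
      role: "function",
      name: "getCurrentWeather",
      content: `The weather in ${functionArgs.location} is: ${weatherReport.content}`,
    },
    references: [
      { title: `${functionArgs.location} weather`, url: weatherReport.url },
    ],
  };
}
```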
Give Tools to the ChatLlm
To give the chatbot access to the tool, you must provide the ChatLlm with the tool. The makeOpenAiChatLlm() function, which you can use to instantiate a client for an OpenAI model, takes a tools argument. Provide an array of tools to the tools argument.
// weatherChatLlm.ts
import { OpenAIClient, AzureKeyCredential } from "@azure/openai";
import { makeOpenAiChatLlm, OpenAiChatMessage } from "mongodb-chatbot-server";
import { weatherTool } from "./weatherTool";

export const openAiClient = new OpenAIClient(
  OPENAI_ENDPOINT,
  new AzureKeyCredential(OPENAI_API_KEY)
);

export const weatherChatLlm = makeOpenAiChatLlm({
  openAiClient,
  deployment: OPENAI_CHAT_COMPLETION_DEPLOYMENT,
  openAiLmmConfigOptions: {
    temperature: 0,
    maxTokens: 500,
  },
  tools: [weatherTool],
});
Configure Server Tool Calling Behavior
To use a ChatLlm with tool calling in your server, add it to your configuration as you would any other ChatLlm.
import { AppConfig } from "mongodb-chatbot-server";
import { weatherChatLlm } from "./weatherChatLlm";

const config: AppConfig = {
  conversationsRouterConfig: {
    llm: weatherChatLlm,
    // ...other config
  },
  // ...other config
};
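With the configuration in place, you can start the server. The sketch below assumes the makeApp() helper exported by mongodb-chatbot-server and that your config object is exported from its module; check the framework's quick start for the exact API.

```typescript
// index.ts -- minimal server startup sketch.
// Assumes makeApp() from mongodb-chatbot-server and an exported config.
import { makeApp } from "mongodb-chatbot-server";
import { config } from "./config";

const PORT = process.env.PORT || 3000;

async function startServer() {
  const app = await makeApp(config);
  app.listen(PORT, () => {
    console.log(`Server listening on port ${PORT}`);
  });
}

startServer();
```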