Tools, or tool calling, broadly refers to the ability of an LLM to ask the application running it to execute code on its behalf, typically to retrieve useful information. OpenAI calls it function calling; we’ll take their implementation as the reference. Anthropic introduced the Model Context Protocol (MCP), a more encompassing approach that lets LLMs use tools, among other things.
What’s a tool
Here’s the schema for defining a tool using OpenAI.
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                }
            },
            "required": ["location"],
            "additionalProperties": false
        },
        "strict": true
    }
}
It is a JSON object that describes a function. The function could be implemented in any language. In this case, it would be equivalent to this Python code:
def get_weather(location: str):
    """Get current temperature for a given location

    :param location: City and country e.g., `Bogotá, Colombia`
    :return: Temperature at location
    """
    # code that gets the weather
    weather = ...
    return weather
NOTE
The value of weather must be converted to a string before it is sent back to the LLM. Annotating the return type of the function can nonetheless help make this explicit.
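To make the NOTE concrete, here is one way the function could handle that conversion; the -> str annotation and the fetch_weather helper are assumptions for this sketch, not part of the OpenAI spec:

def get_weather(location: str) -> str:
    """Get current temperature for a given location

    :param location: City and country e.g., `Bogotá, Colombia`
    :return: Temperature at location, formatted as text for the LLM
    """
    weather = fetch_weather(location)  # hypothetical helper returning a float
    # The LLM only consumes text, so serialize the result explicitly
    return f"{weather:.1f} °C"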
Model-agnostic tools
Choosing a model for tool calling
Practical tips
Read Anthropic’s guide
Anthropic gives practical advice on how to define tools (which I won’t repeat here). The guide includes practical examples and describes how to write your application to handle LLM tool requests.
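To make the shape of that request-handling loop concrete, here is a minimal sketch using the OpenAI Python SDK with the get_weather tool from earlier on this page; the model name and the single-entry function registry are assumptions for illustration:

import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and country e.g. Bogotá, Colombia"}
            },
            "required": ["location"],
            "additionalProperties": False,
        },
        "strict": True,
    },
}]
functions = {"get_weather": get_weather}  # get_weather as defined earlier

messages = [{"role": "user", "content": "What's the weather in Bogotá, Colombia?"}]
response = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
message = response.choices[0].message

if message.tool_calls:  # the model requested one or more tool calls
    messages.append(message)  # keep the assistant turn in the history
    for call in message.tool_calls:
        args = json.loads(call.function.arguments)  # arguments arrive as a JSON string
        result = functions[call.function.name](**args)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": str(result),  # tool results go back as text
        })
    # Let the model continue now that it has the tool results
    response = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)

print(response.choices[0].message.content)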
Don’t use JSON
The tool definition is converted to text before being passed to the LLM. LLMs are good at reading and writing JSON, but characters such as ", {, }, [, and ] are expensive to tokenize and can become costly (read more).
BAML is a templating language that attacks this problem by defining tools as TypeScript-like type definitions, something LLMs are also very familiar with. This can dramatically cut token usage. More importantly, shorter tool definitions could allow LLMs to handle more tools with better accuracy.
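As a rough illustration of the difference, here is one way to compare token counts of the two formats; the TypeScript-style string is hand-written for this sketch, and tiktoken’s cl100k_base encoding is an assumption, not what any particular provider uses internally:

import json
import tiktoken  # pip install tiktoken

enc = tiktoken.get_encoding("cl100k_base")

# The JSON Schema definition from earlier on this page
json_tool = json.dumps({
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and country e.g. Bogotá, Colombia"}
            },
            "required": ["location"],
            "additionalProperties": False,
        },
        "strict": True,
    },
})

# A TypeScript-like rendering of the same tool, in the spirit of BAML
ts_tool = """\
// Get current temperature for a given location.
function get_weather(
  location: string  // City and country e.g. Bogotá, Colombia
): string
"""

print(len(enc.encode(json_tool)))  # token count of the JSON definition
print(len(enc.encode(ts_tool)))    # typically far fewer tokens than the JSON form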
Dynamically instantiate tools
There’s a tradeoff between defining flexible tools and tools that are easy for LLMs to use. Earlier on this page, we had get_weather(), which took a free-form location argument. Instead of shipping one fixed definition, you can instantiate the tool definition at runtime, for example constraining location to values your application already knows are valid.
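A minimal sketch of what that could look like; the factory function and the enum constraint are assumptions for illustration, not a pattern prescribed by OpenAI:

def make_weather_tool(allowed_locations: list[str]) -> dict:
    """Build a get_weather tool definition at runtime.

    Restricting `location` to an enum makes the tool easier for the
    LLM to use correctly, at the cost of flexibility.
    """
    return {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current temperature for a given location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "enum": allowed_locations,  # instantiated per request
                        "description": "One of the user's saved locations",
                    }
                },
                "required": ["location"],
                "additionalProperties": False,
            },
            "strict": True,
        },
    }

# Instantiate the definition from application state, e.g. the user's profile
tools = [make_weather_tool(["Bogotá, Colombia", "Lima, Peru"])]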
WARNING
Dynamic instantiation means the tool definition is variable. This affects the token count and can change how your application behaves.