Tool Use & Function Calling

Give your AI nodes the ability to take action — search the web, query databases, call APIs, and process data — all through governed, auditable function calling.

What is tool use?

Tool use (also called function calling) allows an AI node to invoke external capabilities during a conversation. Instead of only generating text, the node can search the web, look up data, perform calculations, or call your own APIs — then incorporate the results into its response.

Why it matters

Without tools, an LLM is limited to what it knows from training data. Tools bridge the gap between language understanding and real-world action. A node with tool access can answer questions about live data, automate workflows, and interact with your existing systems.

How Interlocute helps

Interlocute provides pre-configured tools out of the box — web search, data processing, and more. You can also define custom tool schemas that map to your own API endpoints. Every tool invocation is governed by execution policies: you control which tools a node can use, how often, and with what parameters.
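As an illustration, a custom tool schema could look something like the following. This is a hypothetical sketch: the field names (`name`, `description`, `parameters`, `endpoint`) are modeled on common function-calling conventions, not on Interlocute's documented format.

```python
# Hypothetical custom tool schema. The exact fields Interlocute expects
# are assumptions here; "parameters" uses JSON Schema, as is common in
# function-calling APIs.
order_lookup_tool = {
    "name": "lookup_order",
    "description": "Fetch the current status of a customer order by ID.",
    "parameters": {
        "type": "object",
        "properties": {
            "order_id": {"type": "string", "description": "The order ID."},
        },
        "required": ["order_id"],
    },
    # Your own API endpoint that the runtime would call (hypothetical URL).
    "endpoint": "https://api.example.com/orders/status",
}
```

The schema's `description` fields matter: the LLM uses them to decide when the tool applies and how to fill in arguments.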

Governed execution

Every tool call is logged, metered, and subject to your node's governance policies. You can set rate limits, restrict tool access by API key scope, and audit every invocation. This makes tool use safe for production deployments where accountability matters.
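The governance ideas above (an allowlist, a rate limit, an audit trail) can be sketched in a few lines. This is an illustrative toy, not Interlocute's runtime; the class and field names are invented for the example.

```python
import time
from collections import defaultdict, deque

# Toy model of governed tool execution: allowlist + per-minute rate
# limit + audit log. All names here are hypothetical.
class ToolPolicy:
    def __init__(self, allowed_tools, max_calls_per_minute):
        self.allowed = set(allowed_tools)
        self.limit = max_calls_per_minute
        self.calls = defaultdict(deque)   # tool name -> recent call timestamps
        self.audit_log = []               # every invocation attempt, allowed or not

    def authorize(self, tool_name, params):
        now = time.monotonic()
        window = self.calls[tool_name]
        while window and now - window[0] > 60:
            window.popleft()              # drop calls older than the 60s window
        ok = tool_name in self.allowed and len(window) < self.limit
        if ok:
            window.append(now)
        self.audit_log.append({"tool": tool_name, "params": params, "allowed": ok})
        return ok
```

A call to a tool outside the allowlist, or past the rate limit, is denied but still recorded, which is what makes the trail auditable.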

Frequently Asked Questions

What is tool use in AI and how does Interlocute support it?
Tool use, also known as function calling, lets an AI node invoke external tools or APIs during a conversation. Interlocute supports tool use natively — you configure which tools a node can access, and the runtime handles invocation, result injection, and logging automatically.
What pre-configured tools does Interlocute provide?
Interlocute includes pre-configured tools for web search and data processing. You can also define custom tool schemas that point to your own API endpoints. Tools are registered per node, so different nodes can have different capabilities.
How do I add a custom tool to my node?
You define a tool schema (name, description, parameters) and associate it with an API endpoint. When the LLM decides to call the tool, Interlocute invokes your endpoint with the specified parameters and returns the result to the LLM for incorporation into the response.
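The round trip described in this answer can be sketched as follows, with the network hop to your endpoint replaced by a local handler so the example runs standalone. Every name here is hypothetical; in a real deployment the registry entry would point at your HTTP API.

```python
# Stand-in for your API endpoint (in production this would be an HTTP call).
def get_weather(city: str) -> dict:
    return {"city": city, "forecast": "sunny", "temp_c": 21}

# Tools registered for this node, keyed by schema name.
TOOL_REGISTRY = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> dict:
    """Invoke the named tool with the arguments the LLM specified."""
    handler = TOOL_REGISTRY[tool_call["name"]]
    result = handler(**tool_call["arguments"])
    # The result is handed back to the LLM as a tool message for
    # incorporation into the final response.
    return {"role": "tool", "name": tool_call["name"], "content": result}

# The LLM decides to call the tool and emits structured arguments:
llm_tool_call = {"name": "get_weather", "arguments": {"city": "Oslo"}}
tool_message = dispatch(llm_tool_call)
```

The key point is the division of labor: the LLM chooses the tool and fills in arguments; the runtime performs the invocation and feeds the result back.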
Is tool use governed and auditable?
Yes. Every tool invocation is logged with the tool name, parameters, result, and latency. You can set execution policies to restrict which tools a node can use, enforce rate limits, and require specific API key scopes for tool access.
Can I restrict which tools a node can use?
Yes. Tool access is configured per node and can be scoped by API key. You can allow all tools, restrict to a specific set, or disable tool use entirely. Changes take effect immediately without redeployment.
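Per-node restriction might be expressed as configuration along these lines. The structure below is an assumption for illustration, not Interlocute's documented config format.

```python
# Hypothetical per-node tool configuration; all field names are
# assumptions. "mode" captures the three options described above:
# allow all tools, restrict to a set, or disable tool use entirely.
node_config = {
    "node_id": "support-assistant",
    "tools": {
        "mode": "allowlist",                  # "all" | "allowlist" | "disabled"
        "allowed": ["web_search", "lookup_order"],
    },
    # Only API keys carrying this scope may trigger tool calls (hypothetical).
    "api_key_scopes": ["tools:invoke"],
}
```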
How does tool use interact with streaming responses?
Tool calls are executed inline during response generation. If the LLM decides to call a tool mid-response, the tool is invoked, and the result is incorporated into the streamed output. The client sees a seamless response without needing to handle tool orchestration.
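The inline behavior can be pictured with a generator that yields text chunks and invokes a tool mid-stream. This is a conceptual sketch of what happens server-side, with an invented stand-in for the search tool; the client never sees the pause.

```python
# Stand-in for a real search tool (hypothetical).
def search_tool(query: str) -> str:
    return "42 results"

def stream_response(question: str):
    """Yield response chunks, calling a tool partway through generation."""
    yield "Let me check. "
    result = search_tool(question)     # tool invoked mid-stream
    yield f"I found {result} "         # result folded into later chunks
    yield "for your query."

chunks = list(stream_response("interlocute tool use"))
answer = "".join(chunks)
```

From the client's perspective, `answer` arrives as one continuous stream; the tool orchestration is invisible.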
How is tool use billed?
Tool invocations are metered as computation tokens. The cost of the tool call itself (e.g., a web search) is included in your node's usage. External API costs for custom tools are your responsibility, but Interlocute tracks and logs every invocation for visibility.

Ready to build with Tool Use & Function Calling?

Deploy your node in seconds and start using Tool Use & Function Calling today.