interlocute.ai beta
Out-of-the-box agent

API / Contract Node

Ship a structured AI endpoint your services can call directly. Strict contracts, minimal latency, and governed behaviour.

What you get out of the box

Strict JSON-in / JSON-out contract enforcement

Minimal conversational overhead — no greetings or commentary

Designed for service-to-service and webhook integrations

Low-latency responses with predictable output formats

Disclosures disabled for clean machine-readable responses

Usage-metered and auditable per call

How setup works

01 Sign up and create a new node
02 Select the API / Contract Node profile
03 Define the output schema in the constitution
04 Generate a secret API key
05 Call the /chat endpoint from your service
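The final step can be sketched in Python with the standard library. The base URL, auth header scheme, and payload field name below are illustrative assumptions, not the documented Interlocute API; check your node's settings for the real values.

```python
import json
import urllib.request

API_KEY = "sk-example"                       # hypothetical secret API key
BASE_URL = "https://example.invalid/chat"    # hypothetical /chat endpoint URL

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble a POST to the node's /chat endpoint with a JSON body.

    The "message" field and Bearer auth scheme are assumptions for
    illustration; adapt them to your node's actual contract.
    """
    body = json.dumps({"message": prompt}).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("Classify this support ticket as billing, technical, or account")
# response = urllib.request.urlopen(req)  # uncomment with real credentials
```

Because the node returns structured JSON rather than conversational text, the calling service can parse the response body directly.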

Try these prompts

Classify this support ticket as billing, technical, or account
Extract the sender, date, and amount from this invoice text
Translate this JSON payload from English to French
Summarise this paragraph in exactly 3 bullet points as JSON

Frequently Asked Questions

API / Contract Node

What makes the API Agent different from a Chat Assistant?
The API Agent is configured for strict, deterministic output. Persona, memory, cross-thread awareness, and disclosures are all disabled by default. The constitution tells the model to respond only with well-formed JSON. This makes it ideal for programmatic callers that need predictable contracts.
Can other services call this agent directly?
Yes. The agent exposes the same /chat REST endpoint as any Interlocute node. Your backend, scripts, or CI/CD pipelines send a POST with a JSON payload and receive structured JSON back. Authentication uses a secret API key.
Is the output really deterministic?
The output is as deterministic as the underlying LLM allows. The agent's constitution, disabled capabilities, and strict prompt constraints push the model toward consistent schema-conforming responses. For hard guarantees, combine with output validation in your calling service.
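That output-validation step can be sketched with stdlib-only checks. The schema enforced here, a single "category" field drawn from three values, is a made-up example contract, not anything the platform prescribes.

```python
import json

# Illustrative contract: the reply must be exactly {"category": <one of these>}.
ALLOWED = {"billing", "technical", "account"}

def validate_reply(raw: str) -> dict:
    """Reject any model reply that is not well-formed JSON conforming
    to the example contract above; return the parsed dict on success."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"reply is not JSON: {exc}") from exc
    if not isinstance(data, dict) or set(data) != {"category"}:
        raise ValueError("reply has unexpected structure")
    if data["category"] not in ALLOWED:
        raise ValueError(f"unknown category: {data['category']!r}")
    return data

validate_reply('{"category": "billing"}')  # conforming reply passes
```

Rejected replies can then be retried or routed to a fallback, giving your service a hard guarantee on top of the model's soft one.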
Can I customise the response schema?
Yes. Edit the node's constitution to describe the expected output schema. The model will follow the schema as instructed. You can update the constitution at any time without redeployment.
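As one illustration of what a schema description in the constitution might look like, the wording and field names below are entirely hypothetical:

```text
Always respond with a single JSON object and nothing else.
The object must have exactly two keys:
  "category" — one of "billing", "technical", or "account"
  "confidence" — a number between 0 and 1
Do not include greetings, explanations, or markdown fences.
```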
How is usage billed for API calls?
Identically to other agents: a platform premium on LLM tokens plus computation charges. Every API call is logged in the usage ledger with cost attribution, making it easy to track spend per integration.

Ready to deploy?

Create your API / Contract Node in seconds and start building.