Building AI agents is the new gold rush. But every developer knows the biggest bottleneck: getting the AI to actually talk to your data. Today, travel giant Agoda is tackling this problem head-on. They have officially released APIAgent, an open-source tool designed to turn any REST or GraphQL API into a Model Context Protocol (MCP) server with zero code and zero deployments.
The Problem: The 'Integration Tax'
Until recently, if you wanted your AI agent to check flight prices or look up a database, you had to write a custom tool. When Anthropic released the Model Context Protocol (MCP), it created a standard way for Large Language Models (LLMs) to connect to external tools.
However, even with MCP, the workflow is tedious. A developer must:
- Write a new MCP server in Python or TypeScript.
- Define each tool and its parameters manually.
- Deploy and maintain that server.
- Update the code every time the underlying API changes.
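To make the per-tool burden concrete, here is a sketch of the kind of hand-written tool definition an MCP server must carry for every endpoint. The tool name and fields are invented for illustration; they are not Agoda's actual schema.

```python
# One hand-written tool definition per API endpoint: this is the
# boilerplate APIAgent aims to eliminate. (Illustrative example only.)
search_flights_tool = {
    "name": "search_flights",
    "description": "Search flight prices between two airports.",
    "inputSchema": {  # JSON Schema describing the tool's parameters
        "type": "object",
        "properties": {
            "origin": {"type": "string", "description": "IATA code, e.g. BKK"},
            "destination": {"type": "string", "description": "IATA code, e.g. SIN"},
            "date": {"type": "string", "format": "date"},
        },
        "required": ["origin", "destination", "date"],
    },
}
```

Every change to the underlying API means editing definitions like this and redeploying the server that hosts them.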
Agoda's team calls this the 'integration tax.' For a company with thousands of internal APIs, writing thousands of MCP servers is not realistic. APIAgent is their answer to this scaling problem.
What Is APIAgent?
APIAgent is a universal MCP server. Instead of writing custom logic for every API, you use APIAgent as a proxy. It sits between your LLM (such as Claude or GPT-4) and your existing APIs.
The tool is built on a specific technical stack:
- FastMCP: Powers the MCP server layer.
- OpenAI Agents SDK: Handles the language model orchestration.
- DuckDB: An in-process SQL engine used for SQL post-processing.
The 'magic' lies in its ability to understand API documentation. You provide a definition of your API (an OpenAPI specification for REST, or a schema for GraphQL) and APIAgent handles the rest.
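In practice, the integration reduces to configuration. A hypothetical config might look like the following; the field names and URLs here are illustrative assumptions, so consult the APIAgent repository for the actual format:

```yaml
apis:
  - name: hotel-search
    type: rest
    spec: https://api.example.com/openapi.json   # OpenAPI spec for a REST API
  - name: booking-graph
    type: graphql
    endpoint: https://api.example.com/graphql    # schema found via introspection
```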
How It Works
The architecture is straightforward. APIAgent acts as a gateway. When a user asks an AI agent a question, the flow looks like this:
- The Request: The user asks, 'Show me the top 10 hotels in Bangkok with the most reviews.'
- Schema Introspection: APIAgent automatically inspects the API schema to understand the available endpoints and fields.
- The SQL Layer (DuckDB): This is the secret sauce. If the API returns 10,000 unsorted rows, APIAgent uses DuckDB to filter, sort, and aggregate that data locally via SQL before sending a concise result back to the LLM.
- The Response: The JSON data travels back through APIAgent, which formats it for the AI to read.
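The SQL step can be pictured with a tiny self-contained example. Here the standard library's sqlite3 stands in for DuckDB purely so the sketch runs anywhere; the idea of running SQL locally over the raw API payload is the same, and the table and column names are invented:

```python
import sqlite3

# Simulate a large, unsorted API response of (hotel, review_count) rows.
rows = [("Hotel A", 120), ("Hotel B", 980), ("Hotel C", 455), ("Hotel D", 2310)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hotels (name TEXT, reviews INTEGER)")
conn.executemany("INSERT INTO hotels VALUES (?, ?)", rows)

# Sort and limit locally, so only a concise result reaches the LLM.
top = conn.execute(
    "SELECT name, reviews FROM hotels ORDER BY reviews DESC LIMIT 2"
).fetchall()
print(top)  # [('Hotel D', 2310), ('Hotel B', 980)]
```

Pushing this work into a local SQL engine is what keeps huge API payloads out of the model's context window.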
This approach uses Dynamic Tool Discovery. You can point APIAgent at any URL, and it automatically generates the required tools for the LLM without manual mapping.
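A minimal sketch of what dynamic tool discovery involves: walking an OpenAPI spec and emitting one MCP-style tool definition per operation. This is a simplified illustration under assumed structures, not APIAgent's actual generation logic.

```python
# Derive MCP-style tool definitions from a (simplified) OpenAPI spec.
def tools_from_openapi(spec: dict) -> list[dict]:
    """Turn each OpenAPI operation into a tool definition an LLM can call."""
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            props = {
                p["name"]: {"type": p.get("schema", {}).get("type", "string")}
                for p in op.get("parameters", [])
            }
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "inputSchema": {"type": "object", "properties": props},
            })
    return tools

spec = {
    "paths": {
        "/hotels": {
            "get": {
                "operationId": "list_hotels",
                "summary": "List hotels in a city",
                "parameters": [{"name": "city", "schema": {"type": "string"}}],
            }
        }
    }
}
print(tools_from_openapi(spec)[0]["name"])  # list_hotels
```

Because the definitions are derived from the spec at runtime, an API change shows up in the tools without anyone editing server code.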
Key Feature: 'Recipe' Learning
One of the key features is Recipe Learning. When a complex natural language query executes successfully, APIAgent can extract the trace and save it as a 'Recipe.'
- These recipes are parameterized templates.
- The next time a similar question is asked, APIAgent uses the recipe directly.
- This skips the expensive LLM reasoning step, which significantly reduces latency and cost.
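The idea can be sketched as a small cache of parameterized templates keyed by query shape. The structure and names below are hypothetical, not APIAgent's actual internals:

```python
# Recipe learning sketch: store a successful execution trace as a
# parameterized template so similar queries skip LLM planning.
recipes: dict[str, dict] = {}

def save_recipe(query_shape: str, endpoint: str, sql_template: str) -> None:
    recipes[query_shape] = {"endpoint": endpoint, "sql": sql_template}

def run_with_recipe(query_shape: str, **params):
    """Reuse a stored recipe on a cache hit; otherwise signal a miss."""
    recipe = recipes.get(query_shape)
    if recipe is None:
        return None  # cache miss: full LLM reasoning would run here
    return recipe["sql"].format(**params)  # cheap template fill, no LLM call

# First run (after LLM planning succeeded) stores the trace:
save_recipe(
    "top_n_hotels_by_reviews",
    endpoint="/hotels",
    sql_template=(
        "SELECT name FROM hotels WHERE city = '{city}' "
        "ORDER BY reviews DESC LIMIT {n}"
    ),
)

# A later, similar question reuses it directly:
print(run_with_recipe("top_n_hotels_by_reviews", city="Bangkok", n=10))
# SELECT name FROM hotels WHERE city = 'Bangkok' ORDER BY reviews DESC LIMIT 10
```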
Key Takeaways
- Universal Protocol Bridge: APIAgent acts as a single, open-source proxy that converts any REST or GraphQL API into a Model Context Protocol (MCP) server. This removes the need to write custom boilerplate code or maintain individual MCP servers for every internal microservice.
- Zero-Code Schema Introspection: The tool is 'configuration-first.' By simply pointing APIAgent at an OpenAPI spec or GraphQL endpoint, it automatically introspects the schema to understand endpoints and fields. It then exposes these to the LLM as functional tools without manual mapping.
- Advanced SQL Post-Processing: It integrates DuckDB, an in-process SQL engine, to handle complex data manipulation. If an API returns thousands of unsorted rows or lacks specific filtering, APIAgent uses SQL to sort, aggregate, or join the data locally before delivering a concise answer to the AI.
- Performance via 'Recipe Learning': To address high latency and LLM costs, the agent features Recipe Learning. It records the successful execution trace of a natural language query and saves it as a parameterized template.
- Security-First Architecture: The system is 'safe by default,' operating in a read-only state. Any mutating actions (such as POST, PUT, or DELETE requests) are strictly blocked by the proxy unless a developer explicitly whitelists them in the YAML configuration file.
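The safe-by-default gate boils down to a simple rule: read-only methods pass, mutations need an explicit allowlist entry. A minimal sketch of that logic, with invented function and config names:

```python
# 'Safe by default' sketch: mutating HTTP methods are rejected unless
# explicitly whitelisted. (Illustrative logic, not APIAgent's source.)
MUTATING_METHODS = {"POST", "PUT", "PATCH", "DELETE"}

def is_allowed(method: str, path: str, whitelist: set) -> bool:
    """Read-only requests pass; mutations need a whitelist entry."""
    method = method.upper()
    if method not in MUTATING_METHODS:
        return True  # GET, HEAD, etc. are safe by default
    return (method, path) in whitelist  # e.g. loaded from the YAML config

whitelist = {("POST", "/bookings")}
print(is_allowed("GET", "/hotels", whitelist))          # True
print(is_allowed("DELETE", "/bookings/42", whitelist))  # False
print(is_allowed("POST", "/bookings", whitelist))       # True
```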
Check out the PR here.

