Tool Use
Tool use, also called function calling, is the ability of a language model to invoke external functions or APIs as part of generating a response. Rather than attempting to answer every question from its training data alone, a model with tool use can call a search API to look up current information, query a database for specific records, execute code to perform calculations, or interact with any service that exposes a programmatic interface.
The mechanism works through a structured protocol. The developer defines a set of available tools – each described by a name, purpose, and parameter schema. When the model determines that a tool would help answer the user's query, it generates a structured tool call (typically JSON) specifying which function to invoke and with what arguments. The application executes the call, returns the result to the model, and the model incorporates that result into its response.
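The loop described above can be sketched in a few lines. This is a minimal, provider-agnostic illustration: the tool registry, the `get_weather` function, and the shape of the model's JSON output are all hypothetical stand-ins, not any specific vendor's API.

```python
import json

def get_weather(city: str) -> str:
    # Stand-in for a real external API call
    return f"Sunny in {city}"

# Tool registry: each tool has a name, a purpose, and a parameter schema.
# This definition is what the developer sends to the model with each request.
TOOLS = {
    "get_weather": {
        "function": get_weather,
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def execute_tool_call(call_json: str) -> str:
    """Dispatch a model-generated tool call (JSON) to the matching function."""
    call = json.loads(call_json)
    tool = TOOLS[call["name"]]
    return tool["function"](**call["arguments"])

# When the model decides a tool would help, it emits a structured call like:
model_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
result = execute_tool_call(model_output)
# The application would now return `result` to the model, which folds it
# into its final response.
print(result)
```

The essential point is the division of labor: the model only generates the structured call; the application validates it, executes it, and feeds the result back.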
OpenAI introduced function calling in mid-2023, and every major model provider now supports some version of it. Anthropic's Claude uses a tool use protocol, Google's Gemini supports function declarations, and open-source models increasingly support tool calling through standardized formats. The Model Context Protocol (MCP) is an emerging standard that aims to unify how tools are described and connected to models.
For AI product builders, tool use is the mechanism that turns a language model from a text generator into a capable application backend. A customer support bot that can look up order status, a research assistant that can search the web and read documents, a data analyst that can query databases – all of these are tool use applications. The quality of your tools directly determines the quality of your product.
In the web data context, tool use is how agents access scraping APIs, search engines, and browser automation services. When an agent decides it needs current pricing data from a competitor's website, it generates a tool call to a scraping API. When it needs to verify a claim, it calls a search API. The web data tools covered on serp.fast are, in practice, the tools that AI agents use.
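To make the web data case concrete, here is what a tool definition for a scraping tool might look like, using the widely adopted OpenAI-style function schema. The tool name, parameters, and descriptions are illustrative inventions, not a real provider's interface.

```python
import json

# Hypothetical tool definition for a web-scraping tool. An agent framework
# would serialize this and include it in each model request; the model
# responds with a call like {"name": "scrape_page", "arguments": {...}}.
scrape_tool = {
    "name": "scrape_page",
    "description": "Fetch a web page and return its content as markdown.",
    "parameters": {
        "type": "object",
        "properties": {
            "url": {
                "type": "string",
                "description": "URL of the page to fetch",
            },
            "render_js": {
                "type": "boolean",
                "description": "Execute JavaScript before extracting content",
            },
        },
        "required": ["url"],
    },
}

print(json.dumps(scrape_tool, indent=2))
```

Clear names and descriptions matter here: the model chooses tools and fills in arguments based entirely on this schema, so a vague description produces vague tool calls.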
Tools that handle tool use
Four tools in the serp.fast directory are commonly used in tool use workflows, spanning AI-native search APIs and web crawl & data extraction APIs. Each is reviewed independently with pricing and editorial assessment.
Neural search engine using embeddings-based next-link prediction – finds semantically similar content, not just keyword matches.
Real-time search API purpose-built for AI agents and RAG pipelines, now owned by Nebius Group.
Converts websites to LLM-ready markdown via API, with crawling, extraction, search, and an agent endpoint – the Swiss Army knife of AI web data.
Full-stack scraping platform with a marketplace of 10K+ pre-built scrapers (Actors). The platform is commercial; their Crawlee framework is separately open source.