MCP (Model Context Protocol)
The Model Context Protocol, or MCP, is an open standard introduced by Anthropic in late 2024 that defines how AI models connect to external data sources and tools. It provides a uniform interface — a shared protocol — so that a tool built for one model or framework can work with any other that supports MCP, without custom integration code for each combination.

Before MCP, every model provider had its own tool-use format. OpenAI's function calling, Anthropic's tool use, Google's function declarations, and LangChain's tool abstractions all described essentially the same concept — "here is a function the model can call" — but with incompatible schemas and execution patterns. MCP addresses this fragmentation by defining a standard way to describe tools (what they do, what parameters they accept), resources (data the model can read), and prompts (templates for common interactions).

The protocol follows a client-server architecture. An MCP server exposes capabilities — search functions, database queries, file operations, web scraping tools — through a standardized interface. An MCP client, embedded in an AI application or agent framework, discovers available servers, presents their capabilities to the model, and routes tool calls to the appropriate server. This separation means a single MCP server for web search can serve Claude, GPT, Gemini, or any local model without modification.

For product builders, MCP matters because it reduces the integration cost of connecting AI applications to external services. Instead of building and maintaining separate integrations for each model you support, you build one MCP server. Several web data companies have already released MCP servers: Firecrawl, Browserbase, and others offer MCP-compatible interfaces that let agents crawl websites, extract content, or automate browsers through the standard protocol.

The standard is still early. Adoption is growing but not universal, and the specification continues to evolve.
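Concretely, MCP messages are framed as JSON-RPC 2.0. A minimal sketch of the exchange described above: the server advertises its tools in response to a `tools/list` request (each tool carrying a name, a description, and a JSON Schema for its parameters), and the client invokes one with a `tools/call` request. The `web_search` tool and its parameters here are invented for illustration, not taken from any real server.

```python
import json

# What an MCP server might return for a client's "tools/list" request.
# Each tool descriptor has a name, a description, and an "inputSchema"
# (JSON Schema) describing the parameters it accepts.
# "web_search" is a hypothetical tool, used only to show the shape.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "web_search",
                "description": "Search the web and return top results.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string"},
                        "max_results": {"type": "integer"},
                    },
                    "required": ["query"],
                },
            }
        ]
    },
}

# The client (or the model behind it) then calls the tool with arguments
# conforming to the advertised schema.
tools_call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "web_search",
        "arguments": {"query": "MCP specification", "max_results": 3},
    },
}

# Serialized as it would travel over the wire (stdio or HTTP transport).
wire = json.dumps(tools_call_request)
print(wire)
```

Because both sides speak this same wire format, the server never needs to know which model or framework sits behind the client — which is what makes one server reusable across providers.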
Product teams should evaluate MCP support as a factor when choosing web data tools, particularly if they plan to support multiple LLM providers or want their integrations to be portable across frameworks.