Are AI Tools the Same as Plugins?
The Plugin Model
OpenAI introduced ChatGPT plugins in 2023 as a way for third-party developers to extend ChatGPT's capabilities. Plugins were hosted services that exposed an API described by an OpenAPI manifest. ChatGPT discovered available plugins, loaded their manifests, and invoked them when relevant. The plugin model included a marketplace where users could browse and install plugins, creating an app-store dynamic for AI capabilities.
The plugin model was compelling conceptually but had practical limitations. Discovery was noisy (too many plugins, hard to find useful ones), quality was inconsistent (anyone could publish a plugin), security was challenging (plugins ran on third-party servers with varying trust levels), and the user experience was fragmented (switching between plugins felt awkward). OpenAI deprecated the plugin marketplace in 2024, replacing it with function calling and the Assistants API, which gave developers more direct control.
The Function Calling Model
Function calling, as implemented by Anthropic, OpenAI, and Google, puts the developer in control. You define the tools your application supports, pass them to the model as part of the API request, and handle execution yourself. There is no marketplace, no discovery mechanism, and no third-party hosting. The tools are part of your application, and you control everything about how they work.
This developer-centric model is more work to set up (you define every tool yourself rather than installing from a marketplace) but provides better reliability, security, and user experience. You know exactly what tools are available, what they do, and how they behave because you wrote them. There is no risk of a third-party tool changing its behavior unexpectedly or going offline.
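The flow described above can be sketched in a few lines. The names here (get_weather, the request and response shapes) are illustrative, not any provider's exact API, but the pattern is the one Anthropic, OpenAI, and Google share: the developer declares tools as JSON schemas, the model emits a structured call, and the application executes it and returns the result.

```python
# Minimal sketch of the function-calling pattern.
# The tool name, schema, and request shapes are hypothetical.

# 1. The developer declares the tool with a JSON-schema description.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Return the current temperature for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

# 2. The application owns the implementation; the model never runs it.
def get_weather(city: str) -> dict:
    # In a real application this would call your own backend service.
    return {"city": city, "temp_c": 18}

HANDLERS = {"get_weather": get_weather}

def handle_tool_call(call: dict) -> dict:
    """Dispatch a structured tool call emitted by the model."""
    handler = HANDLERS.get(call["name"])
    if handler is None:
        return {"error": f"unknown tool: {call['name']}"}
    return handler(**call["input"])

# 3. Simulate the structured request a model would send back.
model_call = {"name": "get_weather", "input": {"city": "Oslo"}}
result = handle_tool_call(model_call)
print(result)  # {'city': 'Oslo', 'temp_c': 18}
```

Note that the dispatch table is the trust boundary: only functions you registered can ever run, no matter what the model asks for.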
MCP: The New Standard
The Model Context Protocol (MCP) bridges some of the gap between plugins and function calling. Like plugins, MCP tools are hosted on external servers that can be discovered and connected to. Like function calling, MCP provides precise tool definitions with JSON schemas that the model uses for structured invocation. Unlike either, MCP is an open protocol that works across model providers and applications, so a tool defined on an MCP server can be used by Claude Code, Cursor, custom applications, and any other MCP-compatible client.
MCP is where the industry is converging. It provides the extensibility of plugins (tools hosted externally, discoverable by multiple clients) with the control and reliability of function calling (precise schemas, structured invocation, clear execution boundaries). For developers building new tool integrations, MCP is the recommended approach. For tools that only need to work within a single application, direct function calling remains simpler.
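The structured-invocation side of MCP is visible in its wire format. MCP speaks JSON-RPC 2.0, and clients discover tools with a tools/list request before invoking them with tools/call. The in-process sketch below imitates that exchange; it is not a real MCP server (a real one speaks the protocol over stdio or HTTP and implements the full handshake), and the echo tool itself is invented for illustration.

```python
# Illustrative sketch of the MCP request/response shape (JSON-RPC 2.0).
# Method names (tools/list, tools/call) follow the MCP specification;
# the server logic here is a toy stand-in.
import json

def serve(request_json: str) -> str:
    req = json.loads(request_json)
    if req["method"] == "tools/list":
        # Clients call this first to discover available tools.
        result = {"tools": [{
            "name": "echo",
            "description": "Echo the input text back.",
            "inputSchema": {
                "type": "object",
                "properties": {"text": {"type": "string"}},
                "required": ["text"],
            },
        }]}
    elif req["method"] == "tools/call":
        # Structured invocation: named tool plus validated arguments.
        args = req["params"]["arguments"]
        result = {"content": [{"type": "text", "text": args["text"]}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

listing = serve(json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}))
reply = serve(json.dumps({
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "echo", "arguments": {"text": "hello"}},
}))
```

Because this exchange is standardized, any MCP-compatible client can run it against any MCP server, which is what makes the same tool reusable across Claude Code, Cursor, and custom applications.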
Key Architecture Differences
The most important architectural difference is who controls execution. In the plugin model, execution happened on the plugin developer's server. OpenAI sent the request to a third-party URL, the plugin processed it remotely, and the result came back. The application developer (OpenAI, in this case) had limited visibility into what the plugin actually did with the request. In the function calling model, execution happens in your application. You receive the model's structured request, you call whatever backend service you choose, and you return the result. You control every aspect of execution: what services are called, what credentials are used, what validation is applied, and what is returned to the model.
Discovery also differs fundamentally. Plugins were discovered through manifests hosted at well-known URLs. Any application could theoretically find and use any published plugin. Function calling tools are defined by the developer and compiled into each application. There is no discovery mechanism because the developer explicitly chooses which tools to include. MCP splits the difference: tools are hosted on external servers (like plugins) but connected through explicit configuration rather than automatic discovery (like function calling). You add an MCP server to your configuration deliberately; it does not appear automatically.
Security posture is another critical distinction. Plugins required trusting third-party code running on third-party infrastructure with data from your users. Function calling keeps all execution within your security boundary. MCP tools run on external servers, but the connection is explicitly configured and the protocol provides clear boundaries for what the server can access and return. The trend from plugins to function calling to MCP reflects the industry learning that tool extensibility needs to be paired with explicit trust boundaries rather than open marketplace dynamics.
The Terminology Today
In modern usage, "tool" is the standard term for any function that an AI model can invoke. "Plugin" persists in some contexts (browser extensions, WordPress, IDE plugins) but is becoming less common in AI contexts. "Function calling" refers to the specific API mechanism by which models invoke tools. "MCP server" refers to a hosted tool provider that speaks the Model Context Protocol. When someone says their AI "uses tools," they almost always mean function calling or MCP, not the deprecated plugin model.
For developers choosing between approaches today, function calling is the right choice for tools that only your application needs (internal APIs, custom logic, proprietary data access). MCP is the right choice for tools that should be reusable across multiple applications (shared services, third-party integrations, memory systems). Both work through the same underlying mechanism of structured model requests and application-controlled execution. The difference is in how tools are distributed and discovered, not in how they are invoked.
Connect your AI to tools through MCP. Adaptive Recall is available as an MCP server that provides persistent memory to any MCP-compatible application.
Try It Free