Thoughts on the Model Context Protocol Part 2
Since Anthropic first open-sourced the Model Context Protocol (MCP) in late 2024, the community has raised concerns about its transport, security, metadata, and adoption. The 2025-03-26 MCP spec update addresses most of these concerns, but real-world implementations and host support remain in flux. Below is a concise, critical rundown.
1. Adoption Is No Longer in Doubt
Then: Would OpenAI or Google ever back MCP?
Now: OpenAI’s Agents SDK, ChatGPT Desktop, and API support MCP, and Google has signaled that Gemini will support it as well.
Note: It’s still hard to know exactly which host apps fully implement the new spec changes; we expect major clients to publish detailed support matrices soon.
2. Transport Is Cloud-Friendly
Then: MCP’s HTTP + SSE transport forced stateful, long-lived streams—hard to scale on serverless platforms.
Now: A Streamable HTTP transport lets each request stand alone or upgrade to a short-lived stream, making MCP servers deployable as standard HTTP functions on AWS Lambda, Azure Functions, etc., without wrestling with SSE.
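Conceptually, Streamable HTTP means each JSON-RPC message can be handled as an ordinary request/response pair, with streaming as an optional upgrade rather than a requirement. Here is a minimal sketch of that stateless shape (not the official SDK; the `add` tool and dispatcher are illustrative):

```python
import json

# Hypothetical stateless dispatcher: each MCP JSON-RPC request is
# self-contained, so it can run inside any HTTP function handler
# (AWS Lambda, Azure Functions, etc.) with no long-lived stream.
TOOLS = {
    "add": lambda args: args["a"] + args["b"],  # example tool
}

def handle_mcp_request(body: str) -> str:
    """Handle one JSON-RPC request body, return one JSON-RPC response body."""
    req = json.loads(body)
    if req.get("method") == "tools/call":
        name = req["params"]["name"]
        result = TOOLS[name](req["params"]["arguments"])
        return json.dumps({
            "jsonrpc": "2.0",
            "id": req["id"],
            "result": {"content": [{"type": "text", "text": str(result)}]},
        })
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req.get("id"),
        "error": {"code": -32601, "message": "method not found"},
    })

# One POST body in, one response body out -- nothing survives between calls.
resp = handle_mcp_request(json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
}))
```

Because no per-connection state is required, a platform can spin up a fresh function instance per request and still serve the protocol correctly.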
3. Built-In Security via OAuth 2.1
Then: No standard auth—each server rolled its own, risking inconsistent or weak security.
Now: The spec mandates an OAuth 2.1 flow for remote MCP servers, so users explicitly grant scopes and tokens before any tool call.
Reality check: As of May 2025, few production implementations exist; what OAuth looks like in practice will only become clear as servers and host clients roll out support.
Local stdio plugins remain outside this flow, so they still rely on OS-level safeguards.
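The server-side consequence of the OAuth requirement is simple: a remote MCP server must reject tool calls that arrive without a valid bearer token carrying the right grant. A sketch of that gate (the token store and scope names are hypothetical illustrations, not defined by the MCP spec):

```python
# Hypothetical token store a remote MCP server might consult after the
# OAuth 2.1 flow has issued access tokens with explicit scopes.
VALID_TOKENS = {
    "tok-abc123": {"scopes": {"tools:read"}},                   # read-only grant
    "tok-def456": {"scopes": {"tools:read", "tools:write"}},    # full grant
}

def authorize(headers: dict, required_scope: str) -> bool:
    """Return True only if the request carries a token with the scope."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    token = VALID_TOKENS.get(auth.removeprefix("Bearer "))
    return token is not None and required_scope in token["scopes"]

# A read-only token can read but not write; missing tokens fail outright.
assert authorize({"Authorization": "Bearer tok-abc123"}, "tools:read")
assert not authorize({"Authorization": "Bearer tok-abc123"}, "tools:write")
assert not authorize({}, "tools:read")
```

The point of the spec change is that this check is no longer each server's invention: the token acquisition, consent screen, and scope grant all follow standard OAuth 2.1 flows.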
4. Tool Safety Metadata
Then: Tools lacked formal labels—no way to flag “read-only” vs. “destructive,” risking accidental data loss.
Now: Every tool can declare safety annotations such as readOnlyHint or destructiveHint (spec changelog PR #185). Hosts can warn users or agents before running high-impact actions.
5. Easier Setup & Deployment
Then: MCP servers were mainly local subprocesses; remote deployment was tricky.
Now: Streamable HTTP plus official SDKs (Python, JavaScript, Java, .NET, Swift, etc.) and reference servers (e.g. Playwright) make both local and cloud-hosted MCP servers straightforward to launch. One-click demos on AWS Lambda abound.
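For the local case, registering a stdio server with a host is typically a few lines of configuration. This fragment follows the shape of Claude Desktop's config file, using the reference filesystem server; the directory path is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/projects"]
    }
  }
}
```

The host launches the listed command as a subprocess and speaks MCP to it over stdio, which is why such local plugins sit outside the OAuth flow described above.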
Bottom line: The 2025-03-26 MCP spec has plugged most major gaps—transport, auth, metadata and adoption are all moving forward. Remaining work lies in proper implementation: securing local plugins, enforcing user consent, polishing UIs and tracking which hosts support the new features. MCP’s rapid evolution and heavyweight backing suggest it might become a sound investment for AI-tool interoperability.