Anthropic launches Model Context Protocol to standardize AI data integration

One decision that many organizations must make when implementing AI use cases involves connecting data sources to the models they use.

Frameworks such as LangChain exist to integrate databases, but developers must write code every time they connect a model to a new data source. Anthropic aims to change that paradigm by releasing what it hopes will become a standard for data integration.

Anthropic has released the Model Context Protocol (MCP) as an open-source tool that gives users a standardized way to connect data sources to AI use cases. In a blog post announcing the release, Anthropic said the idea is that MCP will allow models like Claude to query databases directly.

Alex Albert, head of Claude Relations at Anthropic, said on X that the company's goal is "creating a world where AI connects to any data source," with MCP as the "universal translator."

“Part of what makes MCP effective is that it manages both local resources (databases, files, services) and remote resources (APIs like Slack or GitHub) through the same protocol,” Albert said.

A standardized way to integrate data sources not only makes it easier for developers to point large language models (LLMs) directly at the data, but it also reduces data-retrieval problems for organizations building AI agents.

Because MCP is an open-source project, the company encourages users to contribute to its repository of connectors and applications.

A standard for data integration

There is currently no standard way to connect data sources to models; that decision rests with enterprise users, model providers, and database providers. Developers often write specific Python code or LangChain instances to point LLMs at a database, and because each LLM works slightly differently, they need separate code for each one to connect to a given data source. As a result, models calling the same database often do not work together smoothly.

Other companies extend their databases to make it easier to create vector embeddings that can connect to LLMs; one example is Microsoft integrating Azure SQL with Fabric. Smaller companies like Fastn also offer their own ways to connect data sources.

However, Anthropic wants MCP to work beyond Claude as a step towards interoperability of models and resources.

“MCP is an open standard that enables developers to create secure two-way connections between data sources and AI-powered tools. The architecture is straightforward. Developers can expose their data through MCP servers or create AI applications (MCP clients) that connect to these servers,” Anthropic said in a blog post.
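To make that server/client split concrete, here is a minimal sketch of an MCP server that exposes a local SQLite database as a callable tool. It assumes the official MCP Python SDK (the `mcp` package and its FastMCP helper); the file name, database path, and query tool are illustrative placeholders, not part of Anthropic's own examples.

```python
# server.py — a minimal MCP server sketch using the FastMCP helper
# from the MCP Python SDK (pip install mcp). Paths and tool logic are
# illustrative only.
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sqlite-demo")


@mcp.tool()
def run_query(sql: str) -> str:
    """Run a read-only SQL query against a local demo database."""
    with sqlite3.connect("file:demo.db?mode=ro", uri=True) as conn:
        rows = conn.execute(sql).fetchall()
    return "\n".join(str(row) for row in rows)


if __name__ == "__main__":
    # Serve over stdio so an MCP client (for example, the Claude desktop
    # app) can discover the run_query tool and call it on the model's behalf.
    mcp.run()
```

Once the server is running, any MCP client can list its tools and invoke them, which is the two-way connection the protocol is meant to standardize.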

Many social media commenters praised the MCP announcement, especially the open-source release of the protocol. Some users on forums like Hacker News were more cautious, questioning the value of standards such as MCP.

Of course, MCP is only a standard for the Claude family of models right now. However, Anthropic has released pre-built MCP servers for Google Drive, Slack, GitHub, Git, Postgres, and Puppeteer.

VentureBeat has reached out to Anthropic for further comment.

The company says early adopters of MCP include Block and Apollo, while providers such as Zed, Replit, Sourcegraph, and Codeium are building AI agents that use MCP to retrieve data from data sources.

Any developer interested in MCP can get immediate access to the protocol by installing the pre-built MCP servers through the Claude desktop app. Organizations can also build their own MCP servers using Python or TypeScript.
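For the first route, wiring a pre-built server into the Claude desktop app comes down to editing its configuration file. The snippet below is a sketch assuming the documented `claude_desktop_config.json` format and the published Postgres server package; the connection string is a placeholder.

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost/mydb"
      ]
    }
  }
}
```

After restarting the app, the configured server's tools become available to Claude without any further integration code.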


