Support MCP?

Hi everyone!

I think it would be great if you added support for MCP (Model Context Protocol) and embeddings for records.

This would significantly improve and speed up content development.

Hi @Flumi,

Thanks for the suggestion!

AI is something we’ve been slow to embrace, taking a “wait and see” approach instead. However, it’s something we will keep evaluating based on customer demand. So far, you’re the third customer I’ve seen ask about this, so it’s growing slowly but steadily :slight_smile:

Can I ask, please, how you envision MCP working with DatoCMS?

An obvious use case might be exposing our read-only GraphQL layer via MCP, hopefully allowing you to make natural-language queries like, “Which products cost less than $50 and have been featured in a sales blog post within the last year?”
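Very roughly, I'm picturing something like this sketch (using the official MCP TypeScript SDK against our public GraphQL endpoint; the tool name and token variable are placeholders, not anything we've actually built):

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "datocms-readonly", version: "0.1.0" });

// A single read-only tool: the AI client writes the GraphQL for you from a
// natural-language question, then calls this to execute it.
server.tool(
  "query_content",
  "Run a GraphQL query against the DatoCMS Content Delivery API",
  { query: z.string() },
  async ({ query }) => {
    const res = await fetch("https://graphql.datocms.com/", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.DATOCMS_READONLY_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ query }),
    });
    const json = await res.json();
    return {
      content: [{ type: "text", text: JSON.stringify(json, null, 2) }],
    };
  }
);

await server.connect(new StdioServerTransport());
```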

Is that kind of what you’re hoping for too, or was there a particular use case that you wanted MCP to enable?

What I would like to see is the ability for our clients to create articles through a chat interface or another tool connected via MCP. The user could upload a Word file, and the AI would then generate an article from the modules available for the selected content type. In a tool like Cursor, a developer could also scaffold a project from the project structure.
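To make it concrete, the write side could be an MCP tool whose handler boils down to something like this (just a sketch with the @datocms/cma-client-node package; the model ID and field names are hypothetical, and a real version would first read the model's schema to know which modules and blocks are available):

```ts
import { buildClient } from "@datocms/cma-client-node";

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

// What an MCP "create article" tool might do once the assistant has turned
// the uploaded Word file into structured content for the chosen model.
export async function createArticle(title: string, body: string) {
  return client.items.create({
    item_type: { type: "item_type", id: "ARTICLE_MODEL_ID" }, // hypothetical model ID
    title,
    body,
  });
}
```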

2 Likes

Hello Roger!

Thank you for your response.

We are already using AI to improve the content in our project. Currently, it works via the API and webhooks. I would like to tell you about the use cases where we're implementing it and suggest features that could enhance its capabilities.

How are we using it?

  1. Image analysis: every image an editor adds is described in detail by the LLM. Those descriptions let us keep all of a record's content, images included, within the LLM's context when we analyze the rest of the record (see the sketch after this list).
  2. Article analysis: the LLM reviews each article and suggests improvements, for example flagging inaccuracies and stylistic or spelling errors. In the future, we also plan to have the LLM analyze all articles and, while we're editing a specific one, point out where links to other articles could be added (the Wikipedia effect).
  3. We also create article summaries.
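For context on point 1, the handler we trigger from the upload webhook is roughly this shape (heavily simplified; describeImage stands in for our actual LLM call, the payload shape is abbreviated, and the custom_data key is just our own convention):

```ts
import { buildClient } from "@datocms/cma-client-node";

const dato = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

// Called by a DatoCMS webhook on upload events.
export async function onUploadWebhook(payload: {
  entity: { id: string; attributes: { url: string } };
}) {
  const { id, attributes } = payload.entity;
  const description = await describeImage(attributes.url);

  // Store the generated description on the asset itself.
  await dato.uploads.update(id, {
    default_field_metadata: {
      en: {
        alt: description.slice(0, 255),
        title: null,
        custom_data: { llm_description: description },
      },
    },
  });
}

async function describeImage(url: string): Promise<string> {
  // Placeholder: send the image URL to the LLM and return its description.
  return `Image at ${url}`;
}
```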

What could be improved?
I would like the webhook payload for media changes (UPLOAD) to include the old and new entity, just as it does for records (RECORD). Currently, it's easy to create webhook recursion when you update an asset, because there's no way to see what changed.
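Right now we work around it with a guard like this (sketch only): before writing, we re-read the upload and skip the update if our metadata already matches, because otherwise the write re-fires the webhook and loops forever.

```ts
import { buildClient } from "@datocms/cma-client-node";

const dato = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

// Recursion guard (sketch): the upload webhook carries no previous/new diff,
// so we check whether our own metadata is already in place; if it is, the
// event was almost certainly triggered by our own earlier update.
export async function shouldSkipUpload(uploadId: string, newDescription: string) {
  const upload = await dato.uploads.find(uploadId);
  const existing = upload.default_field_metadata?.en?.custom_data?.llm_description;
  return existing === newDescription;
}
```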

Why is MCP needed?
It can be connected to Claude and can immediately help improve an article.

Ideally, it would be great to have a Copilot directly in DatoCMS, so there would be no need to copy content into a chatbot for improvements, etc.

1 Like

This would also be great as something developers could add to their code copilots.

In that case, the ideal tools would be the ability to run migrations, fork environments, etc.: you could describe a content type or block type, an extra field, or a change to some options to your copilot, and it would apply it directly in Dato.
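For instance, "add a Testimonial block with a quote field" is the kind of request I'd want to hand to the copilot and have applied against a sandbox environment; under the hood it's only a couple of CMA calls (a sketch with @datocms/cma-client-node; the model, field, and environment name are made up):

```ts
import { buildClient } from "@datocms/cma-client-node";

// Pointing at a forked environment keeps the experiment away from production.
const client = buildClient({
  apiToken: process.env.DATOCMS_API_TOKEN!,
  environment: "copilot-sandbox",
});

// Create the model...
const model = await client.itemTypes.create({
  name: "Testimonial",
  api_key: "testimonial",
});

// ...and add a field to it.
await client.fields.create(model.id, {
  label: "Quote",
  api_key: "quote",
  field_type: "text",
});
```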

(In general, I feel like these changes would be a lot simpler to automate, get AI assistance for, meta-program, etc. if there were a YAML-ish system for describing content structure. My dream for Dato would be to be able to dump and import a declarative, schema.rb-style representation of all my fields, field config, etc.)
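Purely as an illustration of the kind of dump I mean (an entirely hypothetical format, nothing Dato offers today):

```yaml
# Hypothetical declarative schema dump, not a real DatoCMS feature
models:
  blog_post:
    title: { type: string, required: true, localized: true }
    body: { type: structured_text, blocks: [quote_block] }
    published_at: { type: date_time }
blocks:
  quote_block:
    quote: { type: text }
    attribution: { type: string }
```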

Thank you for the ongoing feedback here, folks! Understanding your use cases is very helpful — for example, knowing that writes are important, not just reads, as well as the ability to interact with media, schema, and metadata. Please keep it coming :slight_smile:

We still don’t have anything officially in the pipeline yet, but a few of our devs have been playing around with MCP on their own, just as a test (which you can do too, by writing your own MCP server against the existing Dato APIs). So far, they’ve just been isolated experiments, but it’s a first baby step…

I know this feels slow. Most of the industry is jumping head-first into everything AI (and has been for the last couple years). But we’re taking the time to more carefully consider our approach. We don’t want to just jump on the bandwagon blindly, rushing development for the sake of hype and risking jank… there’s enough slop out there already :sweat_smile:

On the support side, I can't speak for the company as a whole on this topic, but I do see that the devs have been discussing, experimenting with, and evaluating it. We all know the demand is there, and we're not ignoring it; we just want to take a more deliberate and measured approach.

Your patience is greatly appreciated. The bots may move fast, but we’re still just a small, flesh-and-bone team. We’ll do our best to provide updates as we’re able!

+1 to this.
To expand on the use cases: for our team, write access would also be really valuable for large-scale patterned changes. Let's say a bunch of blog posts link to a particular page. It would be useful to ask an LLM to change all instances of that link to something else. That would speed things up quite a bit compared to scripting.
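Today that means a one-off script along these lines (a sketch with @datocms/cma-client-node; the "blog_post" model and "body" field are placeholders), which an LLM with write access could handle conversationally instead:

```ts
import { buildClient } from "@datocms/cma-client-node";

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

const OLD_LINK = "https://example.com/old-page";
const NEW_LINK = "https://example.com/new-page";

// Walk every blog post and rewrite the old link wherever it appears.
for await (const post of client.items.listPagedIterator({
  filter: { type: "blog_post" },
})) {
  const body = post.body as string | undefined;
  if (body && body.includes(OLD_LINK)) {
    await client.items.update(post.id, {
      body: body.split(OLD_LINK).join(NEW_LINK),
    });
  }
}
```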

We're actively working on this. Here's just a sneak peek, so you can be as amazed as we are right now… :slight_smile:

3 Likes