Migration scripting is quite tricky to do; it would be great if, when I wanted to make API changes, I could make them manually on a fork of the primary environment, in correspondence with code changes, and then once the code changes are proven to work I could capture the environment changes as a script that I could re-run…
This feature would be huge for us (or a “promote structure only” option).
It’s also confusing to me that this isn’t more frequently requested, as the workflow for making non-trivial updates to “live” sites is really pretty broken without it. Hand-coding migrations is a solution to the “staging and then promoting updates without losing live content history” problem, but it really undermines some of the core appeal of the product (i.e. rapid development of content models etc.).
@lunelson if you discover or develop a solution for this we would be really keen to discuss, or contribute to creating something.
Thanks @rob, yeah, I might come back around to this. It seems to me that for the use case of API-naming and structural changes, without content changes, one could probably write a script to compare all the models from two different environments and format the results as some kind of list that a second script could then interpret as instructions to reproduce those changes. I'd have to do some hacking around on this.
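The comparison step could look something like this in TypeScript. It's a minimal sketch, assuming the models of each environment have already been exported as plain lists (e.g. via the CMA client's `itemTypes.list()`); the simplified `Model` shape and the name-only comparison are assumptions for illustration:

```typescript
// Sketch: diff two exported model lists into a list of instructions.
// Model is a simplified stand-in for what a real export would contain;
// a real diff would also cover fields, validators, fieldsets, etc.
type Model = { api_key: string; name: string };

type Instruction =
  | { op: "create"; model: Model }
  | { op: "update"; api_key: string; changes: Partial<Model> }
  | { op: "delete"; api_key: string };

function diffModels(source: Model[], target: Model[]): Instruction[] {
  const byKey = (ms: Model[]) => new Map(ms.map((m) => [m.api_key, m]));
  const src = byKey(source);
  const tgt = byKey(target);
  const instructions: Instruction[] = [];

  // Models present in the source: create them if missing, update if changed.
  for (const [key, model] of src) {
    const existing = tgt.get(key);
    if (!existing) {
      instructions.push({ op: "create", model });
    } else if (existing.name !== model.name) {
      instructions.push({ op: "update", api_key: key, changes: { name: model.name } });
    }
  }
  // Models only in the target are absent from the source: delete them.
  for (const key of tgt.keys()) {
    if (!src.has(key)) instructions.push({ op: "delete", api_key: key });
  }
  return instructions;
}
```

The instruction list could then be serialized and replayed against another environment by a second script, which is essentially the two-script setup described above.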
Hello @rob, can you expand on why you think hand-coding migrations is not good enough for live sites?
If I understand correctly, and at least in our use case, the issue is that we would have to make any model changes on our dev/staging environments via scripts, so that we can re-run those scripts on a newly forked environment when we want to promote changes.
This defeats one of the most powerful features of DatoCMS: the ability to create and manage models via a GUI. Maybe I am misunderstanding the process, however, as I cannot find any official docs on how to manage it (this video helped me understand the process at the time of writing: Using Scripts to Migrate DatoCMS Content Schema).
As I see it currently, being able to diff environments and generate the migration script would be incredible and would really help; it feels like a logical iteration on the current process. Ideally for content as well as models, but models first would be amazing.
Edit: Updated some terminology, and I also found docs! \o/ Primary and sandbox environments - DatoCMS
The main issue with migration scripts as a means of making updates to models of the primary environment is the development flow.
Developers want to make changes to the models via UI.
It takes many iterations and a lot of experimentation to get models right before you settle on the end result. It is impossible to make iterative changes to a model with the current API, because the API is not stateless: if you want to update the API key of a newly created model on the secondary env, the first time you must use 'create', and the second time you need to switch to 'update'. The API is also not typed, so you don't know which fields are mandatory unless you know every method by heart or consult the docs extensively. The quickest and most convenient way to develop is the UI.
The ideal workflow for me would be as follows.
- Make a fork ‘feature’ from the primary environment.
- Make model changes to ‘feature’ via the UI.
- Use a proposed CLI command 'data sync-models --source=feature' to pull the model schema into a file.
- Make changes to the schema in code, if necessary, and use the proposed stateless Dato API to test it by syncing it to an environment with the CLI.
- Apply the schema to the primary or any other environment during deployment.
Just to be clear, a stateless API would attempt to sync a schema to the environment, creating entities and properties if they don’t exist, updating them if they do exist, removing them if they are undefined in the source schema.
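Those sync semantics could be sketched like this in TypeScript, against a hypothetical `SchemaClient` interface (a stand-in for whatever the real stateless API would expose; the method names and simplified `Model` shape are assumptions for illustration):

```typescript
// Sketch of the "stateless sync" described above: upsert every model in
// the source schema, then remove anything the schema no longer defines.
type Model = { api_key: string; name: string };

// Hypothetical client interface, loosely modeled on a CRUD schema API.
interface SchemaClient {
  list(): Promise<Model[]>;
  create(m: Model): Promise<void>;
  update(api_key: string, changes: Partial<Model>): Promise<void>;
  destroy(api_key: string): Promise<void>;
}

async function syncModels(client: SchemaClient, schema: Model[]): Promise<void> {
  const existing = new Map((await client.list()).map((m) => [m.api_key, m]));

  for (const model of schema) {
    const current = existing.get(model.api_key);
    if (!current) {
      await client.create(model); // doesn't exist yet -> create
    } else if (current.name !== model.name) {
      await client.update(model.api_key, { name: model.name }); // exists -> update
    }
    existing.delete(model.api_key);
  }
  // Anything left over is undefined in the source schema -> remove it.
  for (const api_key of existing.keys()) {
    await client.destroy(api_key);
  }
}
```

Because every run converges on the same end state, the same schema file could be applied repeatedly to any environment, which is what would make it safe to run during deployment.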
I opened a feature request for this a few days ago: Record migration scripts in Web UI
As I see it, it is essentially the same as this one, and the workflow I envisioned there is very similar to yours, although I had a semi-automatic script-recording feature in mind for the technical solution. A fully automated and consistent sync like you described would be even better.
As for the reasons, you stated them perfectly. I agree with your post 100%
I think it is not good enough because composing models with the UI is so good in DatoCMS! If it sucked more, the hand-coding option would be much more attractive.