The requirement is to be able to configure multiple sets of variables (essentially tool version numbers) to be used by a standard script within the pipeline, which will then build and tag multiple images, each against a single version of each of those tools (the image-building code already accepts the variables, so it can build a specific combination and tag the images).
So, when/if we add a new version number for a tool, the corresponding images will then be built.
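To make that concrete, something like the following is what we have in mind (a minimal sketch; the variable names and build script are placeholders of our own):

```yaml
# bitbucket-pipelines.yml -- minimal sketch; variable names and build-image.sh are placeholders
pipelines:
  custom:
    build-combination:
      # When this pipeline is run manually (or triggered via the API), each variable is given a value.
      - variables:
          - name: TOOL_A_VERSION
          - name: TOOL_B_VERSION
          - name: TOOL_C_VERSION
      - step:
          name: Build and tag one combination
          script:
            - ./build-image.sh "$TOOL_A_VERSION" "$TOOL_B_VERSION" "$TOOL_C_VERSION"
```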
This is similar to the environment feature within Travis CI.
A more specific example:
We build an AWS AMI base image. We need to build images that support PHP + NodeJS + Nginx, and we need an image for each combination of the different versions of those tools. When we add a new version to the list of versions, a new series of images is built. If the base image (Amazon Linux 2) is updated, then all of the images are rebuilt. We expect around 50 combinations.
By doing this, we allow our developers to pick the combinations for their projects and have one less thing to worry about.
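For illustration, the versions could live in a committed file along these lines (the file name, layout, and version numbers here are purely hypothetical):

```yaml
# versions.yml -- hypothetical committed config; the cross-product of these
# lists defines the set of images (around 50 combinations) to build.
php:
  - "7.4"
  - "8.0"
  - "8.1"
nodejs:
  - "14"
  - "16"
nginx:
  - "1.20"
  - "1.22"
```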
The current approach is to only support the latest version of everything, so one pipeline can do all of that. Add a scheduled run and we're covered (that's what we're doing at the moment).
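Roughly what we have today (simplified; the build script is a placeholder), with the schedule itself configured under the repository's Pipelines settings rather than in the YAML:

```yaml
# bitbucket-pipelines.yml -- current latest-only approach; build-image.sh is a placeholder
pipelines:
  default:
    - step:
        name: Build image with the latest version of every tool
        script:
          - ./build-image.sh latest latest latest
```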
But now we need multiple versions to be available in parallel, and they need to be updateable.
We could manually write out the pipeline yaml file, but that would almost certainly lead to errors.
Is there anything available within Bitbucket Pipelines that can get us close to a solution?
One suggestion we had within the company was that if a pipeline could launch multiple/parallel builds, that would certainly work for us.
We think not, as Bitbucket pipelines seem to be static in their configuration.
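For clarity, what we meant by launching builds from a pipeline would be something along these lines: a step that calls the Pipelines REST API once per combination. The endpoint and payload shape below are from memory and would need checking against the current API documentation, and BB_TOKEN is a hypothetical secured repository variable:

```yaml
# Sketch only: a fan-out step that triggers the custom "build-combination"
# pipeline via the Pipelines REST API. In practice the curl call would be
# made in a loop over all the combinations.
pipelines:
  custom:
    fan-out:
      - step:
          name: Trigger one build per combination
          script:
            - |
              curl -s -X POST \
                -H "Content-Type: application/json" \
                -H "Authorization: Bearer $BB_TOKEN" \
                -d '{"target": {"type": "pipeline_ref_target", "ref_type": "branch", "ref_name": "master", "selector": {"type": "custom", "pattern": "build-combination"}}, "variables": [{"key": "TOOL_A_VERSION", "value": "7.4"}, {"key": "TOOL_B_VERSION", "value": "14"}, {"key": "TOOL_C_VERSION", "value": "1.20"}]}' \
                "https://api.bitbucket.org/2.0/repositories/$BITBUCKET_WORKSPACE/$BITBUCKET_REPO_SLUG/pipelines/"
```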
The next suggestion was to have a pre-processor that, on changes to a committed config of some sort, generates the pipeline yaml file. That would be one way to do things, but it would result in an extra commit if the pre-processor ran within the pipeline. There doesn't seem to be git hook support for Bitbucket Cloud.
So. What suggestions/options should I explore further?
It is possible to use parallel steps in pipelines:
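For example, a minimal sketch (the step names, build script, and versions are placeholders):

```yaml
# Steps inside a parallel block run at the same time.
pipelines:
  default:
    - parallel:
        - step:
            name: Build variant A
            script:
              - ./build-image.sh 7.4 14 1.20
        - step:
            name: Build variant B
            script:
              - ./build-image.sh 8.0 16 1.22
```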
It is also possible to configure multiple custom pipelines and schedule a run for each one.
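For example (a sketch; each custom pipeline can then be given its own schedule under the repository's Pipelines settings):

```yaml
# One custom pipeline per fixed combination; names and versions are placeholders.
pipelines:
  custom:
    build-php7.4-node14-nginx1.20:
      - step:
          script:
            - ./build-image.sh 7.4 14 1.20
    build-php8.0-node16-nginx1.22:
      - step:
          script:
            - ./build-image.sh 8.0 16 1.22
```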
I'm not sure I understand your requirement with regards to variables, though, so could you clarify your use case a little more to make sure I understand it?
Kind regards,
Theodora