I have enabled Git LFS with Bitbucket. I am now able to push a ZIP file larger than 100 MB, which is fantastic, but I am a little worried I will bloat Bitbucket over time. The ZIP file is an export of an application component that is deployed to a workflow platform similar to Flowable.
The questions I have are the following:

1. Is repeatedly pushing a 100+ MB ZIP through LFS on every change going to bloat the Bitbucket repository over time?

2. If the process flow above is problematic for Bitbucket, I am thinking of writing a script that uses the REST API to export the application component model from the workflow system, unzips the file (it is all text, JSON and XML), and pushes the extracted model to Bitbucket instead; a rough sketch of such a script is below. Each ZIP may contain literally hundreds of JSON and XML text files that describe the application parts. The only problems with this approach are that it takes a long time, and that a large number of the resulting changes are unrelated to the programmatic features of the model: they merely update the metadata of each part within the component, which is not relevant at all to the actual programmatic change (application implementation details). I once ran an experiment where I made the application change directly in the unzipped files, zipped them back up, and deployed the model to the workflow platform, and everything worked fine.
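For concreteness, here is a rough Python sketch of the script I have in mind. The export URL, the authentication (omitted), and the VOLATILE_KEYS set are assumptions for illustration only; the real endpoint and the metadata field names depend on whatever the workflow platform's API actually emits.

```python
import io
import json
import subprocess
import zipfile
from pathlib import Path

import requests

# Hypothetical export endpoint -- replace with the workflow platform's
# actual REST API (URL and auth here are placeholders, not real values).
EXPORT_URL = "https://workflow.example.com/api/models/my-component/export"

# Assumes this directory already exists and is a cloned git repository.
REPO_DIR = Path("component-repo")

# Metadata keys that churn on every export but carry no programmatic
# meaning -- adjust to match what the platform really writes out.
VOLATILE_KEYS = {"lastModified", "exportTimestamp", "revision"}


def strip_volatile(obj):
    """Recursively drop metadata keys that change on every export."""
    if isinstance(obj, dict):
        return {k: strip_volatile(v) for k, v in obj.items() if k not in VOLATILE_KEYS}
    if isinstance(obj, list):
        return [strip_volatile(v) for v in obj]
    return obj


# 1. Export the component model as a ZIP.
resp = requests.get(EXPORT_URL, timeout=60)
resp.raise_for_status()

# 2. Unzip in memory and write the text files into the working tree.
with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
    for name in zf.namelist():
        if name.endswith("/"):  # skip directory entries
            continue
        target = REPO_DIR / name
        target.parent.mkdir(parents=True, exist_ok=True)
        data = zf.read(name)
        if name.endswith(".json"):
            # Normalize JSON so diffs only show programmatic changes.
            cleaned = strip_volatile(json.loads(data))
            target.write_text(json.dumps(cleaned, indent=2, sort_keys=True))
        else:
            target.write_bytes(data)  # XML and other files as-is

# 3. Commit and push only if something actually changed.
subprocess.run(["git", "add", "-A"], cwd=REPO_DIR, check=True)
diff = subprocess.run(["git", "diff", "--cached", "--quiet"], cwd=REPO_DIR)
if diff.returncode != 0:
    subprocess.run(["git", "commit", "-m", "Export component model"], cwd=REPO_DIR, check=True)
    subprocess.run(["git", "push"], cwd=REPO_DIR, check=True)
```

The idea behind normalizing the JSON (sorted keys, volatile fields stripped) is that the Bitbucket diffs would then reflect only the programmatic changes, which is the main thing I would lose by keeping everything inside an LFS-tracked ZIP.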
I appreciate your feedback.
Tarek