We write documentation in Markdown and publish it as HTML. On each commit, a pipeline builds the documentation and then deploys it to a web server via FTP using this command:
- ncftpput -z -R -V -u $FTP_USERNAME -p $FTP_PASSWORD $FTP_URL / _build/html/*
Everything works fine, but ncftpput always transfers all files, which takes several minutes. I would like to shorten this deployment.
I tried `lftp` + `mirror` but it relies on:
- timestamp - useless here, because the build process recreates every file, so they are all newer than those on the server
- size - this works, but obviously we have a problem with files whose content changed while their size stayed the same
Is there anything that works over FTP but uploads only files with **changed content** (say, using a CRC sum) and **ignores timestamps**?
(btw I asked this at stackoverflow as well https://stackoverflow.com/questions/54963413/deploying-only-changed-files-via-ftp)
Hello Jiri,
could you please share the bitbucket-pipelines.yml config that worked for you in the end? I am solving the very same problem, and the default FTP deploy does not let me send only changed files.
Thanks for your help,
Jakub
Hi Jakub,
no, we did not solve it. We are still deploying everything. I tried Git-FTP, but when used directly it simply deploys the source files, not the compiled HTML files.
One thing I was considering is to create another git repository in bitbucket to store the compiled html. We would:
1. clone the current HTML files from there
2. replace them with the newly generated ones
3. commit the result
4. run git-ftp in that repository's pipeline
Steps 1-3 mean we would have to clone and commit everything as well, but that would happen within Bitbucket, so hopefully much faster.
But I haven't got time to try it yet. If you figure anything out, please let me know.
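The four steps above could look roughly like this as a pipeline script (the `docs-html` repository name and all paths are assumptions):

```shell
# Hypothetical sketch of the four steps, assuming a separate "docs-html"
# repository exists whose own pipeline runs git-ftp on push.
git clone git@bitbucket.org:myteam/docs-html.git deploy   # 1. clone current HTML

# 2. replace the clone's contents with the fresh build, keeping .git intact
find deploy -mindepth 1 -maxdepth 1 ! -name .git -exec rm -rf {} +
cp -R _build/html/. deploy/

cd deploy
git add -A
git commit -m "docs build $BITBUCKET_COMMIT" || true      # 3. commit (no-op when nothing changed)
git push origin master                                    # 4. push; git-ftp runs in docs-html's pipeline
```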
Jiri
Hi Jiri,
I've seen a few other users use Git-FTP to solve this problem. I'm not too familiar with it myself, but it could be another tool you could look into.
If you use Git and you need to upload your files to an FTP server, Git-ftp can save you some time and bandwidth by uploading only those files that changed since the last upload.
It keeps track of the uploaded files by storing the commit id in a log file on the server. It uses Git to determine which local files have changed.
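In practice that workflow is just two commands (the credentials and server URL below are placeholders):

```shell
# First deployment: upload everything and store the current commit id
# in a log file on the server.
git ftp init -u "$FTP_USERNAME" -p "$FTP_PASSWORD" "ftp://$FTP_URL/"

# Subsequent deployments: upload only files changed since the stored commit.
git ftp push -u "$FTP_USERNAME" -p "$FTP_PASSWORD" "ftp://$FTP_URL/"
```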
Thanks,
Phil
Hi Phil,
we use git-ftp in another project to deploy files that are in git.
But IMHO we cannot use it for this: here, git contains the source files in Markdown format. We generate HTML pages from these source files, and we need to deploy the result of that generation, not the original sources.
In most cases, only one or two of the generated files differ from the previous run, and I would like to deploy only those that changed.
Jiri
Try **rsync**; it works wonders for your needs, and if you have an SSH connection to your server, it is far more secure.
There is a pipe for that in the pipeline builder, or you can use the command directly:
- rsync -uazOvv --no-perms --stats --exclude-from=exclude-list.txt $BITBUCKET_CLONE_DIR/** $SSH_USER@$SERVER:$STAGE_PATH
Then you can adjust the parameters to your needs.
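Since the build regenerates every timestamp, rsync's `-c`/`--checksum` flag is worth adding: it decides what to transfer by comparing file contents rather than size and modification time, which is exactly the "changed content, ignore timestamp" behaviour asked for. Paths and variables below are placeholders:

```shell
# --checksum makes rsync skip rebuilt-but-identical files even though
# their timestamps changed; only files with different content go over the wire.
rsync -az --checksum --delete _build/html/ "$SSH_USER@$SERVER:$STAGE_PATH/"
```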