The description of the AWS S3 Deploy pipe promises: "Recursively copies new and updated files from the source local directory to the destination."
However, the pipe always copies every file from the repository to the S3 bucket, even files that are untouched. Is there something I'm doing wrong?
+1 on this. It also ends up massively increasing your S3 PUT costs. There really needs to be a way to sync only "changed" files, unless anyone has found a way around it?
I see that other people have been asking the same question. Would it be possible to base the changed-file detection on the Git change log? Git knows exactly which files have changed, so the pipe could be improved to fulfill its promise.
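This wouldn't change the pipe itself, but as a workaround you could add a step that uploads only what Git reports as changed instead of syncing the whole clone. Below is a minimal Python sketch using boto3 and `git diff --name-only`; the bucket name, key prefix, and the `HEAD~1` comparison ref are placeholders, and it assumes a non-shallow clone and AWS credentials available in the environment.

```python
# Hypothetical post-build step: upload only files changed in the latest commit,
# rather than letting the pipe re-upload the entire directory.
import subprocess

import boto3

BUCKET_NAME = "my-deploy-bucket"   # placeholder: your target bucket
S3_PREFIX = "site/"                # placeholder: key prefix inside the bucket


def changed_files(ref: str = "HEAD~1") -> list[str]:
    """Return paths added/copied/modified/renamed between `ref` and HEAD."""
    # --diff-filter=ACMR excludes deletions, so every path still exists locally.
    # Note: requires a non-shallow clone so that `ref` is actually present.
    out = subprocess.run(
        ["git", "diff", "--name-only", "--diff-filter=ACMR", ref, "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def upload_changed(bucket: str, prefix: str) -> None:
    """Upload each changed file to S3, preserving its repository path as the key."""
    s3 = boto3.client("s3")
    for path in changed_files():
        key = prefix + path
        s3.upload_file(path, bucket, key)
        print(f"uploaded {path} -> s3://{bucket}/{key}")


if __name__ == "__main__":
    upload_changed(BUCKET_NAME, S3_PREFIX)
```

One caveat: comparing against `HEAD~1` only covers the most recent commit; to cover everything since the last deployment you would need to record the last deployed commit somewhere and diff against that instead.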