We have a project with about 1,300 files, totalling about 1.61GB, with the largest file at 11MB. About 99% of these files are binary. We want to import this project into Stash/Git. Will something like this need a specialised tool such as Git Annex, Git BigFiles, or Git Fat?
http://blogs.atlassian.com/2014/05/handle-big-repositories-git/
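A quick way to survey the working copy before importing is to list the largest files and the total size with standard shell tools (a rough sketch, nothing Stash-specific; run it from the project root):

    # Files larger than 10MB, biggest last
    find . -type f -size +10M -exec du -h {} + | sort -h

    # Total size and number of files in the project
    du -sh .
    find . -type f | wc -l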
Hi Mark,
Not necessarily. Although your repository is on the larger side, you don't have to start looking into specialised tools just yet if the repository works well for you.
The size of the repository will affect a number of things:
Other things to think about:
In general, I wouldn't consider files up to 11MB too large to manage with Git. It gets difficult once you're talking about multiples of that, though (e.g. over 50MB it becomes worth looking into alternatives, IMHO).
Have you tried importing it to see how it impacts your usage and the instance?
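For example, after a trial import you can check how big the repository actually is on disk and which objects dominate it, using plain Git plumbing (a rough sketch; run it inside a clone of the imported repository):

    # Overall size of the object database
    git count-objects -vH

    # The 20 largest blobs in the history, biggest last
    git rev-list --objects --all |
      git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
      awk '$1 == "blob"' |
      sort -k3 -n |
      tail -20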
Cheers,
Stefan
One important thing to consider is how often these binary files change. Every time a binary file changes, the repository size grows by almost the full file size, since Git usually can't delta-compress binary content effectively, so the stored "diff" is nearly the entire file. For example, a 10MB file that changes 20 times adds roughly 200MB of history.
=> If they change often, you might want to consider putting the binaries into a separate repository, where you can more easily wipe some of the history to reduce the repository size. (That repository can then be mapped into the main repo as a submodule, if necessary.)
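As a rough sketch of that layout (the repository URLs below are placeholders, not real Stash projects), the binaries live in their own repository and the main repository references it as a submodule:

    # In the main repository: add the binary-only repository as a submodule
    git submodule add ssh://git@stash.example.com/proj/binaries.git assets
    git commit -m "Add binary assets as a submodule"

    # Collaborators clone both in one go
    git clone --recurse-submodules ssh://git@stash.example.com/proj/main.git

The main repository only stores a pointer (a commit SHA) to the submodule, so the big files never enter its own history.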