It says it has 1.13 GiB, but then it says the repository is over the size limit (2 GB) and will not accept further additions:
Enumerating objects: 25046, done.
Counting objects: 100% (25046/25046), done.
Delta compression using up to 4 threads
Compressing objects: 100% (17794/17794), done.
Writing objects: 100% (25046/25046), 1.13 GiB | 8.51 MiB/s, done.
Total 25046 (delta 7237), reused 24962 (delta 7170)
remote: Resolving deltas: 100% (7237/7237), done.
remote: Checking connectivity: 43, done.
remote: Repository is over the size limit (2 GB) and will not accept further additions.
remote:
remote: Learn how to reduce your repository size: https://confluence.atlassian.com/x/xgMvEw.
To bitbucket.org:geopost/<repo>.git
! [remote rejected] master-live -> master-live (pre-receive hook declined)
error: failed to push some refs to 'git@bitbucket.org:geopost/<repo>.git'
I've also tried to find the size of the repo with:
git gc
git count-objects -vH
And it also says:
size-pack: 1.13 GiB
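For completeness, a fuller local cleanup before measuring looks something like this (just a sketch; expiring the reflog throws away local undo history, and --aggressive can take a long time on a repo this size):
git reflog expire --expire=now --all
git gc --prune=now --aggressive
git count-objects -vH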
On the other hand, as an example, we have another repo that is 2.21 GB and commits still work fine there.
EDIT: I tried creating and deleting a branch, as was suggested online, and commits work now. Bitbucket now shows Repository details: Size 1.9 GB.
Any idea how I can get it down to 1.13 GiB, since git tells me that's the actual size of the repo?
Hello @Dana Adriana Zainescu,
This is very likely because Git garbage collection is not triggered on every repository update, as it is an expensive operation. That is why one of the steps in the guide Mike mentioned above is to ask our Support team to run GC.
Eventually it would have been triggered automatically, but for now I forced GC on four repositories under the account you mentioned that matched the size and last access date (since you didn't specify the exact repo in question). All of them came down to 1.1-1.3 GB. Please let me know if I guessed the right repo.
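For anyone curious what that GC actually cleans up: it is the objects that no branch, tag or reflog entry points to any more. A rough way to see them locally (a sketch, not the exact command Bitbucket runs) is:
git fsck --unreachable --no-reflogs
Those unreachable objects keep counting against the size limit until a GC run removes them.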
Hope this helps. Let me know if you have any questions.
Cheers,
Daniil
Thanks, Daniil.
Have you also done this for dpd.co.uk? Git is telling me it has 2.21 GiB, instead of the 4.6 GB shown on Bitbucket. Can you have a look, please? I'll also try to reduce the size a bit more after that.
Also, I suppose there's no way of actually increasing the space limit for the repo? It's a bit hard to restrict commits, since they're sent from the Alfresco WCM host serving the live website.
No worries.
Have you also done this for dpd.co.uk?
Done just now; Bitbucket is showing 2.3 GB now.
Also, I suppose there's no way of actually increasing the space limit for the repo?
No, unfortunately this one is carved in stone, mainly because of the performance issues Git has with larger repositories. There are a couple of workarounds, though: you can use Git LFS for large files, or split the repo history as described in this Git guide chapter. Neither approach is trivial, and they might not be easily applicable in your case.
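As a rough sketch of the LFS route (assuming, purely as an example, that the large files are *.zip archives and that git-lfs is installed locally), migrating them out of the existing history could look like this; note that migrate rewrites commits, so the branch has to be force-pushed afterwards:
git lfs install
git lfs migrate import --include="*.zip" --include-ref=refs/heads/master-live
git push --force-with-lease origin master-live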
Cheers,
Daniil
Hi @Daniil Penkin, one repo of mine exceeded the limit and blocked me from making more additions.
I rewrote the repo history and now my repository is under 500 MB. However, when I push it to the remote it still fails with the following error:
Counting objects: 5314, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (1953/1953), done.
Writing objects: 100% (5314/5314), 256.71 MiB | 2.29 MiB/s, done.
Total 5314 (delta 3501), reused 4815 (delta 3055)
remote: Resolving deltas: 100% (3501/3501), done.
remote: Checking connectivity: 5314, done.
remote: Repository is over the size limit (2 GB) and will not accept further additions.
remote:
remote: Learn how to reduce your repository size: https://confluence.atlassian.com/x/xgMvEw.
! [remote rejected] master -> master (pre-receive hook declined)
! [remote rejected] feature/car-mode -> feature/car-mode (pre-receive hook declined)
Can you please help me with this?
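For context, a history rewrite like the one described above is typically done with a tool such as git filter-repo (a sketch; big-assets/ is a hypothetical path, and the rewritten branches must be force-pushed because all commit IDs change):
git filter-repo --path big-assets/ --invert-paths
git remote add origin git@bitbucket.org:<workspace>/<repo>.git    # filter-repo drops the remote as a safety measure
git push --force origin master feature/car-mode
Even after such a rewrite, the push can still be rejected until Bitbucket runs GC on its side, since the old objects still count against the limit, which is what the reply below resolves.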
Hello @letiagoalves,
If I identified your repository correctly, it should reflect the smaller size now. Not under 500 MB as you mentioned, but still well under the limit.
Cheers,
Daniil
Thank you for your help. It is reflecting the new size.
The 500 MB size I mentioned was the local repo, which I was not able to push --force because of the pre-receive hook.
But now I was able to push it.
Thanks again.
Hello @Daniil Penkin ,
I'm in an urgent situation: my repository has exceeded the size limit.
Can you also help me with that?
Hello @[deleted],
I think I figured out which repository you meant – GC reduced it to around 450 MB.
Cheers,
Daniil
@Daniil Penkin BIG BIG thank you for the help, it's been so frustrating since yesterday.
What can I do to avoid this in the future?
I already enabled the 'Delete dangling commits when over size limit' option.
No worries, happy to help :)
Aside from enabling the Bitbucket feature you mentioned, I can only advise setting up a client-side Git hook that raises a flag if you try to commit a big file (I'm assuming you accidentally committed some large files to your repository). With that configuration you'd need to explicitly skip the hook in order to commit a large file, should you ever need to (which I'd guess would be a rare operation).
It can be a simple script or a more sophisticated tool. I personally use the check-added-large-files plugin for pre-commit, but there are other tools as well.
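As an illustration of the 'simple script' option, a minimal .git/hooks/pre-commit along these lines would do the job (the 5 MB threshold is arbitrary, and the sketch doesn't handle paths containing spaces):
#!/bin/sh
# Reject staged files larger than MAX_KB; bypass deliberately with `git commit --no-verify`.
MAX_KB=5120
status=0
for f in $(git diff --cached --name-only --diff-filter=ACM); do
  size_kb=$(( $(git cat-file -s ":$f") / 1024 ))
  if [ "$size_kb" -gt "$MAX_KB" ]; then
    echo "ERROR: $f is ${size_kb} KB (limit is ${MAX_KB} KB)" >&2
    status=1
  fi
done
exit $status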
Let me know if you have any questions.
Cheers,
Daniil
Please refer to the guide to reducing repo size mentioned in the error message for information on the size limits and instructions on getting below the 2 GB limit.
You may have to request technical support to get your repo below the 2 GB limit. You can ask the support engineer why your 2.21 GB repo is not affected by the limit.
Once below the limit, the guide to maintaining git repos explains how to remove other large files from your repo.
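As a quick starting point before following that guide, one common way to list the largest blobs in a repository's history (a sketch using plain git plumbing; the head -20 cutoff is arbitrary) is:
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn |
  head -20
The output is the object size in bytes followed by the path, largest first, which makes it easier to decide what to strip or move to LFS.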
Apologies for your confusing experience. We are currently investigating some issues with the way repo sizes are calculated. This is a tricky area that has a big performance impact so we're being very cautious, rolling out changes progressively and monitoring their effects carefully.
Thanks for the information. I'll have a look at those links!