SSH is Way too slow

danish_dildar
October 22, 2025

I am facing extremely slow `git pull`s from Bitbucket. At first I thought it was a temporary issue caused by the outage on Oct 21st, 2025, but even today, more than 24 hours after Bitbucket's [status page](https://bitbucket.status.atlassian.com/) reported everything as fine, my pulls are still slow. Can anyone spot an issue? Here are my logs:

```
git pull
19:09:37.290145 git.c:463 trace: built-in: git pull
19:09:37.290568 read-cache.c:2388 performance: 0.000163798 s: read cache .git/index
19:09:37.290708 run-command.c:659 trace: run_command: git fetch --update-head-ok
19:09:37.292800 git.c:463 trace: built-in: git fetch --update-head-ok
19:09:37.293181 read-cache.c:2388 performance: 0.000161463 s: read cache .git/index
19:09:37.293794 run-command.c:659 trace: run_command: unset GIT_PREFIX; GIT_PROTOCOL=version=2 ssh -o SendEnv=GIT_PROTOCOL git@bitbucket.org 'git-upload-pack '\''<xxxx>/<xxxx>.git'\'''
19:16:25.260795 run-command.c:659 trace: run_command: git rev-list --objects --stdin --not --exclude-hidden=fetch --all --quiet --alternate-refs
19:16:25.649539 run-command.c:1523 run_processes_parallel: preparing to run up to 1 tasks
19:16:25.649568 run-command.c:1551 run_processes_parallel: done
19:16:25.649579 run-command.c:659 trace: run_command: git maintenance run --auto --no-quiet
19:16:25.651973 git.c:463 trace: built-in: git maintenance run --auto --no-quiet
19:16:25.652475 trace.c:414 performance: 0.000709010 s: git command: /usr/lib/git-core/git maintenance run --auto --no-quiet
19:16:25.652903 trace.c:414 performance: 408.360316731 s: git command: /usr/lib/git-core/git fetch --update-head-ok
19:16:25.654052 run-command.c:659 trace: run_command: git merge FETCH_HEAD
19:16:25.656123 git.c:463 trace: built-in: git merge FETCH_HEAD
19:16:25.656813 read-cache.c:2388 performance: 0.000135759 s: read cache .git/index
Already up to date.
19:16:25.657600 trace.c:414 performance: 0.001661629 s: git command: /usr/lib/git-core/git merge FETCH_HEAD
19:16:25.658022 trace.c:414 performance: 408.368082146 s: git command: git pull
```

As you can see, it took about 7 minutes (roughly 408 seconds) just to find out that the repository is already up to date, and nearly all of that time is spent in the `ssh ... git-upload-pack` step of the fetch.
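For reference, the timings above look like git's built-in trace output; a trace like this can be produced with git's trace environment variables (a minimal sketch, assuming a Bash-like shell, since the exact invocation isn't shown above):

```
# Enable git's command and performance tracing to see where the time goes
GIT_TRACE=1 GIT_TRACE_PERFORMANCE=1 git pull

# Time only the SSH connection to Bitbucket, independent of git itself
time ssh -T git@bitbucket.org
```

If the bare `ssh -T` connection is what takes minutes, the problem is in the SSH/network layer rather than in git or the repository.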

1 answer

Answer accepted
danish_dildar
October 23, 2025

I was able to resolve the issue. My suspicion was that my ISP had suddenly stopped supporting IPv6 on the same day AWS experienced its outage, October 20th, 2025.

Solution
Force IPv4 for SSH (this change applies to SSH system-wide):

file: /etc/ssh/ssh_config

before:
`#   AddressFamily`

after:
`AddressFamily inet`
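If you'd rather not change the system-wide configuration, the same option can be scoped to Bitbucket only in your per-user SSH config; a sketch, under the assumption that only bitbucket.org is affected:

```
# ~/.ssh/config -- force IPv4 only when connecting to Bitbucket
Host bitbucket.org
    AddressFamily inet
```

A one-off test without touching any config is also possible with `ssh -4 -T git@bitbucket.org`, which forces IPv4 for that single connection.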

 

I wouldn't have imagined that an issue from December 2018 (https://community.atlassian.com/forums/Bitbucket-questions/Slow-SSH-clone-pull-push/qaq-p/953843) would have a similar culprit, because everything had been working fine and only stopped at the same time as the AWS outage. Maybe the ripples from AWS somehow affected my ISP's IPv6 connectivity.
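To confirm that the IPv6 path is really the problem (rather than Bitbucket itself), a quick check along these lines can help; this is a hypothetical diagnostic, not something from the original post:

```
# Does bitbucket.org publish IPv6 (AAAA) addresses at all?
dig +short AAAA bitbucket.org

# Force the connection over IPv6; a long hang or timeout here points
# at broken IPv6 connectivity at the local network or ISP
ssh -6 -T git@bitbucket.org
```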

DEPLOYMENT TYPE: CLOUD
PRODUCT PLAN: STANDARD