Hello, we have a Confluence DB of about 11GB, of which the LINKS table accounts for about 7GB. We have over 50 million LINKS rows but only 53K CONTENT rows, and some content ids have several million links associated with them. I believe this is why our index rebuild never finishes. Is there any safe links cleanup we can do?
Thanks!
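Before deleting anything, it may help to see where the links concentrate. A minimal diagnostic sketch, assuming the stock Confluence schema in which LINKS.CONTENTID references CONTENT.CONTENTID (verify the column names against your own database before running it):

    -- Sketch (assumes stock schema: LINKS.CONTENTID -> CONTENT.CONTENTID;
    -- check the column names against your own DB first).
    -- Lists the content rows that own the most link records.
    SELECT l.CONTENTID, c.TITLE, COUNT(*) AS link_count
    FROM LINKS l
    LEFT JOIN CONTENT c ON c.CONTENTID = l.CONTENTID
    GROUP BY l.CONTENTID, c.TITLE
    ORDER BY link_count DESC
    LIMIT 20; -- MySQL/PostgreSQL syntax; use TOP 20 or FETCH FIRST 20 ROWS ONLY elsewhere

If a handful of content ids dominate the count, that points at specific pages (often auto-generated or heavily macro-driven ones) rather than a table-wide problem.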
Hello,
same problem here!
I'm supporting a Confluence installation that contains over 5 million entries in the LINKS table, but only about 30,000 entries in CONTENT. I've never seen this ratio before.
The re-index works fine on this installation, but the XML export is absolutely useless with default settings.
I need to give Tomcat 8GB of memory to prevent a crash, and even then it still needs over 2 hours to export to XML (without attachments!).
Now the question is: where are all these links coming from?
And I have the same question as @Dragon Moon: "Is there any safe links cleanup I can do?"
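One thing worth checking is whether many of those rows are orphans, i.e. links whose owning content row no longer exists. A sketch, again assuming the stock schema where LINKS.CONTENTID references CONTENT.CONTENTID:

    -- Sketch: count link rows whose owning content row no longer exists.
    -- Assumes LINKS.CONTENTID references CONTENT.CONTENTID, as in stock schemas.
    SELECT COUNT(*) AS orphaned_links
    FROM LINKS l
    WHERE NOT EXISTS (
        SELECT 1 FROM CONTENT c WHERE c.CONTENTID = l.CONTENTID
    );

If that count is large, take a database backup and check with Atlassian Support before deleting anything by hand, since Confluence manages the LINKS table itself and direct modification of the database is generally discouraged.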
Did it hang at a certain percentage? There are a few possible reasons why an index rebuild fails or never completes, but I'm not sure the LINKS table has anything to do with it; it also depends on what was thrown in <confluence-home>/logs/atlassian-confluence.log.
What errors related to the index can you see in the logs?