Search the Community

Showing results for tags 'large share'.

Found 1 result

  1. First off, thank you in advance for any assistance. I love the product, and if I can get this working as I hope, I will definitely be purchasing Pro. (I'm on the 30-day trial right now.)

     Quick facts:
     - I have two unRAID NAS servers (Xeon CPUs), both running Sync 2.4.4 as a Docker container.
     - The idea is that server #1 is my "master": all reads/writes happen there, and server #2 is strictly a read-only replica of server #1.
     - I have 6 folders on server #1 shared as read-only to server #2. These folders range in size from 11 GB to 13 TB.
     - Both servers were already rough mirrors of each other (maintained manually), so most files exist on both servers with the same folder structure. I'm introducing Sync now to automate this.

     After adding the 6 folders on server #1, they all indexed correctly (it took about 30 hours). I then added each folder one by one on server #2. On accepting a folder, Sync warns that the destination folder already contains files; I click yes and the syncing starts. Because most items are identical, the first pass didn't transfer much; it was mostly rechecks and the like. Rinse and repeat for the next four folders.

     Here is where my issue starts. My final folder is 13 TB and about 48,000 files. Indexing this folder on server #1 went just fine. However, when I share it as read-only with server #2, the indexing/recheck runs for about 10 to 12 hours and then just stops, usually around 7.5 TB, and never finishes. I looked in the logs and didn't really see anything. I've removed and re-added the folder twice now with the same result. It's running again right now for the third time and so far so good (about 2 TB in), so I don't have any logs to share just yet; I'll make sure debug logging is turned on during this run. I'm just curious whether anyone has run into this, and whether there's a setting or feature I'm missing?

     Hope to post more info tonight. Thanks, Daniel!
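For anyone preparing the same kind of retry, Resilio Sync's verbose logging is typically enabled by placing a `debug.txt` file containing `FFFFFFFF` in Sync's storage folder before (re)starting it. A minimal sketch, assuming the container's storage folder is bind-mounted to `./sync-storage` on the host and the container is named `resilio-sync` (both names are hypothetical; adjust to your actual Docker setup):

```shell
# Hypothetical host path bind-mounted to the container's storage folder,
# e.g. created with: docker run -v "$PWD/sync-storage:/mnt/sync" ...
SYNC_STORAGE=./sync-storage
mkdir -p "$SYNC_STORAGE"

# Sync reads debug.txt from its storage folder at startup; the value
# FFFFFFFF turns on verbose debug logging for all categories.
printf 'FFFFFFFF' > "$SYNC_STORAGE/debug.txt"

# Restart the container so the setting takes effect (name assumed):
# docker restart resilio-sync
```

The resulting `sync.log` in the same storage folder then records the indexing/recheck activity in detail, which is what you would attach when the 13 TB folder stalls again.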