I've been trying for two days to sync a large and unwieldy folder set between two Win8/Win7 computers on the same network. It's about 400k files in only 60 GB of space: old directories full of Subversion remnants, massive single-file data export dumps, things like that. It has been impossible with BTSync; fair enough, it's a tough (unusual) task.

-> metacache: NOT a great idea. Basically, the entire problem was exaggerated by the need to keep hundreds of thousands of files in the metacache structure. Rather than sharding them as you have done, it might be better to merge them into file blocks. That would free up much of the resource-intensive MFT/filesystem work needed to sync anything large. I'm not sure, but it looks like you maintain a separate cache item for every item that needs to be synced, which doubles the effort required.

Just my two cents; I think the overall idea is great and can't wait to get to use it.

Cheers, Frank.

Edit: Sorry, I have no debug logs for you; most runs ended in a forced close. I also noticed these things:

- Some files were missing but were still trying to sync (cache out of sync?)
- Massive swings in used memory: >2 GB at certain times, then back to ~500 MB
- An out-of-memory exception at one point; I didn't catch the details.
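To illustrate what I mean by merging metadata into file blocks: here is a minimal, hypothetical sketch (the shard count, file names, and JSON-lines layout are my own assumptions, not the tool's actual cache format). The idea is that per-file metadata entries get grouped into a fixed number of block files, so the MFT only has to track a few hundred cache files instead of hundreds of thousands:

```python
import hashlib
import json
import os

BLOCK_SHARDS = 256  # hypothetical: fixed number of block files, not one file per entry


def shard_for(path: str) -> int:
    """Map a synced file's path to one of a fixed number of metadata blocks."""
    return int(hashlib.sha1(path.encode("utf-8")).hexdigest(), 16) % BLOCK_SHARDS


def write_blocks(entries, cache_dir):
    """Group per-file metadata entries into at most BLOCK_SHARDS JSON-lines files.

    entries: iterable of dicts like {"path": ..., "size": ..., "mtime": ...}.
    With 400k entries, the filesystem sees at most 256 cache files instead of 400k.
    """
    os.makedirs(cache_dir, exist_ok=True)
    buckets = {}
    for e in entries:
        buckets.setdefault(shard_for(e["path"]), []).append(e)
    for shard, items in buckets.items():
        with open(os.path.join(cache_dir, f"block-{shard:03d}.jsonl"), "w") as f:
            for e in items:
                f.write(json.dumps(e) + "\n")


def read_entry(path, cache_dir):
    """Look up one file's metadata by scanning only its block, not the whole cache."""
    block = os.path.join(cache_dir, f"block-{shard_for(path):03d}.jsonl")
    with open(block) as f:
        for line in f:
            e = json.loads(line)
            if e["path"] == path:
                return e
    return None
```

A real implementation would need in-place updates (append plus compaction, or an embedded key-value store) rather than rewriting whole blocks, but even this naive version shows how the filesystem overhead per synced file goes away.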