Search the Community

Showing results for tags 'memory'.
Found 17 results

  1. I've installed Resilio natively from the website using the manual package download (apollolake), and I've also tried the official Docker image (hub.docker.com/r/resilio/sync/). Both have the same 'problem': after a short window, the sync speed caps at around 1-2 MB/s. After installing the Docker version I happened to increase the RAM allocation, just trying things that might help, and it did; straight away I was getting 20-30 MB/s, hurray! Well, I was celebrating too soon, as the next file I tested was back to 1-2 MB/s.

    Looking at the Docker overview you can see the RAM usage slowly rise from 0 to ~10 GB (I'd set a 12 GB max; I have 16 GB installed). Finishing the transfer cleared maybe 100 MB of RAM, and the rest just sits there. All transfers while the allocated RAM is full (or around this point of 10/12 GB allocated) run at 1-2 MB/s; if I restart the Docker container I get full speed back until I max out the RAM again at 10 GB. I assume this same issue was affecting the native install; I just couldn't see the numbers as easily.

    If anyone can help resolve this I would be very grateful; I've been looking at this problem for a week thinking it was my connection. I'm currently using a free account so I can't submit a support ticket, but I would happily purchase a license if an answer is out there. EDIT: Decided to buy a licence as it's on offer, with 30-day money back if I can't resolve this problem; it also appears you can submit tickets with a free account... Thanks. Sync version: 2.6.1 (1319) Synology Spec:
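    Until the cache growth itself is resolved, one stopgap is to cap the container's memory from the Docker side (a minimal sketch, assuming the official resilio/sync image; the container name, host paths, and the 4 GB figure are illustrative placeholders):

        # Cap the container so runaway RAM growth stays bounded
        docker run -d --name resilio-sync \
          --memory=4g --restart=unless-stopped \
          -v /volume1/sync:/mnt/sync \
          -p 8888:8888 -p 55555:55555 \
          resilio/sync

        # Or adjust an already-running container in place
        docker update --memory=4g --memory-swap=4g resilio-sync

    A hard cap doesn't cure the slowdown described above, but it keeps the host responsive until the container is restarted.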
  2. It is well known that Resilio consumes a large amount of memory, since it uses a few kilobytes of RAM per file. It would be very good if Resilio used a database such as SQLite to store this information instead of keeping it all in memory. Otherwise, Resilio is not usable on NAS systems or computers with a normal amount of memory once the number of files gets large.
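    To see how quickly the per-file cost scales, a back-of-envelope check (the ~2 KB/file figure below is an assumption taken from the "few kilobytes" claim above, not a measured value):

        # 1 million files at ~2 KB of RAM each
        echo "$(( 1000000 * 2048 / 1024 / 1024 )) MB"   # => 1953 MB, roughly 2 GB

    At that rate, a modest NAS with 1 GB of RAM is saturated well before the file counts discussed elsewhere in this thread.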
  3. JFlin

    Memory Limit

    Give an advanced option to set a MAX amount of memory for Sync to use, and have it stay within that limit! I don't care how; I understand the desire to keep everything in memory, but my scenario is a 4-terabyte drive backup (among other things, of course). Included on this drive are source code files, music files, video files, and Word documents: a combination of many very small and very large files. I find that BTSync seems to use a LOT of memory (1,968,184 KB). I'd like to keep this under 1/2 GB or so, and am quite willing to accept a performance penalty to get there.
  4. I've been using Resilio (previously BitTorrent Sync) for several months now. I'm quite pleased with the operation, BUT I now face a major dilemma that is making it unusable. Most of the storage I keep in sync comes from electronic CAD, which generates a tremendous number of files (usually quite small), so a single project with 10k files is not uncommon! So when I look at Resilio's memory consumption and see 500 MB (on all in-sync computers)... the usability scenario is just impossible. A working computer with memory-hungry CAD applications CANNOT spare that much memory just to stay in sync. Is a mitigation for this issue possible?
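    A quick way to gauge how exposed a given share is (a sketch; ~/cad-projects is a placeholder path, and the ~2 KB/file cost is the rough figure discussed in this thread, not an official number):

        # Count files in a share and estimate Sync's RAM for it
        files=$(find ~/cad-projects -type f | wc -l)
        echo "$files files, est. $(( files * 2 / 1024 )) MB of RAM"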
  5. I have installed BTSync as a service on a Windows 2012 R2 server. After a day of being online, the btsync process has some 20 GB of private bytes allocated and a 7 GB working set. If I don't restart the service, the system becomes unusable (as expected). Any ideas?
  6. I have a Windows 2012 server with BTSync installed. I am noticing that the interface is very slow to respond; so slow that Task Manager frequently reports 'Not Responding' for about 1 second, then 10 seconds OK, then 'Not Responding', etc. It is very painful to do anything, as a single click on an item can take 30-60 seconds or more to respond, making it a joke to do any management. Please help!! The server has 16 GB of memory and the Sync version is 2.0. I have several QNAP NAS devices with a similar problem, although not quite so extreme... Is there any way we can manage BTSync outside the UI, or some way of speeding up the interface? Other programs on the same machine are fine performance-wise! Any help would be appreciated.
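    On the QNAP/Linux boxes at least, the desktop UI can be bypassed entirely: the daemon runs headless from a config file and is managed through its web UI (a sketch using the Linux binary's own flags; the config file name is a placeholder):

        # Generate a sample config, edit the webui/storage paths, run headless
        ./btsync --dump-sample-config > sync.conf
        ./btsync --config sync.conf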
  7. Hi. I've got a setup with a Linux (x64) server, which I keep online 24x7 with Internet access acting as a "cloud" server, two Macs, one Android phone and an iPad. I recently added a Synology DS414j (ARM) to the mix, planning to replace the Linux server. I'm using Sync 2.0.120 on all platforms (except Android/iOS, but the latest versions there too).

    The Linux server is a Xen virtual machine. Initially, when I was testing with a couple of folders, it had only 256 MB of memory assigned and everything worked. As I started adding more files (especially while syncing a 100 GB folder), it started issuing the "Cannot identify the destination folder" error every now and then. I increased the VM memory to 1 GB a couple of weeks ago and haven't had any issues since. Now the Synology is getting the same error all the time while syncing, and it also has low memory, just 512 MB. After a couple of days I managed to get everything in sync without errors by removing all folders and adding them back one by one, so it was only syncing one shared folder at a time. But after a couple of days, one of the folders is showing "Cannot identify the destination folder" again. Could this error be caused by low memory? I haven't got this message on either of the Macs, as far as I remember. Either that, or it's a bug only affecting Linux, no matter the architecture (x86 and ARM).

    Also (related, maybe?), I noticed my .sync/ID files are getting corrupted. From the description of this file in the documentation, I gather it's just a crypto-generated ID unique to each folder, which should remain unchanged through the life of the folder and identical across systems sharing that folder, right? Well, sometimes I get things like this:

        ls -l .sync/ID
        -rw-r--r-- 1 vicente www-data 2723151965 jun 2 21:29 .sync/ID

    Check the size of that thing! If I check the contents, there's the usual binary ID stuff and then several of these:

        [20150602 21:29:36.231] assert failed /mnt/jenkins/workspace/Build-Sync-x64/fileutil.cpp:109
        [20150602 21:29:36.231] assert failed /mnt/jenkins/workspace/Build-Sync-x64/wincompat.h:409

    I'm also getting those errors in the logs on all systems (Macs/Linux/Synology), but these were in the .sync/ID file!!! Now the interesting bit: even if the .sync/ID file has that garbage at the end, the folder syncs as long as the initial binary key is OK. I checked the folder with the "Cannot identify the destination folder" error on the Synology, and the binary part is gone; there's just the assert log message in .sync/ID. Could this be the source of the issue? Somehow some log output goes to .sync/ID instead of the log file (some thread-unsafe library in use, maybe?), and if ID is completely overwritten instead of appended to, sync breaks? Regards
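    One way to confirm that theory on a broken peer (a sketch; the 32-byte read length is an assumption about the ID's size, so adjust if the real key is longer):

        # Dump the first bytes of the folder ID on each peer and compare;
        # trailing log garbage is harmless if these bytes still match.
        head -c 32 .sync/ID | xxd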
  8. Jariway

    Memory Issue

    Hello everyone, I'm Mikel and I'm from Spain, but I'll try to explain the issue in English as best as I can. I've been using BitTorrent Sync since the beta release, but I'm having a new issue now. When I start BitTorrent Sync, the PC becomes very slow; it seems Sync is taking all the RAM. Sometimes my PC freezes when I start Sync, so I have to restart the PC, and if I don't have to restart it, an error appears; in Spanish it reads "No hay memoria suficiente", which in English is "There is not enough memory". I'm currently using BitTorrent Sync 1.4, and I tried fixing this problem by updating to 2.0, 2.0.1..., but nothing changes. NOTE: I tried enabling debug logging, but it is impossible for me simply because the PC/program freezes! My PC otherwise works smoothly, and I can see I have enough memory! I hope you can help me!
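    Debug logging can be switched on before Sync ever starts, so a log is captured even when the UI freezes immediately: classic 1.x/2.x builds look for a debug.txt containing FFFF in the storage folder (the paths below are the usual defaults, but verify them on your install):

        # Linux/macOS storage folder shown; on Windows it is typically
        # %APPDATA%\BitTorrent Sync\debug.txt
        echo FFFF > ~/.sync/debug.txt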
  9. So I have Sync installed on a Linux server acting as a remote host for my files, but I'm having some problems with the amount of memory and CPU it's consuming. It's hosted on a DreamHost VPS, so the biggest problem is that when I reach my limits, they will either force-upgrade my hosting tier or reboot my server so I don't keep using more than I pay for. Is there a way to limit the CPU/memory usage? Here's an excerpt from my process list:

        USER PID   %CPU %MEM VSZ    RSS    TTY STAT START TIME COMMAND
        root 11290 109  31.1 447844 237124 ?   Ssl  17:08 9:41 /BTSync/btsync

    As you can see, it's using over 100% of my CPU and 31% of my memory. This is usually only the case when it's indexing folders or uploading to multiple computers, but the problem is they'll end up rebooting it and causing hiccups in our operation because we're unable to sync. Any ideas? I just updated Sync to the latest version (1.3.94).
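    Sync has no built-in CPU cap, but the process can be throttled from outside (a sketch using stock Linux tools; the 50% figure is arbitrary, and cpulimit must be installed separately):

        # Hold btsync to ~50% of one core
        cpulimit -e btsync -l 50 &
        # And/or deprioritize it so other workloads win the scheduler
        renice -n 19 -p "$(pgrep -x btsync)"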
  10. jesperordrup

    More Control Over Resource Usage

    Being able to set a max CPU usage per machine. When adding new folders, CPU use can sit at 50-80% until indexing and the first sync are finished. Limiting bandwidth does not help in any way. /Jesper
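    Until such an option exists, a service manager can impose the cap externally (a sketch for systemd-based Linux hosts; the 50% quota and 512 MB ceiling are illustrative, and older systemd versions spell MemoryMax as MemoryLimit):

        # Run btsync in a transient scope with CPU and memory ceilings
        sudo systemd-run --scope -p CPUQuota=50% -p MemoryMax=512M \
          ./btsync --nodaemon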
  11. Hi all! Our company has websites which host millions of files, 10-50 KB each. One Windows server carries around 100 million JPEG images, totalling around 1 TB of disk space. To have a fail-over (or a round robin), we would like to sync files between Windows servers (WAN, not LAN). However, we saw that BitTorrent Sync requires around 300 bytes per file. This would mean a memory requirement of 30 GB for BitTorrent Sync alone. Is there a solution to this? Thanks so much!
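    The arithmetic holds up (a quick sanity check; the 300 bytes/file figure is the poster's observation, not an official spec):

        # 100 million files at ~300 bytes of metadata each
        echo "$(( 100000000 * 300 / 1024**3 )) GiB"   # => 27 GiB, i.e. ~30 GB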
  12. I had to change one of our main servers for a server with Windows 2008 R2, and since then I'm struggling with a huge memory leak. The more memory BTS uses, the less transfer speed I get. Here's a description of the shares:

        Folder 1 = 594.4 GB in 345 files, R/W
        Folder 2 = 5.5 MB in 205 files, R/W
        Folder 3 = 344.4 MB in 3 files, R/W
        Folder 4 = 45.5 MB in 13 files, R/W
        Folder 5 = 721.7 GB in 316 files, R/W

    With Windows 2008, after 10 hours BTS is using around 1800 MB of memory and the transfer speed is below 1 KB/s. After a while, 1 day or 2, BTS closes silently with no information in sync.log or in the Windows event viewer. On the same computer with the same shares but Windows 8.1, after 10 hours BTS is using 120 MB of memory and the transfer speed is around 1.5-2 MB/s.

    *** Note that we have over 40 clients with Windows 2008 R2 and R/O shares; BTS uses around 100 MB of memory on those, so it seems somehow related to R/W shares? I've done a lot of testing, and it does appear that BTS has a memory leak when serving R/W shares on Windows 2008 R2. Here's a description of the equipment I used:

        First computer:
        - Xeon X3220
        - 6 GB of RAM
        - 1 x 500 GB (OS drive)
        - 3 x 2 TB in RAID 5 (drive that contains the shares)
        - Windows 2008 R2 & Windows 8.1

        Second computer:
        - i7-4770
        - 8 GB of RAM
        - 1 x 500 GB (OS drive)
        - 1 x 2 TB (drive that contains the shares)
        - Windows 2008 R2 & Windows 8.1

    The first computer is the new server that I wanted to use. ** It's a little bit old, but it should be able to handle BTS very well; in fact, it's working great with Win 8.1!! I thought that maybe the problem was Windows 2008 having some kind of hardware incompatibility, so I tried with the second computer, totally different and newer! The result is almost identical to the first computer. I also thought that the problem could be with multiple shares, but even with a single share the leak is present; it just takes longer before BTS uses all the memory and dies quietly. Except for normal operations, there's nothing helpful in the debug log.
  13. verloren

    Memory Usage

    I'm running btsync on a VPS with 2 GB of memory (1.8 GB shown as free on reboot using free -h). When I start btsync, free memory gradually drops to ~25 MB. I'm syncing ~200 GB from multiple sources, covering about 140,000 files. The log file doesn't seem to show anything interesting; there are a bunch of "ReadFile error" messages for some iPhoto/Aperture files that I guess it struggles to get a lock on, and some errors such as:

        [20140113 11:16:45.622] UPnP: Could not map UPnP Port on this pass, retrying.
        [20140113 11:16:50.618] UPnP: Unable to map port 37.187.98.29:62573 with UPnP.

    Is this amount of memory usage normal, and if not, what can I do about it? Cheers, Paul
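    It's worth separating btsync's own footprint from the kernel's page cache, which also consumes "free" memory during large syncs and is reclaimed on demand (a sketch; assumes the process is named btsync):

        # btsync's actual resident memory, in KB
        ps -o rss= -C btsync
        # free -h's 'available' column shows memory net of reclaimable cache
        free -h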
  14. I have a situation where I am trying to back up files from about 30 remote locations to a central location at my main office, and BTSync is running fine at the remote facilities. The problem comes in when I want to set up the client at the central location. Adding it all up, the total file count from all the remote locations reaches a few million, and as many know and some have experienced, BTSync tends to get sluggish and unresponsive when the file count gets too high. My solution was to create separate user accounts so I could run multiple BTSync clients, with four shared folders on each client. This works, somewhat: I had no problem logging in with each of the users and adding the folders, and they work as long as the user is actually logged in. I know BTSync will use a lot of RAM in my case, but I believe 24 GB will handle it. My next step was to have all of the clients started by the Task Scheduler at system startup, because who wants to log in to 8 different accounts whenever the backup server is restarted? This is where my solution breaks: it seems that only one instance of BTSync can be started by the Task Scheduler. Each scheduled task is launched under a different username, and I can log in interactively with each of those users, start BTSync, and it will operate as long as that user is logged in. Does anyone have tips or advice for launching multiple instances of BTSync without having multiple users logged in to run each instance?
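    One alternative to juggling user accounts: classic BTSync builds accept a /HOME switch that points an instance at its own storage folder, so several independent instances can run under a single account (a sketch; the paths are placeholders, and /HOME is the commonly cited switch for 1.x/2.x Windows builds, so verify it against your version):

        :: Two independent instances, each with its own settings and database
        "C:\BTSync\BTSync.exe" /HOME "D:\SyncInstances\one"
        "C:\BTSync\BTSync.exe" /HOME "D:\SyncInstances\two"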
  15. Nice to see that the new version takes care of CPU usage and memory footprint. Can someone share some stats about the new hardware usage? Does it still scale with the number of files?
  16. For those of you that don't know: 32-bit processes are limited to 4 GB max, but Windows only allows 32-bit applications such as btsync to use 2 GB of it. So if you have a large sync directory and are getting the out-of-memory error (like me), you can use the program below to enable all 4 GB so btsync can use it. Not sure how well this will work on 32-bit Windows, but feel free to try; you will probably crash it. http://ntcore.com/4gb_patch.php I was battling this 'feature' for a few days, then I remembered this useful little app. Just thought I would share it. BitTorrent guys, please give us a 64-bit Sync program for Windows so we don't hit this limitation. Otherwise, great app, even though it's beta.
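    For reference, the linked patch just sets the Large Address Aware flag in the executable's PE header; the same bit can be flipped with Microsoft's editbin tool if the Visual Studio build tools are installed (a sketch; the install path is a placeholder, back up the exe first, and note that a 32-bit OS still caps even LAA processes below 4 GB):

        :: Mark the 32-bit exe as able to use more than 2 GB of address space
        editbin /LARGEADDRESSAWARE "C:\Program Files (x86)\BitTorrent Sync\BTSync.exe"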
  17. wylywade

    Very High IO

    System: iMac 2.66 GHz Core i5, 12 GB RAM, 1 TB HD, OS X 10.8.2. BitTorrent Sync