Search the Community

Showing results for tags 'memory'.

Found 11 results

  1. I've installed Resilio Sync natively from the website using the manual package download (apollolake), but I also tried the official Docker image (hub.docker.com/r/resilio/sync/). Both have the same 'problem': after a short window the sync speed caps at around 1-2 MB/s. After installing the Docker version I happened to increase the RAM allocation, just trying things that might help, and it did: straight away I was getting 20-30 MB/s. Hurray! Well, I was celebrating too soon, as the next file I tested was back to 1-2 MB/s. Looking at the Docker overview you can see the RAM usage slowly rise from 0 to ~10 GB (I'd se…
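     If container memory is the bottleneck here, one variable worth pinning down is the container's explicit memory limit. A minimal sketch, assuming the official resilio/sync image with its documented ports and mount point (the host path and the 4g cap are illustrative, not recommendations):

         # Illustrative: run the official image with an explicit 4 GB memory cap
         docker run -d --name resilio-sync \
           --memory=4g \
           -p 8888:8888 -p 55555:55555 \
           -v /volume1/sync:/mnt/sync \
           resilio/sync

     Watching the container with docker stats while a transfer runs would show whether the speed drop coincides with hitting the cap.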
  2. It is well known that Resilio consumes a large amount of memory, since it uses a few kilobytes of RAM per file. It would be very good if Resilio used a database such as SQLite to store this information instead of keeping it in memory. Otherwise, Resilio is not usable on NAS systems or on computers with a normal amount of memory when the number of files is large.
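     For illustration, the kind of on-disk index being requested could be as simple as a single SQLite table; this schema is purely hypothetical and is not how Resilio actually stores its metadata:

         # Hypothetical sketch of a per-file metadata table kept on disk
         sqlite3 index.db 'CREATE TABLE IF NOT EXISTS files (
           path  TEXT PRIMARY KEY,  -- relative path inside the share
           size  INTEGER,           -- file size in bytes
           mtime INTEGER,           -- last-modified time, epoch seconds
           hash  BLOB               -- content hash for change detection
         );'

     Lookups then cost disk I/O instead of resident RAM, which is the trade-off the request is asking for.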
  3. So I have Sync installed on a Linux server acting as a remote host for my files, but I'm having some problems with the amount of memory and CPU it's consuming. It's hosted on a DreamHost VPS, so the biggest problem is that when I reach my limits they will either force-upgrade my hosting tier or reboot my server so I don't keep using more than I pay for. Is there a way to limit the CPU/memory usage? Here's an excerpt from my process list:

         USER  PID    %CPU  %MEM  VSZ     RSS     TTY  STAT  START  TIME  COMMAND
         root  11290  109   31.1  447844  237124  ?    Ssl   17:08  9:41  /BTS…
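     One common stopgap on Linux is cpulimit, which throttles an existing process by PID; a sketch using the PID from the ps excerpt above (cpulimit usually has to be installed separately, and it does nothing about memory):

         # Throttle the running btsync process to ~50% of one CPU
         cpulimit -p 11290 -l 50

     On distributions with a recent systemd, launching the process via systemd-run with CPU and memory properties is a more complete option, though the exact property names vary by systemd version.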
  4. Being able to set a max CPU use on a machine. When adding new folders the CPU use can be 50-80% until indexing and the first sync are finished. Limiting bandwidth does not help in any way. /Jesper
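     Until such a setting exists, a crude workaround on Linux is to pin the process to a single core and lower its scheduling priority, which caps it at 100% of one core during indexing; a sketch (PID 12345 is illustrative):

         taskset -cp 0 12345   # restrict the process to CPU core 0
         renice +10 -p 12345   # lower its CPU scheduling priority
         ionice -c3 -p 12345   # idle I/O class, so indexing yields the disk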
  5. Hi all! Our company has websites which host millions of files, 10-50 KB each. One Windows server carries around 100 million JPEG images with a total of around 1 TB of disk space. To have a fail-over (or a round robin), we would like to sync files between Windows servers (over WAN, not LAN). However, we saw that BitTorrent Sync requires around 300 bytes per file. This would mean a memory requirement of 30 GB for BitTorrent Sync alone. Is there a solution to this? Thanks so much!
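     The 30 GB figure follows directly from the per-file overhead; a quick sanity check in the shell:

         # 100 million files x ~300 bytes of per-file metadata
         echo $(( 100000000 * 300 ))            # 30000000000 bytes
         echo $(( 100000000 * 300 / 1024**3 ))  # ~27 GiB, i.e. roughly 30 GB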
  6. I had to replace one of our main servers with a server running Windows 2008 R2, and since then I've been struggling with a huge memory leak. The more memory BTS uses, the less transfer speed I get. Here's a description of the shares:

         Folder 1 = 594.4 GB in 345 files, R/W
         Folder 2 = 5.5 MB in 205 files, R/W
         Folder 3 = 344.4 MB in 3 files, R/W
         Folder 4 = 45.5 MB in 13 files, R/W
         Folder 5 = 721.7 GB in 316 files, R/W

     With Windows 2008, after 10 hours BTS is using around 1800 MB of memory and the transfer speed is below 1 KB/s. After a while, a day or two, BTS closes silently with no information in the sync.log…
  7. I'm running btsync on a VPS with 2 GB of memory (1.8 GB shown as free on reboot using free -h). When I start btsync the free memory gradually drops to ~25 MB. I'm syncing ~200 GB from multiple sources, covering about 140,000 files. The log file doesn't seem to show anything interesting: there are a bunch of "ReadFile error" messages for some iPhoto/Aperture files that I guess it struggles to get a lock on, and some errors such as:

         [20140113 11:16:45.622] UPnP: Could not map UPnP Port on this pass, retrying.
         [20140113 11:16:50.618] UPnP: Unable to map port 37.187.98.29:62573 with UPnP.

     Is this amo…
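     Worth checking before blaming btsync: on Linux, shrinking 'free' memory is often just the kernel filling the page cache, which is reclaimed on demand. Sampling the process's own resident size alongside free memory separates the two cases; a small illustrative loop:

         # Sample btsync's resident/virtual size and free RAM every 60 s
         while sleep 60; do
           date +%T
           ps -o rss=,vsz= -p "$(pidof btsync)"
           free -m | awk 'NR==2 {print "free:", $4, "MB"}'
         done

     If the RSS column climbs steadily while the shares are idle, that points at the process; if only 'free' falls, it is likely cache.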
  8. I have a situation where I'm trying to back up files from about 30 remote locations to a central location at my main office, and BTSync is running fine at the remote facilities. The problem comes in when I want to set up the client at the central location. Adding it all up, the total file count from all the remote locations reaches a few million, and as many know and some have experienced, BTSync tends to get sluggish and unresponsive when the file count gets too high. My solution was to create separate user accounts to be able to run multiple BTSync clients and put four shared folders on each c…
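     For anyone reproducing this setup: the Linux btsync binary accepts a --config flag, so each instance can run under its own account with its own storage path and web UI port; a sketch (user names and paths are illustrative):

         # One btsync instance per user; each sync.conf must point at a
         # distinct storage_path and a distinct webui listen port
         sudo -u sync1 btsync --config /home/sync1/sync.conf
         sudo -u sync2 btsync --config /home/sync2/sync.conf

     Running btsync --dump-sample-config prints a template showing the storage_path and webui settings to adjust per instance.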
  9. Nice to see that the new version takes care of CPU usage and memory footprint. Can someone share some stats about the new hardware usage? Does it still scale with the number of files?
  10. For those of you who don't know, 32-bit processes are limited to 4 GB max; however, Windows only allows 32-bit applications, like btsync, to use 2 GB. So if you have a large sync directory and are getting the out-of-memory error (like me), you can use the program below to enable all 4 GB so btsync can use it. Not sure if this will work well with 32-bit Windows, but feel free to try; you will probably crash it. http://ntcore.com/4gb_patch.php I was battling this 'feature' for a few days, then I remembered this useful little app. Just thought I would share it. BitTorrent guys, please give us a 64-bit Sync pro…
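      As far as I know, the linked patch simply flips the LARGEADDRESSAWARE flag in the executable's PE header, which lets a 32-bit process address 4 GB on 64-bit Windows. If the Visual Studio tools are installed, the same flag can be set with editbin (back up the original exe first):

          editbin /LARGEADDRESSAWARE BTSync.exe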
  11. System: iMac 2.66 GHz i5, 12 GB RAM, 1 GB HD, OS X 10.8.2, BitTorrent Sync…