Automatic Coding

Everything posted by Automatic Coding

  1. For starters, I'm confused about the point of the single 1Gbit ethernet port if there's only one (i.e. you're limited by whatever you connect through it). Second of all, I'd check whether any programs are running on the router itself (I had an old Netgear box that was stuck at 100% CPU and routing got really slow). Third of all, make sure your 100Mbit ethernet sockets are actually set to 100Mbit; my router came with them all set to 10Mbit by default (including the Gbit ones, which was weird). Fourth, assuming it supports backing up the current configuration (so you can revert to it afterwards), I'd try resetting to defaults; on mine the option is labelled "keep user configuration", although I have no idea if you have said option. Other than that, I'm out; I'm not very good when it comes to testing hardware. If your router supports a bandwidth test, I'd also run it from the router's point of view (to see whether it's an issue from Windows --> router or router --> Ubuntu), although not many routers support this; in my case it's under "Tools" and "Bandwidth test server". You might also be able to pull it off with something like TCPSpray. Other than that, I'm flat out of ideas; I'm terrible when it comes to diagnosis. (There's a sketch below of how I'd check link speeds from the Linux side.)
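     In case it helps, a minimal sketch of checking/forcing the negotiated link speed from the Ubuntu end (the interface name "eth0" is an assumption; yours may differ):-

       # What speed did the NIC actually negotiate?
       sudo ethtool eth0 | grep -i speed
       # Pin the port at 100Mbit full duplex if it came up at 10Mbit.
       sudo ethtool -s eth0 speed 100 duplex full autoneg off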
  2. First of all, can we have a debug log? P.S. I'm not sure if the log contains personal information/secrets, so I'd skim through it first. Second, is it just one machine that can't see the other, or can neither see the other? I had an issue where one device could see the other but not the reverse; I fixed it by port forwarding the second device (even though it was supposedly already port forwarded).
  3. Plex Media Client exists on Windows and Mac and lets you run without anything like a browser. Plex Home Theater exists on Windows, Mac and Linux, and is a beta version of an upgraded Plex Media Client. I currently have two HTPCs that run Plex Home Theater on Ubuntu; they work great. For starters, yes, there obviously is an internet issue. Now, I'm confused about what you mean by "INNBOX"; can you give me the model number (or a link to the manufacturer's site) of this "INNBOX"? I admit I'm not very advanced at networking, but I've never heard of a piece of hardware called an "INNBOX". Another thing: with WiFi, you're limited by whatever mode you're on. Once again, I'm not very good with WiFi, but I believe it's something like 5Mbps/11Mbps/54Mbps depending on the mode.
  4. I just assumed the API would be for remote modification of shares without having the full files downloaded (e.g. for my service to modify your shares without having your shares, meaning I can't run my MD5 hashing software on a file I don't have). If I'm incorrect, then it'd help to know how the API would be used (which I asked before, but apparently you don't know yourself). EDIT:- Unless you're talking about launch arguments, which I wouldn't really call an API; I'd simply call it a command-line interface. EDIT2:- Example: the YouTube API provides the length of a video; if I were to download said video I'd know the length, but at that point I'd stop calling it an API and start simply calling it a downloader & metadata reader. EDIT3:- Or maybe you mean plugins for BTSync? I can understand why I wouldn't need the commands if it was simply a plugin, something that runs alongside BTSync.
  5. Few more (there's a shell sketch of C and D below):-
     A. Get file modification, access and creation dates (assuming the filesystem supports them)
     B. Get size (real size, not size on disk, as that'd change between nodes)
     C. Get hash of file (MD5/SHA1/SHA2/CRC/etc...)
     D. Get hashes of the individual 4MB pieces of a file (the piece size BitTorrent Sync uses)
     E. Access files inside rar/7zip/tar/gzip/zip archives and treat them as if they were any other file.
     However, I'm starting to wonder what real use an API would give, and I can't think of much.
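     A minimal sketch of C and D with standard GNU tools (4MB as the internal piece size is my assumption):-

       # A and B: timestamps and real size in bytes.
       stat -c 'mtime %y  atime %x  size %s' myfile
       # C: whole-file hash.
       md5sum myfile
       # D: one hash per 4MB piece.
       split -b 4M --filter=md5sum myfile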
  6. Off-topic, but relevant XKCD:- http://xkcd.com/1179/
  7. In fact, I have a question about that. What happens if I append, say, 8MB to the end of the file, so that no existing bytes are shifted? Would it still resync the whole file, or just the 8MB that was added?
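     You can actually watch what should happen with the piece-hash trick from my earlier post: appending leaves every complete existing 4MB piece untouched (again, 4MB as the piece size is my assumption):-

       split -b 4M --filter=md5sum bigfile > before.txt
       dd if=/dev/urandom bs=1M count=8 >> bigfile   # append 8MB, shift nothing
       split -b 4M --filter=md5sum bigfile > after.txt
       diff before.txt after.txt                     # only the tail pieces differ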
  8. Or, you know, just release the API protocol specification and then every programming language is supported? Excluding JavaScript because of the same-origin policy (https://en.wikipedia.org/wiki/Same_origin_policy), although you could use PHP & JavaScript together to overcome that limit.
  9. Another few things:-
     A. Ability to do basic tests on files without having to download them all (have the nodes in the system that do have the file run the commands): stuff like egrep/head/tail/sed/tr/wc, all the basic Linux I/O commands.
     B. Ability to edit small parts of a file without ever having the whole file. Say I know I want to overwrite bytes 10,000 through 12,500 with "DO A BARREL ROLL!" 147.058823529 times; I don't really want to download the full file first (it may be insanely large). You'd know where to edit using the above Linux I/O commands. (There's a local sketch of this kind of edit below.)
     C. Ability to select "only upload X many copies" and let the other nodes deal with sharing it within the network. Useful for third-party companies who want to preserve bandwidth: once they upload one copy, they no longer want to seed it for their customers, and the customers' PCs can then seed between themselves.
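     For reference, here's what B looks like locally with dd; doing the same thing remotely is exactly what the API would need to offer (the filename is made up):-

       # Overwrite 2,500 bytes starting at offset 10,000 without rewriting
       # the rest: 2500 / 17 bytes per "DO A BARREL ROLL!" ~= 147.06 repeats.
       yes 'DO A BARREL ROLL!' | tr -d '\n' | head -c 2500 \
         | dd of=hugefile bs=1 seek=10000 conv=notrunc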
  10. 1. Dedicated API keys, which you can generate an unlimited number of and give specific permissions to. For example, if site A wants to use my share but I'm not sure I trust them, and they say they just want to dump log files from my VPS onto it, I can select "only allow writing of files, not reading".
      2. The ability to revoke API keys. Say I cancel my VPS with the above site: I no longer want them to be able to write to my share, so I want to remove said API key from use.
      3. Obviously, the ability to read files, write files, make folders, delete folders, delete files, move files, rename files; all of the standard filesystem commands. Not sure if it's possible, but maybe also symbolic links and the rest of the more 'advanced' filesystem options? Not sure how well that'd work, though.
      4. Limit API keys to IP (ranges); if the above VPS site gets hacked, I want only their servers to be able to do anything.
      Although, I have a question: how will the API work? Will it be like a web API, where you connect to one of your own servers and that deals with asking the nodes? Or will you just contact any node in the system and ask it to talk to the other nodes? Or will you pretend to be a node? I've never dealt with (or heard of) any P2P APIs, so I'm having trouble understanding how it'll work. If you provide this, I'll probably be able to think up a few more; I'm just confused about which "entry point" it'll come in from.
  11. Personally, I use LastPass and let it generate passwords for me. I do concur this means someone could brute force one of my passwords and gain everything, and if LastPass were to go rogue I'd be screwed, but I'm too stupid to remember 80 different passwords, and if I didn't do this, then brute forcing one of my passwords anywhere would give you all of my passwords anyway. I feel this is the better trade-off. FYI:- I used /dev/urandom and tr to generate my secret key, although I have read somewhere that urandom isn't meant to be secure, or some such.
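      For anyone curious, the one-liner was along these lines (the length and base32-style character set are from memory, so treat them as an example):-

        # Keep only A-Z and 2-7 from the random stream, then take 32 chars.
        tr -dc 'A-Z2-7' < /dev/urandom | head -c 32; echo

      (For what it's worth, urandom is generally considered fine for this on Linux.)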
  12. That all seems like way too much work: mounting and updating and unmounting* and remounting. I'm just going to keep my files on an internal "my servers only" share. *I believe BTSync only syncs files that aren't in use; either BTSync, Plex or both work that way, I can't remember.
  13. Is there no option to blacklist certain processes from making notifications? I'd personally just enable that until all files are done syncing, then disable it so that I can continue to receive notifications.
  14. First of all, may I ask what you like about XBMC that Plex doesn't have? Second of all, XBMC (I believe) has a Plex addon that allows you to use your Plex server as a library. Third of all, I have SickBeard with Periscope (a subtitle downloader) working fine; I had to modify a few of SickBeard's configuration files to get it to run as an extra script, but it all works. Anyway, as for the stats:-
      Network speed (Ubuntu side):- Invalid data; you need the Windows PC to connect to do the test.
      Network speed (Windows side):- Are you sure you're connecting to the right IP? Can you ping said IP? Is iperf running on Ubuntu when you execute it on Windows? (See the iperf sketch below.)
      Write speed (Ubuntu):- Not great, but more than enough for BTSync. This more than likely isn't the issue.
      Read speed (Windows):- N/A
      Internet speed is 50/50:- I assume this is your down/up link to your ISP? If so, it's useless data here; all the traffic stays inside your closed LAN (I assume this is a LAN? Otherwise you need to forward port 5001 on the NAT device the Ubuntu machine is behind for the iperf test). No data ever leaves your LAN (well, it does, but it doesn't throttle your down speed).
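      To be explicit about the iperf test (5001 is iperf's default port; the IP is just an example, use the Ubuntu machine's LAN address):-

        # On the Ubuntu box: start the server side first.
        iperf -s
        # On the Windows box: run the client against the Ubuntu machine.
        iperf -c 192.168.1.50 -t 10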
  15. This can be done via iptables, any half-decent router (even for internal requests), or sync.conf (although that's a lot less advanced). A sketch of the iptables route is below.
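      A minimal iptables sketch, assuming the WebUI is on the default port 8888 and your LAN is 192.168.1.0/24 (adjust both to taste):-

        # Allow the WebUI from the LAN only; drop everyone else.
        iptables -A INPUT -p tcp --dport 8888 -s 192.168.1.0/24 -j ACCEPT
        iptables -A INPUT -p tcp --dport 8888 -j DROP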
  16. Not really, since the file would have to be open for you to write to it, and thus any rogue VPS host could easily access your files. The only way it'd work is if you put it in a TrueCrypt volume before sending it.
  17. You can select a folder on an external device? There's no issue with that; the only issue is that you can't point two folders on one computer at the same data (without something like Sandboxie), e.g. one internal and one external. Also, I wouldn't recommend dumping files on a flash drive; they have limited write cycles and (from my experience) wear down fast. EDIT:- Just thought I'd state, I've only used the Linux version; I've yet to touch the Windows version because:- A. I only run a single 128GB SSD in my Windows box, so I don't really have much to sync, if you get my drift. B. My Windows box is connected to my Linux box, which has 15TB of space (and is running BTSync). In the Windows version (due to how it mounts), it very well might not work; I have no idea.
  18. I do see your issue. Possibly store the data elsewhere, in a 'metadata' part of BTSync, rather than in the file itself? That's my best bet; I'm not too good with ideas... or Linux. This would also be an acceptable workaround. I'm currently having to run BTSync as an obscure user with a really weird configuration on my laptop (to give it access to my local files while letting its files be read by everyone but written to only by certain users, etc.) so that all users can access the files. Roughly the setup sketched below.
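      Roughly what I mean; a sketch rather than my exact setup (the user, group and path names are made up):-

        # Dedicated user for BTSync plus a shared group for everyone.
        sudo useradd -r -m btsync
        sudo groupadd sharers
        sudo usermod -aG sharers btsync
        # Group-owned share; the setgid bit keeps new files in the group.
        sudo mkdir -p /srv/share
        sudo chown btsync:sharers /srv/share
        sudo chmod 2775 /srv/share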
  19. I still wouldn't sign up with any 3rd party (VPS/dedicated servers/'BTSync servers') until encrypted nodes are implemented, if they ever are. Just my point of view. Anyway:- "This VPS product is only allowed to run programs intended to store or assist in the backup of Subscriber's data. Anyone found running programs not intended to store or assist in backup will be suspended and asked to cease; if they fail to, termination will follow." Seems insanely strict, considering you're a paying customer.
  20. Although, apparently the encryption key is derived from the secret, so if you can access the configuration files and read them (or inject into the process and read its memory, possibly, depending on how it stores the data), then you can instantly read the transmitted data again.
  21. There really should be a link around here:- http://labs.bittorrent.com/experiments/sync.html
  22. Basically, I have a file that I want to back up daily. It's 8GB on the dot and will never change (and if it does, I'm happy to wait a bit of extra time, because for said file to change requires a hell of a reason and won't happen often). I want to keep it for at least 7 days before starting to overwrite, so I have a command that rotates the backups:-
      When the backup is made:- myTotallyAwesomeBackup1
      After day one it's renamed to:- myTotallyAwesomeBackup2
      After day two it's renamed to:- myTotallyAwesomeBackup3
      [...]
      After day six it's renamed to:- myTotallyAwesomeBackup7
      After day seven it's deleted.
      However, I have a few questions about this and ease of transport. The internet of the backup machine isn't the best (under 1MB/s), so I'd prefer not to be pulling 8GB down per day. My first idea was to add a compression step to the script, but once I did, I noticed how different the compressed file is between days compared to how different the original file is. I'm not sure if it's a simple data "shift" or if the compressed output really is that different (I've never looked into how compression works, to be honest). Below are two files from two different days, before and after compression, with their sizes and the number of changed bytes:-
      Uncompressed:-
      7.5G CompressTestOne
      7.5G CompressTestTwo
      40199404 bytes (38MB) changed
      Compressed (only compared up to where the shorter file hit EOF; the excess data was excluded):-
      779M CompressTestOne.gzip
      748M CompressTestTwo.gzip
      781130671 bytes (744MB) changed
      Basically, my question is: is there any way to compress a file without the entire compressed output changing when the file changes a little each day? Although, once BitTorrent Sync supports compression during transmission, this is all pointless.
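      For what it's worth, the reason the whole compressed file changes is that gzip's compressor state depends on everything it has already seen, so one early difference cascades through the rest of the stream. Some gzip builds (Debian/Ubuntu's included, as far as I know) have an --rsyncable flag that resets the compressor at regular intervals, so a small input change only disturbs nearby output blocks, at the cost of roughly 1% extra size. A sketch, assuming your gzip has that patch:-

        # Reset compression state periodically so daily diffs stay small.
        gzip --rsyncable -c myTotallyAwesomeBackup1 > myTotallyAwesomeBackup1.gz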
  23. A workaround (although not perfect) is to add this to a crontab to run once a day:-
      find "$share/.SyncTrash/" -mtime +$days -delete
      find "$share/.SyncTrash/" -empty -delete
      Although, I do agree it's not the best way of doing it. Personally, I rarely delete stuff, so I've yet to set it up for BTSync; however, I do use the above script to delete files I download off RSS feeds on Usenet indexers, and it works fine.
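      To be concrete about the crontab part (the 4am slot, the path and the 7-day retention are just examples; substitute your own share path and $days):-

        # m h dom mon dow  command -- clean SyncTrash once a day at 4am
        0 4 * * * find /path/to/share/.SyncTrash/ -mtime +7 -delete; find /path/to/share/.SyncTrash/ -empty -delete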