AetherMichael

Everything posted by AetherMichael

  1. What the crap were you thinking? This project was supposed to be an improvement for the community over Dropbox and the others with their 2 GB limits or folder caps. Now YOU are putting a 10-folder limit on the free version? Good luck.
  2. Let me see if I can describe my vision better. When you click the FORCE RECHECK button, for example, it brings up a box on screen with a left and a right pane. It systematically checks all the files in the folders to make sure they match, BUT when it finds changed, deleted, or missing files, it prompts me for what I want to do.

Here is a scenario: my wife goes through the family photos, renames them to include time and date stamps, then accidentally deletes a whole vacation folder of family photos. I would like the program to compare the folders and show me the difference on screen, so I can check a box and select "RESTORE" for the files she accidentally deleted, and select the ones she renamed correctly to update. I am looking for more control. So maybe: LEFT and RIGHT panes to represent this share and the share on the other PC, LEFT ARROW and RIGHT ARROW buttons that copy the selected files over to the other share, and an Ignore button to skip a change now and in future scans. See what I mean? Right now I have no control; if I turn on the other PC I have, it will update the folder and sync up relentlessly with no input from me.

Here is a rough demo example: http://www.ultraedit.com/products/ultracompare.html <----- Watch the video from 1:04 to 1:30. I don't anticipate it being this extravagant, but a basic version of it would be great! Help us all restore files that are deleted, messed up, or corrupted!
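The comparison half of this feature request can be sketched with Python's standard `filecmp` module. This is just an illustration of the idea, not anything the sync client actually ships; `compare_shares` and the report keys are hypothetical names:

```python
import filecmp
import os

def compare_shares(left, right):
    """Recursively summarize differences between two share folders.

    Returns a dict with files found only on the left, only on the
    right, and files present on both sides but with different content.
    """
    report = {"left_only": [], "right_only": [], "changed": []}

    def walk(cmp, prefix=""):
        report["left_only"] += [os.path.join(prefix, n) for n in cmp.left_only]
        report["right_only"] += [os.path.join(prefix, n) for n in cmp.right_only]
        report["changed"] += [os.path.join(prefix, n) for n in cmp.diff_files]
        for name, sub in cmp.subdirs.items():
            walk(sub, os.path.join(prefix, name))

    walk(filecmp.dircmp(left, right))
    return report
```

A GUI like the one described would render `left_only`/`right_only` in the two panes and wire the arrow buttons to copy operations in each direction.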
  3. Yeah, it's like I just said in the post above. The file was there; it was just FILENAME.!ut, meaning it was only partially downloaded, an incomplete file. I didn't indicate anywhere in that post that the actual file was not present in the share. I have no idea where you got that from.

One of the issues I have is timing. The software itself has timing issues, I think. PC1 gets a whole lot of new files in the folder, indexes them (which takes a while), and once done begins distributing the files while PC2 downloads normally. Then PC2 needs a restart. Once booted back up, PC2 begins indexing again, and some incomplete files that were originally named FILE1.ISO and FILE2.IMG are now named FILE1.!ut and FILE2.!ut, and so on. PC2 just indexes them as if they are supposed to be called that, even though the data in them is incomplete. PC2, HAVING THE NEWER INDEX LIST, then shares the broken files named FILE1.!ut and FILE2.!ut back to PC1, corrupting the data completely and destroying the files forever. This has happened to me multiple times already.

My problem is that BitTorrent Sync is RENAMING files while it downloads them. For example, I have a file named Windows.ISO that is 700 MB. When I start downloading it, the file is not named Windows.ISO but Windows.!bt, and it is put in the folder it needs to end up in. The problem arises when the second PC is shut down in the middle of the download. It comes back up, starts re-indexing, and indexes the file named Windows.!bt; because its index list is newer than the first PC's, it sends the corrupted, incomplete file back to the first PC and destroys the original as well. The entire file is lost forever from both PCs.

The reason I suggested a cache folder is that I personally want to maintain the integrity of the actual file structure. I am not OK with large syncing files being renamed to something with the WRONG file extension and running the risk of being indexed that way by the second computer. A cache folder would make sure that data being downloaded with those BitTorrent-specific endings DOES NOT END UP IN MY FILE STRUCTURE AND GET INDEXED by other PCs. That is the point. This only seems to be a big issue when syncing 670,000 files totaling 937 GB of data; a sync like that plays out over a week or so.
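The "never index temp files" fix suggested above could look like the following sketch. The pattern list is an assumption (`.!ut` is µTorrent's partial-file suffix; `.!bt` is the suffix mentioned in the post; `.!sync` is hedged in as a likely client suffix), and `indexable_files` is a hypothetical helper, not real client code:

```python
import fnmatch
import os

# Assumed temp-file patterns a client might create while downloading.
TEMP_PATTERNS = ["*.!ut", "*.!bt", "*.!sync"]

def indexable_files(root, patterns=TEMP_PATTERNS):
    """Yield files under root, skipping in-progress download artifacts.

    Anything matching a temp pattern is treated as incomplete and is
    never handed to the indexer, so a restarted peer cannot propagate
    a half-downloaded FILE1.!ut back over the real FILE1.ISO.
    """
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not any(fnmatch.fnmatch(name, p) for p in patterns):
                yield os.path.join(dirpath, name)
```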
  4. Method 1: Yeah, I already have that done. I have one Windows computer that is the "master share"; it shares to my Raspberry Pi movie player, which is slaved with "Read Only" status. There is a box you can check under Advanced that forces the files to stay the same even if they are deleted or renamed; it will correct them. So I copy stuff to the master drive only, which only I have access to, and the slave, which everyone can access, stays perfectly in sync. This also keeps a redundant copy of my files, so if one hard drive fails, I have a copy of everything on the other one and don't have to re-obtain any missing data. It does require two shared locations or two computers, though.

Method 2: The other way to do what you want is with FILE PERMISSIONS in Windows, Linux, or whatever you run it on. Set read-only permission on the folder for all users except yourself. That way you have to type in your super secret squirrel passcode and identity in order to delete from or copy to that folder. You can Google this process for whatever operating system you are using. This is probably the best way to do it if you have only one huge hard drive to secure and don't want to build a second copy somewhere for redundancy.

Suggested Google searches:
Windows 7 SP1 File Permission Read Only <-------------- Windows 7 is the most common Windows OS in use.
Windows 8.1 File Permission Read Only <---------------- Newer Windows 8. Not very good for sharing.
Fedora 21 Configuring File Permission <---------------- A Linux-based operating system.
SAMBA Configuring File Permission <-------------------- A Linux-based network file-sharing system.
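On Linux, Method 2 can be scripted. Here is a minimal sketch (the `make_read_only` helper is a hypothetical name, not a real tool) that strips the write bits from every file under a folder using Python's standard library. Note this covers POSIX permission bits only; on Windows you would manage NTFS ACLs (e.g. via `icacls`) instead:

```python
import os
import stat

def make_read_only(root):
    """Remove owner/group/other write permission from every file under root.

    After this runs, deleting or overwriting a file requires first
    restoring write permission (i.e. deliberate, authenticated action).
    """
    write_bits = stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            os.chmod(path, mode & ~write_bits)
```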
  5. After a share completes, I would be interested in a button or right-click menu option to automatically run an MD5 integrity check on each file in the share, to make sure the hard drive or the data is not corrupted and that the copy is byte-for-byte identical. I am actually using this as a RAID-style mirror across a 1 TB hard drive and would like to make sure the one folder exactly matches the other. I don't care how long it takes; that is not a factor. I have had corrupted torrents where, after a forced recheck, the client drops the damaged piece and just redownloads a new one to fill the spot. It would be great to have that kind of file security on this system as well. This isn't just about the software; sometimes the hardware, such as a hard drive, goes bad and will not write data correctly to a particular sector or segment. I would like to be able to verify afterward that the data is identical regardless of all else. Would it be too hard to add a VERIFY INTEGRITY button that MD5-hashes the files in the structure systematically?
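The requested check is straightforward with the standard `hashlib` module. A minimal sketch of the idea (hypothetical `verify_mirror` helper, assuming one folder is meant to be an exact mirror of the other):

```python
import hashlib
import os

def md5_of_file(path, chunk_size=1 << 20):
    """MD5-hash a file in 1 MB chunks so large files don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

def verify_mirror(src, dst):
    """Compare every file under src against its counterpart under dst.

    Returns the relative paths of files that are missing from dst or
    whose MD5 digests differ (i.e. silent corruption candidates).
    """
    mismatches = []
    for dirpath, _, filenames in os.walk(src):
        for name in filenames:
            a = os.path.join(dirpath, name)
            rel = os.path.relpath(a, src)
            b = os.path.join(dst, rel)
            if not os.path.exists(b) or md5_of_file(a) != md5_of_file(b):
                mismatches.append(rel)
    return mismatches
```

A "VERIFY INTEGRITY" button would essentially run this and offer to re-transfer anything in the mismatch list.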
  6. I would be interested in a feature that lets me set a download location, either a specific one or some "default" folder in the OS, where files download first; then, as each downloading file completes, it is moved to its proper place in the folder structure. A few issues are being addressed here.

ISSUE #1: "If I see the file, it's complete." I have had a couple of instances where syncing looked done but the client just hadn't re-indexed yet and had not finished every file download. So I take my laptop to the office thinking it's done syncing, only to find a file is there, but it's MYFILENAME.!ut and not yet a complete, functional file. I want the rule to be: if I see the file there, it's complete.

ISSUE #2: Indexing. I have indexed huge file structures in the past, and in the process one PC indexed files that were already *.!ut files, still downloading and incomplete, as if they belonged in the index. I am unsure why. There are several possible fixes. For example, you could use a single index file located on one PC that all sharing PCs copy from. Or you could add a default exclusion for all software-created files, for example a rule to never index *.!ut or any other software-fabricated file ending. Personally, I would not mind a cache folder that files download into, so the *.!ut files are never located inside the actual indexable file structure. Keeping the generated files separate from the indexable files would guarantee that none of my files get damaged in the future by incorrect indexing.

ISSUE #3: Fragmentation. I just want each file to be one contiguous file on disk, that is all. (Maybe just reserve space on disk ahead of time?)

So to solve all the issues with one stone, I think a cache folder would do the trick best! It lets me know for sure that if I see a file, it's complete; it keeps the client's temporary files out of my file structure, preventing fatal data errors and losses; and, if done right, the move of each completed file into its folder can handle the fragmentation issue as well.

Mike!
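The cache-then-move step being requested can be sketched in a few lines. This is an illustration under stated assumptions (the `finish_download` name and both path arguments are hypothetical), not how any sync client actually works; the key point is that `os.replace` is atomic when the cache and the share sit on the same filesystem, so no other process ever observes a half-written file inside the share:

```python
import os
import shutil

def finish_download(cache_path, final_path):
    """Move a fully downloaded file from the staging cache into the share.

    If the cache lives on the same filesystem as the share, os.replace
    is an atomic rename: the file appears in the share all at once,
    already complete, so "if I see it, it's complete" holds. Across
    filesystems, rename fails with OSError and we fall back to a
    copy-and-delete move (not atomic, but still keeps temp names out
    of the share until the copy finishes).
    """
    os.makedirs(os.path.dirname(final_path), exist_ok=True)
    try:
        os.replace(cache_path, final_path)   # atomic on same filesystem
    except OSError:
        shutil.move(cache_path, final_path)  # cross-filesystem fallback
```

Placing the cache directory on the same disk as the share also means the final move is a pure rename, which avoids rewriting (and potentially fragmenting) the data a second time.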