Channel: 2BrightSparks

Re: Avoid perpetual FTP re-scanning on every option change

Conrad Chung wrote: As per the following Knowledge Base article - http://support.2brightsparks.com/knowledgebase/articles/215257-syncbackse-pro-scans-with-fast-backups

the reason we need to rescan the Destination is to stop/detect moving of the goalposts (that is, to make sure all the target files are where they belong)

In a real-world scenario I want to update a website over FTP (as I mentioned). I keep a local dev copy with some files/folders that need to be excluded, etc.
The FTP site is touched only by me, and it is only ever updated by me (through SB). I really don't care what's on the FTP site now; it is in the state I left it last time, and nobody else modifies it.
As per your example with the moving target file:
If I move a file locally to another folder, SB could detect that in two steps:
1) The file in its original position is missing (deleted), so it will either be deleted from the FTP at that position (if deletions are allowed) or ignored/skipped.
2) The file will be noted in its new position (folder) as new and uploaded to the same new position on the FTP.

Clean and straightforward.
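As a rough illustration of those two steps, here is a minimal sketch (not SyncBack's actual algorithm; the snapshot format and names are my own invention) of how a move shows up as a delete plus an add when two local scans are compared:

```python
# Hypothetical sketch: a local move detected as delete + add by comparing
# the previous scan snapshot to the current one.
# Snapshots map relative path -> (size, modified-time); values illustrative.

def diff_snapshots(previous, current):
    """Return (deleted, added) path sets between two local scans."""
    prev_paths = set(previous)
    curr_paths = set(current)
    deleted = prev_paths - curr_paths   # step 1: gone from the old position
    added = curr_paths - prev_paths     # step 2: appeared in a new position
    return deleted, added

previous = {"site/index.html": (1024, 1700000000),
            "site/old/logo.png": (2048, 1690000000)}
current = {"site/index.html": (1024, 1700000000),
           "site/img/logo.png": (2048, 1690000000)}  # logo.png was moved

deleted, added = diff_snapshots(previous, current)
# 'deleted' tells the FTP side what to remove (if deletions are allowed);
# 'added' tells it what to upload to the new location.
```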
All I'd want is for SB to keep a local list of all the remote FTP's folders/files and "trust" this list, as I would explicitly give it the right to do so (because I make no modifications to the FTP from elsewhere). That way it won't rescan the remote FTP constantly, which takes a very long time; it will just update the local fast-backup data to keep it synchronized after every modification it makes to the FTP.
I'm sure many use cases would benefit, even in non-FTP networks.
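The trusted-cache idea above could be sketched roughly like this (a toy illustration under my own assumptions; the file name, JSON layout, and function names are hypothetical, not anything SB provides): the remote listing lives in a local file and is updated after every upload or delete, so no rescan is ever needed.

```python
# Hypothetical sketch of the "trusted local cache" idea: keep the remote
# FTP listing in a local JSON file and update it after every upload/delete
# instead of rescanning the server. Names and structure are illustrative.
import json
import os

CACHE_FILE = "remote_state.json"

def load_cache():
    """Load the trusted local copy of the remote listing."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    return {}  # first run: empty (or seeded by one initial scan)

def save_cache(cache):
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)

def record_upload(cache, path, size, mtime):
    """Called after a successful upload, so the cache stays in sync."""
    cache[path] = {"size": size, "mtime": mtime}
    save_cache(cache)

def record_delete(cache, path):
    """Called after a successful remote delete."""
    cache.pop(path, None)
    save_cache(cache)
```

The key design point is that the cache is only ever written after a remote operation succeeds, which is exactly why it can be trusted only when nothing else touches the server.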

That would serve the actual "sync" part of SyncBack's name.
Everything here is about avoiding the extremely slow rescanning of the FTP when I know it is safe to just upload/delete files there.
Whether or not a file is already present in a remote folder, it will be uploaded/replaced based on the Source's modifications/additions, etc.

Conrad Chung wrote: Thank you for your suggestions. However, I foresee a problem in a situation where somebody copies a file to the Source today which has a LastModified date/time stamp of last week, but he last ran the profile yesterday... In such a situation, it would ignore that file forever (never backing it up).

Some other software can already do this (though it cannot do other things SB can). You can still scan the Source for new files and include them, and whether a file is new or existing can be detected by comparing the trees/files against the fast-backup data we already have. The file timestamp could be just another "OR" factor in the decision.
But I proposed this feature only because of the annoying constant FTP scanning. If that is somehow resolved, all of this "last run datetime" logic could be dropped.
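To make the "OR" factor concrete, here is a toy sketch (my own illustration, not SB's logic; the entry format is assumed) showing how a tree comparison catches the problem file from Conrad's example even when its LastModified stamp predates the last run:

```python
# Hypothetical sketch: a file needs uploading if it is absent from the
# stored fast-backup tree OR its size/timestamp differs OR its timestamp
# is newer than the last run. The tree check catches files copied in
# with an old LastModified date, which a timestamp-only check would miss.

def needs_upload(path, size, mtime, fast_backup_data, last_run_time):
    entry = fast_backup_data.get(path)
    if entry is None:
        return True                        # not in the stored tree: new file
    if (entry["size"], entry["mtime"]) != (size, mtime):
        return True                        # content appears changed
    return mtime > last_run_time           # extra "OR" factor: fresh stamp

fast_backup_data = {"a.txt": {"size": 10, "mtime": 100}}
# "b.txt" was copied in with a week-old timestamp (50 < last run 200),
# yet it is still detected because it is missing from the stored tree.
```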
