
#1 FoC Feature Request - Please do this before ANYTHING!

Rehashing files that haven't changed is making the program almost unusable at times.

I just ran a search on 1.69 million files (5 TB in total) that took 200,000 seconds (about 55 hours) just to compute the hashes.

Over the next few days I'll be purging duplicate files, but I won't be changing many files, if any, so the hashes that have already been computed will all remain valid.

But as soon as I need to run another search, I will have to wait another 2.3 days for the hashes to be computed.

This is not only slow, it also heats up my CPU and drives, and the waste heat is annoying!

PLEASE find a way to record the date/timestamp alongside each hash and only recompute a hash when its file has actually changed!
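
All I'm really asking for is a cache keyed on path, size, and modification time. A rough Python sketch of the idea (purely illustrative; the file names and layout are made up, this is not FoC's actual code):

```python
import hashlib
import json
import os

CACHE_FILE = "hash_cache.json"   # hypothetical cache location

def load_cache(path=CACHE_FILE):
    """Load previously computed hashes, if a cache file exists."""
    try:
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {}

def save_cache(cache, path=CACHE_FILE):
    """Write the cache back to disk for the next run."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(cache, f)

def file_hash(path, cache):
    """Return the file's hash, reusing the cached value when the
    size and modification time have not changed."""
    st = os.stat(path)
    key = os.path.abspath(path)
    entry = cache.get(key)
    if entry and entry["mtime"] == st.st_mtime and entry["size"] == st.st_size:
        return entry["hash"]                   # unchanged -> no rehash
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    cache[key] = {"mtime": st.st_mtime, "size": st.st_size, "hash": h.hexdigest()}
    return cache[key]["hash"]
```

On a second run over the same unchanged files, only the stat calls and cache lookups would be needed, which is exactly the saving I'm after.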

In Ticket # 5713-5006022313 (in 2010) I suggested either using a config file that stores the date/timestamp and hash, or writing that information as a secondary stream attached to each file.
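
For the secondary-stream variant, an equally rough sketch (NTFS-only; the stream name is made up, and the original timestamps are restored afterwards because writing a stream can touch the file's modified time):

```python
import hashlib
import os

STREAM = "sb.hashinfo"   # hypothetical stream name, not anything FoC uses

def hash_file(path):
    """Hash the file's contents in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def store_hash(path):
    """Hash the file and record mtime + hash in an alternate data stream."""
    st = os.stat(path)
    digest = hash_file(path)
    # "file:stream" opens an alternate data stream on NTFS volumes.
    with open(f"{path}:{STREAM}", "w") as s:
        s.write(f"{st.st_mtime_ns}|{digest}")
    # Writing the stream can touch the file's timestamps; put them back.
    os.utime(path, ns=(st.st_atime_ns, st.st_mtime_ns))
    return digest

def cached_hash(path):
    """Reuse the stored hash if the file is unchanged, otherwise rehash."""
    try:
        with open(f"{path}:{STREAM}", "r") as s:
            mtime_ns, digest = s.read().split("|", 1)
        if int(mtime_ns) == os.stat(path).st_mtime_ns:
            return digest                      # unchanged -> no rehash
    except (OSError, ValueError):
        pass                                   # no stream yet, or unreadable
    return store_hash(path)
```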

There has got to be a better way than recomputing the hash each and every time!

Please make it a priority to explore options for addressing this situation.
Attachment: FoC 2.3 day search.PNG
