NickKearns
Joined: 12 Sep 2006 Posts: 2
Posted: Tue Sep 12, 2006 11:05 am  Post subject: Out of memory, Too many Files

Hi,
I get this error after 3,669,999 files have been scanned. I have one folder that contains 18,500 folders, and each of those contains two more folders with the images in them.
Total images: 6.2 million
Total folders: over 55,000
Each day between 10 and 20 new folders are added, along with 5,000 to 20,000 new images. Will this app be able to deal with this?
Regards,
Nick
TGRMN Software Site Admin
Joined: 10 Jan 2005 Posts: 8759
NickKearns
Joined: 12 Sep 2006 Posts: 2
Posted: Tue Sep 12, 2006 11:25 am

I don't think that will be possible due to the way the folders are set up. For example, D:\LiveImages\_lotimages\ contains the 18,500 folders. Each folder is named with a number related to a client ID, like:
1138826
1138835
1145255
Inside each of these there is a Full and a Thumb folder, and the images are inside those. How would I tell the app to select, say, the first 6,000 folders, then have the next profile take the next 6,000, and so on?
Regards,
Nick
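
One way to handle this outside the app is to generate the folder list for each profile with a small script. Below is a minimal sketch in Python, assuming a batch size of 6,000 and made-up output file names; ViceVersa's actual include-list format isn't shown in this thread, so the script just writes one plain folder list per profile.

Code:
# Split the client-ID folders under D:\LiveImages\_lotimages\ into
# fixed-size batches, one folder list per profile.
import os

ROOT = r"D:\LiveImages\_lotimages"
BATCH_SIZE = 6000  # folders per profile, per the example above

# Collect the numeric client-ID folders. Sorting keeps the batches
# stable between runs if new client IDs are always higher than
# existing ones.
clients = sorted(
    e.name for e in os..scandir(ROOT) if e.is_dir() and e.name.isdigit()
) if False else sorted(
    e.name for e in os.scandir(ROOT) if e.is_dir() and e.name.isdigit()
)

for i in range(0, len(clients), BATCH_SIZE):
    batch = clients[i:i + BATCH_SIZE]
    with open(f"profile_{i // BATCH_SIZE + 1}_folders.txt", "w") as f:
        for name in batch:
            f.write(os.path.join(ROOT, name) + "\n")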
TGRMN Software Site Admin
Joined: 10 Jan 2005 Posts: 8759
Posted: Wed Sep 13, 2006 10:54 am

Quote:
How would I tell the app to select, say, the first 6,000 folders, then have the next profile take the next 6,000, and so on?
This is not possible. Somehow you need to organize the profiles so as to limit the number of files that will be scanned in each one.
--
TGRMN Software Support
http://www.tgrmn.com
http://www.compareandmerge.com
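
Since the limit is on how many files get scanned rather than how many folders a profile covers, the batches could also be sized by cumulative file count. A minimal sketch, assuming a ceiling of 3,000,000 files per profile (a guess, based on the roughly 3.67-million-file failure point reported above):

Code:
# Group client folders by cumulative file count so that each profile
# stays below a chosen scan limit.
import os

ROOT = r"D:\LiveImages\_lotimages"
MAX_FILES = 3_000_000  # assumed safe ceiling per profile

def count_files(path):
    # Count all files under path, including the Full/ and Thumb/
    # subfolders.
    return sum(len(files) for _, _, files in os.walk(path))

batches, current, current_count = [], [], 0
for name in sorted(os.listdir(ROOT)):
    path = os.path.join(ROOT, name)
    if not os.path.isdir(path):
        continue
    n = count_files(path)
    # Start a new batch when adding this folder would exceed the limit.
    if current and current_count + n > MAX_FILES:
        batches.append(current)
        current, current_count = [], 0
    current.append(path)
    current_count += n
if current:
    batches.append(current)

for i, batch in enumerate(batches, 1):
    print(f"profile {i}: {len(batch)} folders")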
Jon (Guest)
Posted: Mon Oct 09, 2006 11:14 pm  Post subject: Same problem

Hello,
We also ran into this problem when evaluating ViceVersa Pro. It's really too bad because everything else seems to work well.
Unfortunately, we simply cannot split up the directories in any manageable way. We have large amounts of data being added, removed, and manipulated very frequently. For us to have any hope of running an automated backup, we need to be able to specify near-root-level directories, under which there may be tens of millions of files.
So now I am off to search for a stable solution that will handle:
- incremental backup (mirroring)
- long paths
- large numbers of files
Any suggestions?
Here's a suggestion for you: maybe you should look into storing the comparison data in a database to get past this limitation. If your code is nicely modular, I can't imagine this taking more than a few days to implement... longer to test and debug, of course, but the benefit would be so great.
Hell, call it ViceVersa Enterprise and charge an extra $50. I'd buy it.
-Jon
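
To make Jon's database idea concrete, here is a minimal sketch that keeps the comparison data (path, size, modification time) in SQLite, so memory use stays flat no matter how many files are scanned. It is purely an illustration of the approach, not how ViceVersa works internally; the schema and database file name are made up.

Code:
# Detect new or modified files against an on-disk SQLite index instead
# of holding the whole file list in memory.
import os
import sqlite3

def scan(root):
    # Yield (path, size, mtime) for every file under root, streaming.
    for dirpath, _, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            st = os.stat(p)
            yield p, st.st_size, st.st_mtime

def changed_files(root, db_path="index.db"):
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS files
                   (path TEXT PRIMARY KEY, size INTEGER, mtime REAL)""")
    for path, size, mtime in scan(root):
        row = con.execute(
            "SELECT size, mtime FROM files WHERE path = ?", (path,)
        ).fetchone()
        if row != (size, mtime):  # unknown path, or size/mtime changed
            yield path
            con.execute(
                "INSERT OR REPLACE INTO files VALUES (?, ?, ?)",
                (path, size, mtime),
            )
    con.commit()
    con.close()

# Example: list everything that needs copying since the last run.
for f in changed_files(r"D:\LiveImages"):
    print("copy:", f)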