We are uploading several million small (~20KB) files. When using the 64-bit version of SmartFTP, the queue will grow to 700K - 900K files and consume up to 18GB of memory. Once the 8GB of physical memory is exceeded, SmartFTP slows down from 60,000 files per hour to less than 200 files per hour as the machine spends nearly 100% of its time swapping memory with the page file. Is there a setting to limit the queue size that I haven't found yet? Is there some other setting that will improve this situation?
Unfortunately the queue has not been designed to handle several million active items. Is there a way you can limit the total number of active items in the queue? E.g. by splitting folders with a huge number of items into multiple folders with fewer items each?
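If it helps, the folder split can be automated before queuing the upload. Here is a minimal sketch in Python (the function name, batch size, and folder naming are just illustrative choices, not anything SmartFTP-specific):

```python
import shutil
from pathlib import Path


def split_into_batches(src_dir: str, batch_size: int = 10_000) -> int:
    """Move the files in src_dir into numbered subfolders
    (batch_0000, batch_0001, ...) holding at most batch_size
    files each. Returns the number of batch folders created."""
    src = Path(src_dir)
    files = sorted(p for p in src.iterdir() if p.is_file())
    batches = 0
    for i in range(0, len(files), batch_size):
        dest = src / f"batch_{batches:04d}"
        dest.mkdir(exist_ok=True)
        for f in files[i:i + batch_size]:
            # shutil.move handles cross-device moves as well
            shutil.move(str(f), dest / f.name)
        batches += 1
    return batches
```

You could then queue the batch folders one at a time (or a few at a time) so the active queue stays well below the sizes where paging kicks in.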
I'll look at splitting things up. Would switching to the 32-bit version improve this, since its limited address space (under 4GB) is smaller than the 8GB of physical memory installed on the machine?
I think the 32-bit version would make things worse, because SmartFTP will run out of memory and then terminate automatically.