There are several tools to check for memory leaks, but most of them target experienced developers. Just by looking at Task Manager it is impossible to find memory leaks. We can spot memory leaks relatively easily with the debug version of SmartFTP, but to do this we need a reproducible case. The fact that 139k items use 900 MB of memory is not by itself evidence of a memory leak.
OK, what do you suggest we do to keep this program from eating up 3 GB of RAM and maxing out my CPU?
Let's say it takes 900 MB to load a 138 MB file containing 139k items. How can the memory usage more than triple (to over 3 GB) when the total number of files in the folder is only 182k (32% more)?
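Just to make the mismatch concrete, here's the arithmetic behind that complaint (using 3 GB as a round figure for "over 3 GB"; all numbers are from the posts above, not measured):

```python
# Item count vs. memory growth, per the figures in this thread.
items_before, mem_before_mb = 139_000, 900       # initial load: 139k items, 900 MB
items_after, mem_after_mb = 182_000, 3 * 1024    # later: 182k items, "over 3 GB"

item_growth = items_after / items_before - 1     # how much the item count grew
mem_growth = mem_after_mb / mem_before_mb - 1    # how much the memory grew

per_item_before_kb = mem_before_mb * 1024 / items_before
per_item_after_kb = mem_after_mb * 1024 / items_after

print(f"items grew {item_growth:.0%}, memory grew {mem_growth:.0%}")
print(f"per-item cost went from {per_item_before_kb:.1f} KB to {per_item_after_kb:.1f} KB")
```

So a roughly 31% bump in item count more than doubles the per-item memory cost, which is the part that looks wrong.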
Also, is it necessary to show the user that many files in the listview? When I created a program with a listview that looked like SmartFTP's, it added about 14 MB of RAM for every 10k items in the list (around 200 MB for 140k files). A setting to limit the number of items displayed in the listview could save a decent amount of memory.

When I watch SmartFTP's memory usage, I see it jump when a large number of items are added, but when the item count drops considerably, the memory only drops by about 10% of the increase. So if the queue jumps from 1k to 8k items and uses 60 MB more RAM, and then the queue drops back down to 1k, I only see about 6 MB freed. My assumption is that the times I do see memory usage go down, it's because memory is being freed as the listview shrinks, while whatever arrays/collections/objects the program uses are not shrinking or being destroyed; or maybe it's building a string or XML log in memory and not writing it to disk to free that memory.
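The "limit what the listview holds" suggestion is basically what a virtual (owner-data) list control does: the UI only materializes the rows actually scrolled into view, while the full queue stays in a cheap backing store. A minimal sketch of the idea in Python (the class and field names here are made up for illustration, not anything from SmartFTP):

```python
class QueueItem:
    """Heavyweight per-row object, the kind a listview might hold for each item."""
    def __init__(self, name, size, status):
        self.name = name
        self.size = size
        self.status = status
        self.display_text = f"{name}  {size:>12,}  {status}"

class VirtualView:
    """Keeps the full queue as plain tuples; builds rich rows only on demand."""
    def __init__(self, rows, window=100):
        self.rows = rows      # full queue: (name, size, status) tuples
        self.window = window  # how many rows the UI actually displays

    def visible_items(self, first):
        # Only the rows currently scrolled into view become full objects;
        # the other ~139,900 stay as cheap tuples.
        return [QueueItem(*r) for r in self.rows[first:first + self.window]]

# 140k queued files, but only a 100-row page of objects ever exists at once.
rows = [(f"file_{i}.dat", i * 1024, "queued") for i in range(140_000)]
view = VirtualView(rows, window=100)
page = view.visible_items(0)
print(len(page))  # 100
```

The native equivalents are the Win32 list-view's owner-data mode (`LVS_OWNERDATA` with `LVN_GETDISPINFO`) or WinForms' `ListView.VirtualMode`, where the control asks you for row text on demand instead of storing every item itself.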
EDIT: I just now saw memory drop by about 200 MB, so maybe it does actually flush some stuff out, very rarely... This was around the time it found quite a few differences between the FTP site and the local site and actually started uploading. Maybe some checks that free up memory only fire when a file is uploaded, as opposed to when the hashes match and nothing is uploaded? There still has to be a way to work with a very large queue without using 3 GB of RAM.