SmartFTP crashing while listing a folder with >60000 files

Hi mb,

I'm using the latest version of SmartFTP to make regular backup synchronizations of a website (using XCRC, XMD5, ...) via the Transfer Queue. I have some folders with several tens of thousands of images. One folder has 20000 files and everything works fine. The problem is another folder with >60000 files. Even listing its contents takes some time (which is normal). I've adjusted the connection timeout and now SmartFTP gets the listing without disconnecting; however, it crashes while sorting all the files (the listing is more or less 4MB [uncompressed]).

Can you please check this bug out?

I'm also having problems with connection reuse, but you already told us that will be fixed in the next beta.

Added a hard limit for the sorting of items (10'000). There is no setting to change the limit yet.
https://www.smartftp.com/download

The Quick Sort we are using causes a stack overflow with a large data set.
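
For illustration only (this is not SmartFTP's actual code): a quicksort that recurses into both partitions can reach a recursion depth close to the number of items on unfavourable input, which is enough to overflow the stack for a 60'000-entry listing. The usual remedy is to recurse only into the smaller partition and loop on the larger one, which caps the depth at O(log n):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Minimal depth-bounded quicksort sketch. Recursing into the smaller
// partition and iterating on the larger one keeps the stack usage
// logarithmic even for very large, badly ordered listings.
template <typename T>
void Quicksort(std::vector<T>& a, std::ptrdiff_t lo, std::ptrdiff_t hi) {
    while (lo < hi) {
        const T pivot = a[lo + (hi - lo) / 2];  // middle pivot avoids the sorted-input worst case
        std::ptrdiff_t i = lo, j = hi;
        while (i <= j) {                        // Hoare-style partition
            while (a[i] < pivot) ++i;
            while (pivot < a[j]) --j;
            if (i <= j) { std::swap(a[i], a[j]); ++i; --j; }
        }
        if (j - lo < hi - i) {
            Quicksort(a, lo, j);                // recurse into the smaller left part
            lo = i;                             // loop on the larger right part
        } else {
            Quicksort(a, i, hi);                // recurse into the smaller right part
            hi = j;                             // loop on the larger left part
        }
    }
}

// Usage: Quicksort(entries, 0, static_cast<std::ptrdiff_t>(entries.size()) - 1);
```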

Regards,
-Mat

Thanks for your quick reply and action, mb.

What does that mean? Does it mean that, for now, SmartFTP won't be able to manage folders with more than 10000 files, or that it will sort folders with more than 10000 files in several steps of 10000 files at a time? Is sorting the items a requirement for SmartFTP to be able to download files, or is it just an interface/GUI thing?

Hello ..

It won't sort a directory listing with more than 10'000 files anymore. The items are usually sorted according to the priority list and other factors before they are added to the transfer queue.
-Mat


Do you plan to change/fix the Quick Sort anytime soon?

No. I think the current solution is good.

Are you a licensed customer?

-Mat

Yes, I am a licensed customer.

I would appreciate it if you could fix this limitation, because I'm using Gene FTP Server + SmartFTP Client to synchronize backup versions of several big websites.

By the way, doesn't the stack overflow with large data sets you mentioned also occur if (despite the hard limit you applied for folders with >10000 items) you have several folders with <10000 items each in the queue and the total number of items in the Transfer Queue reaches, say, 60000 items? I'm almost sure I got a stack overflow again this way... (CPU load reached 100% with nothing being transferred anymore and the Transfer Queue holding >60000 items.)

Regards.

Please add your license key number to your forum profile.

As explained before, the limitation does not really affect anything. A directory with more than 10'000 items is simply not sorted anymore, because a) it takes way too long and b) a stack overflow may occur. All files are still added to the Queue and are transferred nevertheless. I tested the Transfer Queue with more than 60'000 items without any problems.
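
To illustrate the idea (hypothetical names, not the actual SmartFTP code), the limit is conceptually nothing more than a size check in front of the sort; oversized listings are passed through in server order and queued anyway:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// Illustrative sketch only: skip the client-side sort once a listing exceeds
// the hard limit. The items still end up in the Transfer Queue and are
// transferred; they simply keep the order the server returned them in.
struct QueueItem {
    std::string name;
    int priority = 0;
};

constexpr std::size_t kSortLimit = 10000;  // hard limit mentioned above

void SortListingIfSmallEnough(std::vector<QueueItem>& items) {
    if (items.size() > kSortLimit)
        return;  // too large: leave the listing unsorted, queue it anyway
    std::sort(items.begin(), items.end(),
              [](const QueueItem& a, const QueueItem& b) {
                  return a.priority > b.priority;  // higher priority first
              });
}
```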

Regards,
SmartFTP

Sorry mb,

I misunderstood what you told me previously. Now I get it.

I'll run a new test, because I'm sure something is still crashing SmartFTP during the synchronization I'm doing. Maybe it's caused by Windows Server 2003 R2 Standard, Remote Desktop usage, the use of multiple download threads... I don't know, but I'll try to figure it out and let you know. When SmartFTP crashes I don't get any error message; however, I noticed that SmartFTP generates minidump files with 0 bytes.

Regards, and sorry again for my misunderstanding.

Hi again mb,

I've already tried several combinations and have concluded that using more than 1 thread makes the problem worse: if I have several subfolders with tens of thousands of files each, then when the files of one folder run out you end up with more than 1 thread opening 1 subfolder each, which results (in my case) in almost 90000 files in the Transfer Queue.

I don't know if I already told you, but I'm using XCRC, XMD5 and XSHA1 to compare existing files in the Queue and for transfer integrity checks.

Today I tried using only 1 thread (which makes this operation very slow). Although I can get the work done, it takes more than 18 hours when it could be 3 hours or even less (I have really high-speed Internet connections). The problem is that for some reason the CPU load is always at 100%, and SmartFTP is using 170MB of RAM plus another 170MB of virtual memory for 60000 files in the Transfer Queue. Also, the queue.dat file is almost 2.7MB.

If you want, I can send you my queue.dat and you will see for yourself what is going on...

Thanks again mb!

And what exactly is the problem?

The resource usage is pretty normal if you deal with so many items in the Transfer Queue.

Regards,
SmartFTP

Problems:
1. It's not possible to use more than 1 thread because the transfer spends MUCH more time stopped, processing something, than it spends transferring files.
2. Even if you use only 1 thread, the transfer stops for 5 minutes after each batch of 100~200 files it transfers (which takes 70 seconds).
3. Queue list saving, and pausing or stopping the queue, take around 10~15 minutes to complete.

Regards

>1. It's not possible to use more than 1 thread because the transfer spends MUCH more time stopped, processing something, than it spends transferring files.
I didn't notice such problems. Please provide more details if this still happens with the new version. I was testing with 60k items and 8 threads. A larger number of threads does not necessarily improve performance.

>2. Even if you use only 1 thread, the transfer stops for 5 minutes after each batch of 100~200 files it transfers (which takes 70 seconds).
The autosave feature saves the queue every 5 minutes (a rough sketch of such a periodic autosave follows below). Queue saving has been significantly improved in the new version; it now takes 6 seconds for 60k items on my 2GHz P4M.

>3. Queue list saving, and pausing or stopping the queue, take around 10~15 minutes to complete.
Saving has improved (see above). Starting should be immediate, without any delays.
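
For illustration, a 5-minute autosave like the one mentioned in point 2 could look roughly like this (hypothetical names, a sketch rather than the actual implementation): a background thread periodically serializes the queue, so a crash or a killed process loses at most the last interval of progress.

```cpp
#include <chrono>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <thread>

// Sketch of a periodic autosave: a worker thread wakes up every 5 minutes
// and calls the supplied save callback (e.g. writing queue.dat to disk).
class QueueAutosaver {
public:
    explicit QueueAutosaver(std::function<void()> saveQueue)
        : save_(std::move(saveQueue)), worker_([this] { Run(); }) {}

    ~QueueAutosaver() {
        {
            std::lock_guard<std::mutex> lock(m_);
            stop_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }

private:
    void Run() {
        using namespace std::chrono_literals;
        std::unique_lock<std::mutex> lock(m_);
        // Wait 5 minutes or until shutdown; on every timeout, save the queue.
        while (!cv_.wait_for(lock, 5min, [this] { return stop_; }))
            save_();
    }

    std::function<void()> save_;  // serializes the transfer queue
    std::mutex m_;
    std::condition_variable cv_;
    bool stop_ = false;
    std::thread worker_;          // started last, after the other members
};

// Usage: QueueAutosaver autosaver([] { /* write queue.dat */ });
```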

Latest version:
https://www.smartftp.com/download

Regards,
-Mat

Mat,

Congratulations! You made a MASSIVE improvement over the previous version!

As I told you before, with the previous version a complete synchronization took over 18 hours (using 1 thread). With the latest version I could use 6 threads and the complete synchronization took 2~3 hours! I think SmartFTP didn't stop even once after it started the synchronization! I've also noticed that, as you wrote in your reply, the auto save function is now working perfectly. Before, if I killed SmartFTP I had to start the synchronization from the beginning, because SmartFTP couldn't auto save the transfer queue in time.

Well, now I've spotted another problem. I'm using SmartFTP through Remote Desktop, and from the moment I started the synchronization and minimized SmartFTP (a regular minimize, not minimize to the system tray) I could never restore the SmartFTP window again, even after the synchronization had ended. The queue.dat file was back to 1KB (I have a recurrence set in the transfer queue), I had no window open inside SmartFTP, and it was loading the CPU at around 50% while also using 60MB of RAM. The only things I could see of the SmartFTP window were the window title and the 3 window buttons on the right side of the title bar.

I don't know if you have already noticed, but when the transfer queue is in use (especially with multiple threads), rendering it through Remote Desktop loads the CPU and RAM heavily; that's the reason why I minimized SmartFTP. If I hide the transfer queue the excessive CPU and RAM load also goes away, but minimizing the SmartFTP window is easier than hiding only the transfer queue.

I've made the new problem bold so that it's easier for you to spot.

Once again, congratulations for the great improvements you made.

I'm glad it works better for you now.

I tried to reproduce the problem, but without success. I'm connecting to a Windows 2003 Terminal Server and the CPU usage is the same as when running locally. If the application window is minimized it will always use less CPU, regardless of whether it's shown in a remote terminal or locally.
Are you able to reproduce this problem with fewer items and fewer threads as well? It would help if you could figure out what exactly triggers it.

Thanks
-Mat
SmartFTP

Hi Mat,

SmartFTP crashed once and I've sent a crash report with my e-mail address and customer number. Maybe the crash report contains some data that can help you.

I've finally figured out exactly what is causing my problem. The problem occurs every time I disconnect and reconnect the Remote Desktop connection while SmartFTP is running on the remote computer.

Please follow these steps to reproduce the problem:
- open SmartFTP through Remote Desktop (you can leave it idle or not; it makes no difference whether you minimize the SmartFTP window or not);
- disconnect from remote desktop;
- reconnect to remote desktop;
- try to restore (or do anything else with) the SmartFTP window.

Regards and have a good weekend.

I was just able to reproduce the problem with the TS as well. Do you have the URL Watcher disabled? If so, try enabling it and try again.

Regarding the crash: please send the minidump, which is saved in the SmartFTP application directory/MiniDump, to sales attt smartftp.com.

Thank you.

Regards,
-Mat

I guess you made a typo, right? But yes, you found it...

If I have the URL Watcher enabled, the problem does occur. However, having it disabled while doing a TS disconnect/reconnect prevents the loop that causes the window not to be rendered properly after a TS disconnect.

Well done once again!

Hm, interesting. It's just the other way around here. I will find a solution by Monday.

Regards,
-Mat

The bug should be fixed in the latest version:
https://www.smartftp.com/download

The problem was that our application didn't correctly unregister from the clipboard chain.
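
For reference (an illustrative Win32 sketch, not our actual code): a window that joins the legacy clipboard-viewer chain with SetClipboardViewer() has to leave it again with ChangeClipboardChain() and forward the chain messages; skipping the unregister step leaves a stale link in the chain, which is the kind of breakage that showed up after a Terminal Services disconnect/reconnect.

```cpp
#include <windows.h>

// A URL-watcher-style feature joins the clipboard viewer chain so it is
// notified on every clipboard change. The crucial part is WM_DESTROY:
// without ChangeClipboardChain() the chain keeps a dangling entry.
static HWND g_hwndNextViewer = NULL;

LRESULT CALLBACK WatcherWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        // Join the clipboard viewer chain and remember our successor.
        g_hwndNextViewer = SetClipboardViewer(hwnd);
        return 0;

    case WM_DRAWCLIPBOARD:
        // Clipboard changed: inspect it here, then pass the message on.
        if (g_hwndNextViewer)
            SendMessage(g_hwndNextViewer, msg, wParam, lParam);
        return 0;

    case WM_CHANGECBCHAIN:
        // Another viewer left the chain: patch our successor pointer.
        if ((HWND)wParam == g_hwndNextViewer)
            g_hwndNextViewer = (HWND)lParam;
        else if (g_hwndNextViewer)
            SendMessage(g_hwndNextViewer, msg, wParam, lParam);
        return 0;

    case WM_DESTROY:
        // The step that was missing: leave the chain cleanly.
        ChangeClipboardChain(hwnd, g_hwndNextViewer);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```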

Thanks for the reports and the testing ;-)

Regards,
-Mat

Hi Mat,

I've been using that beta over the last few days and everything works as expected now!

I'm sending that minidump to the e-mail address you provided.


Regards and thanks.

You won't believe this, Mat...

I've installed the latest beta version and the problem with the URL Watcher + Remote Desktop disconnect and reconnect is back... This time the way you described it! If I have the URL Watcher disabled, I get that loop when I disconnect and reconnect Remote Desktop.


Edit #1:

In fact I think there is another problem. I was using the beta version you fixed for me until a few hours ago, when I upgraded to the latest beta with the fix for the Compression Exceptions. The problem with the URL Watcher + Remote Desktop disappeared after I killed the looping SmartFTP process, reopened SmartFTP + enabled the URL Watcher + closed SmartFTP, then reopened SmartFTP + disabled the URL Watcher + closed SmartFTP. After this procedure I've had no more loops. So it might be a missing Var+Value entry in the Registry or in a SmartFTP config file...

The problem seems to occur only when you are upgrading to a new version. If you run the setup for the same version you already have installed, it does not occur.

The buggy version broke the clipboard chain. Therefore you have to log out of (not just disconnect from) your terminal session first for it to be fixed. Then you can use the new version without any problems.

Regards,
-Mat