Scheduler for duplicated files

Hi All,

I am currently using SmartFTP to connect to my SFTP server and fetch reports from the server to my PC. The built-in scheduler automates the downloading of these reports. However, there's one issue I'd like to raise in the hope that some of you can help me resolve it.

My SFTP server is structured in such a way that once a file has been downloaded, the original file stays on the server.

For example:
Files A and B are on the server.

The client downloads file A from the server, and a copy of file A is saved to the local PC.
However, file A stays on the server but cannot be downloaded again.

The built-in scheduler then tries to download the same file (file A) repeatedly, resulting in a download loop. But due to the server's inherent behavior, the file cannot be downloaded again even though it remains on the server.

Is there a way I can configure the scheduler so that it won't try to auto-download files that have already been transferred?

Thanks in advance!

The scheduler is stateless and doesn't know the details of the actions. You need to find another way:
- Move the file on the server to another folder after it has been transferred
- Write a script for the transfer queue item that adds every transferred file to a database and, if the same file is about to be transferred again, skips it
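To illustrate the second suggestion, here is a minimal sketch of the "remember what was already transferred" idea using a small SQLite database. This is not SmartFTP's actual scripting API (the function names here are hypothetical helpers); it only shows the check-then-record logic you would adapt to your client's transfer queue script:

```python
# Sketch of tracking transferred files in a database so repeats are skipped.
# Assumption: these helper names are illustrative, not a SmartFTP API.
import sqlite3

def open_db(path="transferred.db"):
    """Open (or create) the database that remembers transferred files."""
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS transferred (name TEXT PRIMARY KEY)")
    return con

def should_download(con, filename):
    """Return True only if this file has not been transferred before."""
    row = con.execute(
        "SELECT 1 FROM transferred WHERE name = ?", (filename,)
    ).fetchone()
    return row is None

def mark_downloaded(con, filename):
    """Record a successful transfer so the same file is skipped next time."""
    con.execute("INSERT OR IGNORE INTO transferred (name) VALUES (?)", (filename,))
    con.commit()
```

Before each scheduled transfer the script would call `should_download`; after a successful transfer it would call `mark_downloaded`, so the next scheduler run skips that file.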

Thanks for the reply, mb. Moving the file on the server is not possible, as it would require quite a big enhancement on the IT side. Your second suggestion seems a lot more feasible, but I'm no script writer. Could you give me some guidance on how to write a script for the transfer queue item that adds files to a database and skips a file if it has already been transferred? I'd need some sort of sample script or guide to read up on, and an explanation of how exactly to run such a script in SmartFTP.

Many thanks for your help.

Scripting is provided as is and is not covered by the support plan. We do offer custom development (charged by the hour at the standard industry rate).