Transfer all files in a directory... Scheduled!

I have an FTP location that will continually be adding new files. I need to transfer whatever files are in that directory every 10 minutes and then delete or move anything that has already been transferred so I do not download any duplicates.

I know how to schedule transfers for specific file names but do not know how to transfer anything in the directory at a given time. PLEASE HELP...

Is this possible with SmartFTP?

Great. That looks like it will work. I will try that out today.

Is there a way to delete all files from the FTP location once they have been transferred? Or to move them to an archive folder so I don't transfer them again on the next schedule?

Yes. The easiest way is to change the operation in the transfer queue properties from Copy to Move. Then enable the "Keep Folder" option in the Advanced dialog.

If you want to move the files to a different folder (instead of deleting them) you need to use a post transfer script. Take a look at the SmartFTP SDK:
This is an advanced topic and you need to have a certain background with software development.

I have downloaded the SDK and followed the steps in the Knowledge Base post. ( ... f2634.html)

The post talks about moving a file on the local computer ("C:\"). I am unsure what to put into the argument field. I am transferring files from a remote server to my local server, and I want to archive what is on the remote server. I have copied the path of the remote server and appended a folder, /downloaded, to the end of the path.

When I process the action, it copies the file to my server and then tries to move the file but fails; the status column keeps saying "retry". I think I am doing something wrong with the path for the move folder, but I don't see what.

Any suggestions? I would like to archive rather than delete.

Thanks for all of your help. This has been very useful!

The SDK/Scripting is an advanced topic and unfortunately we cannot provide support for it.

Ok. I have figured it out. Whewww.

Maybe you can help me with this one last thing. I will tell you how I have it set up and you can tell me if there is a better way.

I am trying to transfer all .TXT files in a folder on a remote server to a folder on my local server every 10 minutes. The way I have it set up is to transfer the entire folder to my local server with a filter that says only *.TXT. Then I have SmartFTP move that entire folder up a level into another "archive" folder. So it works.

However, I do not want to copy the entire folder to my local server and I do not want to move the folder into another archive folder. I DO want to move the files within the folder.

Is there a way to do that?

In theory you do the following:
- There is an event after the folder has been enumerated (see transfer queue scripts).
- In this event, enumerate all transfer queue items; if an item is a file and matches *.txt, add a script to that transfer queue item.
- In the transfer queue item script, in the end event, create a new transfer queue item which moves the file on the remote server to another folder, and add this new item to the queue.
- So first the file is downloaded, then the script is executed, which adds another item to the queue to move the file.
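Outside the SmartFTP SDK, the same download-then-archive flow from the steps above can be sketched in plain Python with the standard ftplib module. This is only an illustration of the idea, not SmartFTP's own scripting; the host, credentials, and folder names are placeholders.

```python
import fnmatch
import posixpath
from ftplib import FTP

def matches_txt(name):
    """Same filter the queue script would apply: only *.txt files (case-insensitive)."""
    return fnmatch.fnmatch(name.lower(), "*.txt")

def archive_path(name, archive_dir="/downloaded"):
    """Remote path the file is renamed to after a successful download."""
    return posixpath.join(archive_dir, name)

def download_then_archive(ftp, source_dir="/incoming", archive_dir="/downloaded"):
    """Download every *.txt file, then move it into the remote archive folder."""
    ftp.cwd(source_dir)
    for name in ftp.nlst():
        if not matches_txt(name):
            continue
        with open(name, "wb") as local_file:
            ftp.retrbinary("RETR " + name, local_file.write)
        # Renaming on the server is the FTP equivalent of "move to archive",
        # so the file is not picked up again on the next scheduled run.
        ftp.rename(name, archive_path(name, archive_dir))

# Typical use (placeholder connection details):
#   with FTP("ftp.example.com") as ftp:
#       ftp.login("user", "password")
#       download_then_archive(ftp)
```

Because the rename only happens after the download completes, a file that fails to transfer stays in the source folder and is retried on the next run.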

Unfortunately we cannot do custom development for free.

Ok. That is understandable. I have made a workaround: I will simply move the files from the remote server to the local server with SmartFTP and use a local VBS file to archive and move the files around as needed.

One thought I had earlier: what happens to files that are still being written to the folder while the transfer runs? Are they left alone, or will I get errors? Have you heard of this issue before?

Please add the license key id to the profile for technical support. Thank you.

I'm quite interested in this solution. Is there any way to send a private message to another user here?