Move Complete Files/Prevent Downloading Again

Alright, I'm not even sure if this is possible, but might as well ask.

What I want to do is (in my scheduled downloads) have files download into a "dropzone" and then automatically move to a different folder upon completion. From what I gather there is a script that can do this (though so far I've failed to make it work; I probably got the arguments wrong). However, I also wonder: is SmartFTP intelligent enough to recognize that it has in fact already downloaded the file/folder? I ask because if I download and then move the files, the next scheduled check will download them AGAIN, slowing down all subsequent downloads.

If preventing it from downloading again isn't an option, how can I set it up to do the following:

Download-->Move-->Copy Back To Download location (to prevent it downloading from the server again)

My final question is: how fast is a move action? Is it instantaneous like a normal Windows move, or is there a delay? (I will be moving folders of roughly 10-40 GB within the same physical volume.) Does SmartFTP handle the move itself, or is it performed by the system?

Sorry for the pile of questions and thanks for any suggestions.

Yes, you can use the post-transfer queue item script to move the file to its final destination.

If you move the file away from the destination, SmartFTP will download the file again once the folder is processed again.

A move operation on the same drive is instantaneous. Your proposed workaround to use a copy operation instead of a move in the post transfer script will work.


Thanks for the quick response. How would I modify the post-transfer move script? I want it to move the file and then copy it back to the original download location, so it's mirrored in two locations on my machine.

I would think you would replace the MoveFile function with a CopyFile function?

That would work if I just wanted a copy, but I want a move and THEN a copy. How could I go about that? I was never terribly good at any sort of programming language.

1. Install the SDK
2. I assume you have already set up a recurring download of a folder.
3. Right-click on the transfer queue item and go to the Script dialog. Select the "ApplyToFiles.js" script from the SDK\Samples\TransferQueueItem\JScript\ folder. In the arguments input box, enter the path to MoveLocalDestinationFileToFolder.js as the first argument and the final destination folder as the second argument.
E.g. "C:\...\etc\SDK\Samples\TransferQueueItem\JScript\MoveLocalDestinationFileToFolder.js" "C:\temp"

Now since you want to copy the files instead of moving them to the destination folder, open SDK\Samples\TransferQueueItem\JScript\MoveLocalDestinationFileToFolder.js and change the MoveFile(...) function to a CopyFile(...) function. See the notes in the script.

How does it work?
- The ApplyToFiles.js script attaches a script to each file that is added to the transfer queue from this particular folder
- The first argument is the script to attach, and the second argument is the argument passed to that script
- The MoveLocalDestinationFileToFolder.js script then runs after a file has been downloaded. Its function is straightforward: it moves/copies the local file to its final destination folder.

Alright, so for the copy back into the original download location, how do I access the original full path of the file (on my local machine)?

The logic:
File Downloads from remote to D:\Dropzone\Pictures
File is moved to D:\Photography
File is copied from D:\Photography back to D:\Dropzone\Pictures

The same file now exists in both locations... but how do I get a reference to D:\Dropzone\Pictures? I noticed "TransferQueueOperation.TransferQueueItem.Destination.Path", but that only seems to be the relative path, since you append it to the destinationFile variable, or am I mistaken about that?

I thought this is what you want to do:
File Downloads (Copy Operation) from remote to D:\Dropzone\Pictures
Copy File from D:\Dropzone\Pictures to D:\Dropzone

No, sorry if I explained it badly. I want the dropzone to be just that: a place where only in-progress transfers live, with files moving to a different location (call it Processing) when complete. I only want to create a duplicate to prevent them from downloading again. It needs to be a move operation out of Dropzone initially, because the software I am using to process doesn't poll well (there's a possibility of damaged files) and so only reacts to files arriving via a move action. Creating the duplicate can be a copy, as it only needs to be faster than the transfer down from the FTP server.

Sorry if it's a little unclear; it is a strange setup, but short of polling the folder and doing parity checks I really don't know a better way to automate the process without risking file damage.

Then you would use exactly what I have proposed:
- Download everything to a temporary folder, e.g. c:\processing
- Once each file is completed, copy it to the final folder, e.g. c:\dropzone

Alright, but would that not leave SmartFTP downloading the files again on the next recurrence of the schedule? The move to processing HAS to be a move, not a copy.

No, the file won't be downloaded again, because the file is being copied and not moved. This is why you have to change MoveFile to CopyFile in the MoveLocalDestinationFileToFolder.js script.

But I CAN'T do that; it NEEDS to be a move into the processing folder, and a copy is too slow.

How big are your files? Copying a 1 GB file on a modern computer shouldn't take more than 30 seconds.

10 GB on the small end; some of these folders and files will stretch to 100 GB or more. Some of what I'm going to be handling is uncompressed raw video footage, so the sizes can get fairly insane fairly quickly.

In this case you need a more sophisticated script that looks into your final drop folder and, if the file already exists there, tells SmartFTP to skip it. If you are interested in custom development (charged by the hour), please contact our sales department. Thanks.

Not something I'm interested in; just a copy back is fine. I have lots of space, and having a few files mirrored for a few days isn't an issue.

What kind of arguments does CopyFile take? Does it just take two strings? So would CopyFile(D:\processing\Mypic.NEF, D:\Dropzone) send the file back?