download problems

1 reply to this topic

#1 drasey

  • Members
  • 2 posts
  • Location:
    San Diego

Posted 13 February 2003 - 04:06 AM

I'm having problems with large (300+ MB) downloads that aren't coming in properly. I've been checking these downloads with MD5 (comparing against the sum published by whoever I download from), and they generally fail.
Last night, I tried using command line FTP, and the download that I had just tried a few minutes previously (which failed the MD5), came in fine.
I used a program called ncftp.
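For anyone else wanting to do the same kind of check, here's a rough sketch of how a command-line MD5 comparison might look. The filename and the expected sum are placeholders, not from my actual downloads (the expected value below is just the well-known MD5 of empty input, used so the example is self-checking):

```shell
#!/bin/sh
# Compare a downloaded file's MD5 sum against the published value.
# Placeholders: "download.iso" stands in for the real file, and the
# expected sum here is the MD5 of empty input, for demonstration only.
expected="d41d8cd98f00b204e9800998ecf8427e"

# Stand-in for the real download (an empty file, matching the sum above).
: > download.iso

# md5sum prints "<sum>  <filename>"; keep just the sum.
actual=$(md5sum download.iso | awk '{print $1}')

if [ "$actual" = "$expected" ]; then
    echo "MD5 OK"
else
    echo "MD5 MISMATCH: got $actual, expected $expected"
fi
```

A mismatch on a fresh download is usually a sign of corruption somewhere between the server and the disk, which is what sent me down this road in the first place.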

I'm not sure if this is a new problem (though I've only noticed it over the last few months), and it appears to happen with both the current version (1.0.973) and the previous release.

I've been adding the download to my Global Queue.
I've got 'On File Exist' for downloads set to resume.
My resume rollback is set to 1024 (was smaller, same problems).
My I/O buffer size is 384 (was 64, same problem).
I've got Ascii/Binary set to Auto.

This doesn't seem to be server dependent (though I'll be checking whether the servers are running the same software/version).

Does anyone have any advice on how I can troubleshoot this further?


#2 drasey

  • Members
  • 2 posts
  • Location:
    San Diego

Posted 02 March 2003 - 11:45 PM

Just wanted to answer my own question!

It seems my download problem was likely due to some bad RAM. Downloading with SmartFTP looks to have been one of the first indications of a problem; other symptoms started to manifest themselves afterwards.

I finally tracked it down to at least one bad stick (maybe two; I replaced all of them).

Since then I have downloaded about six different large files (600-700 MB) and have not had a single problem with them.

In case anyone cared!

