
How to download large files faster and more reliably

Trying to download that 30M file for the third time? How many times have you caught yourself praying that your dial-up connection won't go down when the download is 99% complete? Well, here's a somewhat complicated, yet effective, method that just might reduce your downloading headaches...

Let's say you're connected to the Internet using a 28.8K modem and that your Internet service provider lets you telnet to their server. Your provider's server can download files much faster than you can, so the idea here is to download the file to their server at high speed, split it into pieces there, and then retrieve the pieces at the highest speed your own connection allows. Generally speaking, this method is not only faster, it is also more reliable, since you'll be downloading many smaller files rather than one large file.

First, find out the FTP address of the large file (say, larger than 16M) that you want to download, for example:

ftp://ftp.x.com/bigfile.zip

Telnet to your Internet service provider's server.

telnet x.com

From within the telnet session, download the file to your provider's server. Following is a screen shot of a sample file transfer on a UNIX compatible server; the text you type follows each prompt.

ftp
ftp> open ftp.x.com
Connected to ftp.x.com.
220 ftp.x.com FTP Service (Version 2.0).
User (ftp.x.com:(none)):
anonymous
331 Anonymous access allowed, send identity (e-mail address) as password.
Password:
my_address@x.com
ftp> binary
200 Type set to I.
ftp> get bigfile.zip
200 PORT command successful.
150 Opening BINARY mode data connection for bigfile.zip
226 Transfer complete.
ftp> quit
221 Thank you for using x.com!
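
Back at the shell prompt, it's a good idea to confirm that the whole file arrived; ls -l shows the file's size in bytes, which should match the size reported by the remote FTP site:

ls -l bigfile.zip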

Once the file is downloaded to the server, the next step is to divide it into smaller pieces, so that you won't waste too much time re-downloading should your Internet dial-up connection go down.

split -b 2000 bigfile.zip
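
The 2000 byte pieces above match the small example used later in this tutorial; for a real 30M file you would want a much larger piece size so you don't end up with thousands of pieces. Assuming your version of split accepts a k (kilobyte) suffix, as the GNU version does, the following would create roughly 60 pieces of 500K each:

split -b 500k bigfile.zip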

You're almost there! Now simply log off from your telnet session and start downloading the smaller files (not bigfile.zip itself) which the above split command created, from your provider's server to your computer; one convenient way to transfer them in a batch is shown below.
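
For example, if your terminal program supports the ZMODEM protocol and your provider's server has the commonly available sz utility installed, you can send all of the pieces to your computer in one batch (the -b switch forces a binary transfer):

sz -b x??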

Once you have all the files, simply merge them and you'll have a good copy of the original bigfile.zip. If you're using DOS or Windows, you can merge the files using the following command (assuming that bigfile.zip, a 16000 byte file, was divided into eight 2000 byte files named xaa to xah):

copy /b xaa xab xac xad xae xaf xag xah bigfile.zip
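
If you'd rather merge the pieces on a UNIX compatible machine, cat does the same job; the shell expands x?? in alphabetical order, which is the order split created the pieces in:

cat x?? > bigfile.zip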

Does it sound unlikely that the above steps could make your downloads faster or more reliable? Well, in most cases they do, especially if the file you're downloading is rather large. This method can be faster because you can usually download files from your provider's computer much faster than from a remote site; and it is more reliable because, should your connection drop, re-downloading a small piece wastes far less time than resuming (or restarting) one large download.

Finally, don't forget to log back into your Internet service provider's server and delete the temporary files (bigfile.zip and xaa..xah) once you've made sure that your downloaded copy is in good condition.
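
Since the file in this example happens to be a ZIP archive, one quick way to check its condition is to test it with your unzip program; most can verify an archive without extracting it. With PKUNZIP under DOS, for example:

pkunzip -t bigfile.zip

Once the test passes, a single command on the server removes the leftovers:

rm bigfile.zip x??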

 
 
Applicable Keywords: FTP, Mini Tutorial

Copyright © 2009 Chami.com. All Rights Reserved.