[GLLUG] TCP/IP protocol efficiency

Michael Watters michael at watters.ws
Wed Jan 24 12:55:44 EST 2007


Mike Szumlinski wrote:
> Realistically, raw ftp would probably work.  Is there an easy way to 
> curl multiple files over an http connection to a Windows box?  The 
> SUSE machine is a telecine and the files are individual film scans, so 
> the average transfer is going to be in the thousands of files.  Clicking 
> 6800 times would be much more painful than putting up with the slow 
> speed we are getting over sftp.  Security isn't an issue at all; sftp 
> was just up and running by default on the machine, and I didn't have 
> to do any config or start any more daemons.  The machine doesn't even 
> have internet access; it's on its own network, transferring files to a 
> Windows machine.
>
> Unfortunately, this machine is in California, so I have no access to 
> it, and I'm walking people through this who don't have a ton of Linux 
> knowledge.  Oh yeah, and I'm running Fedora Core because I don't have 
> a copy of SUSE, and now that it's a pay product I can't just go 
> download it.
>
> So far I'm thinking straight FTP or HTTP might be the answer, but I'm 
> not sure how to fetch a multitude of files over HTTP.  I thought about 
> tarballing them, but 8,000 or 9,000 12 MB frames would eat up a 
> considerable amount of space and time to make into a tarball just to 
> speed up the transfer.
>
> -Mike 
If you're just serving static files, check out nginx or thttpd; they're 
both faster than Apache for static content.  The nice thing about thttpd 
is that throttling is built right in, and you can throttle by file type.
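
If memory serves, you hand thttpd a throttle file with its -t option, 
and each line is a wildcard pattern plus a bytes-per-second limit.  A 
minimal sketch (the path, extensions, and rates below are made-up 
placeholders, not anything from your setup):

  # /etc/thttpd/throttle.conf -- hypothetical location
  **              500000     # cap everything else around 500 KB/s
  **.dpx|**.tif   5000000    # let the film scans run around 5 MB/s

  thttpd -d /srv/scans -t /etc/thttpd/throttle.conf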

wget can download an entire directory at once, or you could use rsync; 
both have Windows versions available.
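
For example (the hostname and paths here are invented), pulling a whole 
directory tree from the Windows side might look like:

  wget -r -np -nH http://telecine/scans/

where -r recurses, -np keeps it from climbing into parent directories, 
and -nH drops the hostname directory.  With rsync over ssh it would be 
something like:

  rsync -av user@telecine:/data/scans/ ./scans/

which also lets you re-run the same command later and copy only the 
frames that are new.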


