[GLLUG] TCP/IP protocol efficiency
Lachniet, Mark
mlachniet at analysts.com
Wed Jan 24 12:46:21 EST 2007
You might try wget.exe on the Windows side - it can be used to batch-transfer files over HTTP pretty efficiently. It's one of the files I always keep on my web site for when I'm at a customer location and need a tool:
http://lachniet.com/wget.exe
You can compile it to work with SSL, but I don't know if the one on my site does.
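For the batch case in this thread, a recursive wget fetch along these lines should do it; the URL, directory layout, and .dpx extension are assumptions, so adjust them to the real server:

```shell
# Mirror one directory of frames over HTTP.  -r recurses, -np stops wget
# from climbing into parent directories, -nH drops the hostname directory
# from the local path, and -A restricts the download to the frame files.
wget -r -np -nH -A '*.dpx' http://192.168.1.10/scans/
```

This needs the server to publish a directory index (any web server's default listing will do) so wget has links to follow.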
Mark Lachniet
Technical Director, Security Group
Analysts International
(517) 336-1004 (voice)
(517) 336-1160 (fax)
mailto: mlachniet at analysts.com
From: Mike Szumlinski
Sent: Wed 1/24/2007 12:16 PM
To: Charles Ulrich
Cc: linux-user at egr.msu.edu
Subject: Re: [GLLUG] TCP/IP protocol efficiency
Realistically, raw FTP would probably work. Is there an easy way to
curl multiple files over an HTTP connection to a Windows box? The
SuSE machine is a telecine and the files are individual film scans,
so the average transfer is going to be in the thousands of files.
It's much harder to click 6800 times than to put up with the slow
speed we are getting over sftp. Security isn't an issue at all; sftp
was just up and running by default on the machine and I didn't have
to do any config or start any more daemons. The machine doesn't even
have internet access; it's on its own network transferring files to
a Windows machine.
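On the curl question: curl can expand a numeric range inside the URL itself, so sequentially numbered frames come down with a single command. The hostname, path, and zero-padded frame names below are assumptions to illustrate the syntax:

```shell
# Fetch frame0001.dpx through frame6800.dpx in one invocation; curl
# expands the [0001-6800] range itself, and -O saves each file under
# its remote name.
curl -O "http://telecine-host/scans/frame[0001-6800].dpx"
```

If the names aren't strictly sequential, the wget recursive approach is the easier fit.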
Unfortunately, this machine is in California, so I have no access to
it and I'm walking people through this who don't have a ton of Linux
knowledge. Oh yeah, and I'm running Fedora Core because I don't have
a copy of SuSE, and now that it's a pay product I can't just go
download it.
So far I'm thinking straight FTP or HTTP might be the answer, but I'm
not sure how to get a multitude of files using HTTP. I thought about
tarballing them, but 8k or 9k 12MB frames would suck up a
considerable amount of space and time to make into a tarball just to
speed up the transfer.
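One way around the intermediate tarball is to stream tar straight over the network, so nothing extra ever hits the disk on either side. This sketch assumes a netcat binary on the Windows end (e.g. under Cygwin) and an open port; the host name and port are placeholders:

```shell
# On the receiving (Windows) box: listen on a port and untar the stream.
nc -l -p 9000 | tar xf -

# On the SuSE telecine: tar the scan directory to stdout and pipe it out.
# No tarball is ever written; tar and nc stream through a pipe.
tar cf - frames/ | nc windows-host 9000
```

The same trick works over ssh (tar cf - frames/ | ssh host 'tar xf -') if netcat isn't available, though that brings back the encryption overhead.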
-Mike
On Jan 24, 2007, at 11:45 AM, Charles Ulrich wrote:
> On Wednesday 24 January 2007 11:23, Mike Szumlinski wrote:
>> I've currently gotten involved with a project that is requiring I
>> reshare thousands of 12MB files out over TCP/IP from an XFS volume
>> on SuSE for use with Windows systems. I have tried sftp and found
>> the transfer speed to be somewhat less than acceptable over gig-e and
>> was wondering if anyone had any input into another method of sharing
>> out the files that would be A) easier and B) faster. Obviously samba
>> comes to mind, but I'm wondering if there are other easier ways to
>> tune this connection. Any suggestions/ideas would be much
>> appreciated.
>>
>> -Mike
>
> Samba is a CPU hog for transfers over fast connections; I'd recommend
> against it. HTTP is pretty much going to be the fastest protocol out
> of the box, especially for very large files, since the headers are
> only sent once per file. If encryption is necessary, there may not be
> a super-fast solution, though I think HTTPS may be faster than SFTP.
> With SFTP, the encryption algorithm you use is definitely going to be
> a bottleneck at gig-e speeds. I'm not sure what choices you have and
> what's available, but perhaps see if you can choose a faster
> algorithm before switching protocols entirely.
>
> --
> Charles Ulrich
> Ideal Solution, LLC -- http://www.idealso.com
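To try the cipher suggestion above without switching protocols, OpenSSH lets you pick the cipher per connection; arcfour and blowfish-cbc were the usual fast choices on 2007-era builds. The host and path here are placeholders, and you should check which ciphers your server actually offers:

```shell
# Ask for a cheaper cipher for a single sftp session.
sftp -oCiphers=arcfour user@telecine-host

# scp takes the same idea via -c.
scp -c blowfish-cbc 'user@telecine-host:/scans/*.dpx' .
```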
_______________________________________________
linux-user mailing list
linux-user at egr.msu.edu
http://mailman.egr.msu.edu/mailman/listinfo/linux-user