Compression is a Go
So I continued my search for on-the-fly compression for a while yesterday and found nothing else, so I've set up Apache 2.0 w/ mod_deflate (yes, it does gzip too) on the remote server. For the downloading machine (which I want to script), I found cURL. If you run cURL with --compressed, it will attempt to download the file using compression.
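For reference, a minimal mod_deflate setup might look something like this. This is just a sketch: the module path varies by install, and serving .bak files as application/octet-stream is an assumption about the server's MIME config.

```apache
# Load mod_deflate (path varies by distribution)
LoadModule deflate_module modules/mod_deflate.so

<IfModule mod_deflate.c>
    # Compress matching responses on the fly; application/octet-stream
    # covers generic binary files like database backups (assumption:
    # the .bak files are served with that MIME type)
    AddOutputFilterByType DEFLATE application/octet-stream
</IfModule>
```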
My tests were all done locally (since I don't care about the actual speed in the end, just the compression percentage), and they went like this. Original file (MSSQL backup): 212 megs. RAR'd: 10% of original size. Apache w/ gzip: 20% of original size. This is really, really good in my opinion, because I'm going to be using it to back up 15 gigs of databases. RAR'ing those 15 gigs took about 2.5 hours (note: this doesn't include the hour or so it takes to download the RAR), which is about the same amount of time it will take to download those files if they are run through Apache w/ gzip.
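The local test boils down to comparing the compressed size against the original. A rough sketch of that measurement (the server URL and filename here are made up; the sample file is just highly compressible zeros, so your ratio on real backups will differ):

```shell
# The actual scripted download would be something like
# (hypothetical host and path):
#   curl -O --compressed http://backup.example.com/db_backup.bak

# Measure a gzip compression ratio locally, like the test above:
head -c 1048576 /dev/zero > sample.bin      # 1 MiB sample file
gzip -c sample.bin > sample.bin.gz          # compress, keep original
orig=$(wc -c < sample.bin)
comp=$(wc -c < sample.bin.gz)
echo "compressed to $((100 * comp / orig))% of original"
```

Note that `--compressed` only helps when the server actually negotiates gzip (which is what the mod_deflate setup provides); otherwise cURL just downloads the file as-is.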
Overall savings (compared to no compression): Bandwidth: 13 gigs. Time: 12 hours.