How to properly handle a gzipped page when using curl?

Bash, Curl, Gzip

Bash Problem Overview


I wrote a bash script that gets output from a website using curl and does a bunch of string manipulation on the HTML output. The problem arises when I run it against a site that returns its output gzipped. Going to the site in a browser works fine.

When I run curl by hand, I get gzipped output:

$ curl "http://example.com"

Here's the header from that particular site:

HTTP/1.1 200 OK
Server: nginx
Content-Type: text/html; charset=utf-8
X-Powered-By: PHP/5.2.17
Last-Modified: Sat, 03 Dec 2011 00:07:57 GMT
ETag: "6c38e1154f32dbd9ba211db8ad189b27"
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Cache-Control: must-revalidate
Content-Encoding: gzip
Content-Length: 7796
Date: Sat, 03 Dec 2011 00:46:22 GMT
X-Varnish: 1509870407 1509810501
Age: 504
Via: 1.1 varnish
Connection: keep-alive
X-Cache-Svr: p2137050.pubip.peer1.net
X-Cache: HIT
X-Cache-Hits: 425
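(For reference, a response-header dump like the one above can be produced with curl itself: -D - writes the received headers to stdout while -o /dev/null discards the body. The URL is the same placeholder used above.)

$ curl -s -D - -o /dev/null "http://example.com"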

I know the returned data is gzipped, because this returns HTML, as expected:

$ curl "http://example.com" | gunzip

I don't want to pipe the output through gunzip unconditionally, because the script works as-is on other sites, and piping everything through gunzip would break it on the sites that return plain HTML.
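One way to keep a single code path is to decompress only when the server says it compressed the response, by branching on the Content-Encoding header. A minimal sketch, using the placeholder URL from above (note that -I issues a HEAD request, which some servers answer differently than a GET):

url="http://example.com"
# Fetch just the headers and look for a gzip Content-Encoding.
if curl -sI "$url" | grep -qi '^content-encoding:.*gzip'; then
    body=$(curl -s "$url" | gunzip)
else
    body=$(curl -s "$url")
fi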

What I've tried

  1. changing the user-agent (I tried the same string my browser sends, "Mozilla/4.0", etc)
  2. man curl
  3. google search
  4. searching stackoverflow

Everything came up empty

Any ideas?

Bash Solutions


Solution 1 - Bash

curl will automatically decompress the response if you set the --compressed flag:

curl --compressed "http://example.com"

> --compressed
> (HTTP) Request a compressed response using one of the algorithms libcurl supports, and save the uncompressed document. If this option is used and the server sends an unsupported encoding, curl will report an error.

gzip is most likely supported, but you can check this by running curl -V and looking for libz somewhere in the "Features" line:

$ curl -V
...
Protocols: ...
Features: GSS-Negotiate IDN IPv6 Largefile NTLM SSL libz 

Note that it's really the website in question that is at fault here. If curl did not pass an Accept-Encoding: gzip request header, the server should not have sent a compressed response.
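To see what --compressed changes on the wire, dump the request headers with -v (which writes to stderr, hence the redirect). The exact algorithm list depends on which decompression libraries your curl build includes; the output below is just an example:

$ curl --compressed -s -v -o /dev/null "http://example.com" 2>&1 | grep '^> Accept-Encoding'
> Accept-Encoding: deflate, gzip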

Solution 2 - Bash

In the relevant bug report, Raw compressed output when not using --compressed but server returns gzip data #2836, one of the curl developers says:

> The server shouldn't send content-encoding: gzip without the client having signaled that it is acceptable.

> Besides, when you don't use --compressed with curl, you tell the command line tool you rather store the exact stream (compressed or not). I don't see a curl bug here...

So if the server could be sending gzipped content, use --compressed to let curl decompress it automatically.
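If you need to be defensive about servers that compress without being asked (as the one in the question does), you can also test the downloaded bytes themselves: every gzip stream starts with the magic bytes 1f 8b. A minimal sketch, again with the placeholder URL:

url="http://example.com"
tmp=$(mktemp)
curl -s "$url" -o "$tmp"
# gzip streams begin with the two magic bytes 1f 8b.
if [ "$(head -c 2 "$tmp" | od -An -tx1 | tr -d ' ')" = "1f8b" ]; then
    gunzip -c "$tmp"
else
    cat "$tmp"
fi
rm -f "$tmp"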

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type       Original Author   Original Content on Stackoverflow
Question           BryanH            View Question on Stackoverflow
Solution 1 - Bash  Martin            View Answer on Stackoverflow
Solution 2 - Bash  cweiske           View Answer on Stackoverflow