Many sites use gzip to compress their content so that it downloads faster. This is a good thing. It makes the web browser work a bit harder (the content needs to be decompressed before it can be used), but it shortens the time it takes to show users a web page.
However, we've sometimes come across sites that are misconfigured – perhaps they check in two places whether a browser supports gzip compression – and end up compressing the content twice. The effect is that when we've decompressed it once and expect to have HTML to show the user, we still have a mess of binary data. These sites show up as gibberish in Opera – no text, no links, entirely unreadable and unusable. The most recent major site we saw doing this was cars.com, but it appears to have corrected the problem now.
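To make the failure concrete, here is a minimal Python sketch of what a double-gzipping server produces. Python is just standing in for the browser's decompression path here, and the HTML string is only an illustration:

```python
import gzip

html = b"<html><body>Hello</body></html>"

# What a misconfigured server sends: the body compressed twice.
double_gzipped = gzip.compress(gzip.compress(html))

# What the browser does: decompress once, expecting HTML.
once_inflated = gzip.decompress(double_gzipped)

print(once_inflated[:2])       # b'\x1f\x8b' -- still a gzip stream, not HTML
print(once_inflated == html)   # False: the page renders as binary gibberish
```

Note that after one decompression the body still starts with the gzip magic bytes 0x1f 0x8b, which is what makes the workaround below possible.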
I think this problem is usually caused by server-side browser sniffing. We're now considering changing Opera to detect double-gzip scenarios and decompress a second time when one is found, as sketched below.
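A rough sketch of that workaround, again in Python rather than Opera's actual code; the function name and structure are mine, not the real implementation:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # the two bytes every gzip stream starts with

def inflate_response_body(body: bytes) -> bytes:
    """Decompress a gzip-encoded response body; if the result still looks
    like a gzip stream, assume a double-gzipping server and decompress
    once more."""
    data = gzip.decompress(body)
    if data.startswith(GZIP_MAGIC):
        data = gzip.decompress(data)
    return data
```

An ordinary HTML document can't begin with those two bytes, so the heuristic should rarely misfire – though a response that legitimately decodes to a gzip file (say, a .tar.gz download served with Content-Encoding: gzip) would be a false positive.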
With this test I tried to figure out what other browsers do, expecting that some of them might have run into the same problem. Surprise – all browsers fail. Or perhaps it isn't that surprising, given that unpacking twice is a somewhat weird thing to do. So does this problem really only affect Opera? And should we respond by doing the slightly weird unpack-twice thing that no other browser does?