Hi,
We are actively using ASP.NET Core for production sites. We are super happy with Response Compression and the overall performance. However, for some controllers we need to turn off compression - specifically, in our case, Bing and Google Webmaster Tools do not support Gzip/Brotli-compressed sitemaps.
Question: Is there a directive we can add to disable compression for certain pages/controllers to accommodate Bing/Google? e.g.
```csharp
[ResponseCompression=false]
[Route("/sitemap.xml")]
public async Task<IActionResult> IndexAsync() {
    // code here.
}
```
> Bing and Google Webmaster Tools do not support Gzip/Brotli compressed sitemaps.
If they don't support them, then the requests they make won't specify them in the `Accept-Encoding` header, and thus the middleware won't compress. Have you looked at their requests to confirm what they're sending in `Accept-Encoding`?
I'll close as answered.
@guardrex - that is the official answer I got from being in contact with them; basically, one of the Bing support engineers wrote "Bing doesn't support gzip..." - which I find odd in any case.
That should be ok then ... I mean they aren't going to make requests with an `Accept-Encoding` header carrying a gzip or brotli value, so the middleware won't compress responses to them ... the app should just send back a plain application/xml response (if that's what you're sending back ... i.e., a sitemap). You should be :+1:.
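If you ever did need to force compression off for just that endpoint, one sketch (there's no built-in per-action switch that I know of, so this is a workaround, not official API) is to strip the `Accept-Encoding` request header for the sitemap path before the compression middleware runs; `/sitemap.xml` here is the route from your example:

```csharp
// In Startup.Configure, ordered *before* UseResponseCompression:
// with no Accept-Encoding value on the request, the middleware
// leaves the response uncompressed for this path only.
app.Use(async (context, next) =>
{
    if (context.Request.Path.StartsWithSegments("/sitemap.xml"))
    {
        context.Request.Headers.Remove("Accept-Encoding");
    }

    await next();
});

app.UseResponseCompression();
app.UseMvc();
```

Middleware order matters here: the header has to be removed before the compression middleware inspects the request.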
@guardrex Okay I understand that.
I appreciate your comments though 👍
Hum ... that's strange.
I'd capture that header from them ... log it ... inject an `ILogger<T>` into that controller and put something like the following in the method that builds your sitemap ...

```csharp
_logger.LogError(
    $"Sitemap request: Accept-Encoding: {Request.Headers["Accept-Encoding"].ToString()}");
```
That's down-and-dirty logging; note that I'm logging at the Error level here to make sure the entry gets logged when the app is in the Production environment with the default logging levels.
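Pulled together, it looks something like this (the controller name and `BuildSitemapXmlAsync` are placeholders for whatever your app actually uses):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public class SitemapController : Controller
{
    private readonly ILogger<SitemapController> _logger;

    // The logger is injected by the framework's DI container.
    public SitemapController(ILogger<SitemapController> logger)
    {
        _logger = logger;
    }

    [Route("/sitemap.xml")]
    public async Task<IActionResult> IndexAsync()
    {
        // Error level so the entry survives the default Production log filters.
        _logger.LogError(
            $"Sitemap request: Accept-Encoding: {Request.Headers["Accept-Encoding"].ToString()}");

        return Content(await BuildSitemapXmlAsync(), "application/xml");
    }

    // Stand-in for your real sitemap builder.
    private Task<string> BuildSitemapXmlAsync() =>
        Task.FromResult("<urlset></urlset>");
}
```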
It would be bad for them to submit the header with, for example, gzip, and then reject a compressed sitemap sent to them. That would be kind'a mean. :smile:
Otherwise, your responses aren't compressed by the middleware, so there would be something else causing them to be rejected.
Hehe, it would be mean indeed, but I suspect it's something else. It is very strange - we even asked them a few times, "are you really sure it's a compression problem, because some of the other sitemaps are indeed working?".
This particular sitemap is a bit larger - something like 1 MB, while the others are 200-500 KB. The Bing tool also reports a count of all the links in that very same failing sitemap.
Nice thinking there with the logger - I will definitely try that suggestion and will keep you in the loop just for closure, thanks @guardrex :)
Sure thing ... and to check the app itself, try a tool like Fiddler or Postman. Send the `Accept-Encoding` header in some requests for the sitemap and omit it in others ... you'll see that the middleware either compresses or doesn't compress the response based on the header value(s) sent.
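The same check can be scripted with `HttpClient` from a console app (the example.com URL is a placeholder for your site); with the header sent you should see `gzip` echoed back in `Content-Encoding`, and without it, nothing:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class CompressionCheck
{
    static async Task Main()
    {
        var url = "https://www.example.com/sitemap.xml"; // placeholder

        // Request advertising gzip: the middleware should compress.
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Accept-Encoding", "gzip");
            var response = await client.GetAsync(url);
            Console.WriteLine(
                $"With header:    {response.Content.Headers.ContentEncoding.FirstOrDefault() ?? "(none)"}");
        }

        // No Accept-Encoding: the middleware should leave the body alone.
        using (var client = new HttpClient())
        {
            var response = await client.GetAsync(url);
            Console.WriteLine(
                $"Without header: {response.Content.Headers.ContentEncoding.FirstOrDefault() ?? "(none)"}");
        }
    }
}
```

Note that `HttpClient` won't auto-decompress here because `AutomaticDecompression` isn't enabled; that's fine, since we only care about the response headers.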
Good luck with it .... Hang in there! I'm pull'in for ya!
Indeed, it looks like `gzip, deflate` is supported by Bingbot:
```
Wed, 06 Mar 2019 04:04:13 GMT
#### SITEMAP REQUEST, 207.46.13.56 ####
Cache-Control: no-cache
Connection: Keep-Alive
Pragma: no-cache
Accept: */*
Accept-Encoding: gzip, deflate
Host: www.example.com
User-Agent: Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
MS-ASPNETCORE-TOKEN: 821d74ca-e8a6-41ae-a9ff-e1c5609e92de
X-Original-Proto: http
X-Original-For: 127.0.0.1:54691
```
Response from the server, captured with Fiddler:
```
HTTP/1.1 200 OK
Transfer-Encoding: chunked
Content-Type: text/xml
Content-Encoding: gzip
Vary: Accept-Encoding
Server: Kestrel
```
Will await Bing engineer's response ;)
@guardrex - finally got an answer from a Bing engineer:
Compression was not the issue - we had 3 duplicate links out of 12,000 links.
Thanks again 🥇