HTTP Compression -- Any potential issues with doing this?
-
We are thinking about turning on HTTP compression in IIS 6 to help with page load times. Has anyone had any issues with doing this, particularly from an SEO or site-functionality standpoint? We just want to double-check before we take this step, in case there are potential pitfalls we aren't aware of. Everything we've read seems to indicate it can only yield positive results.
Any thoughts, advice, comments would be appreciated.
Thank you,
Matt & Keith
-
Thanks.
-
Thanks.
-
I am aware that IE6 is old and many sites have dropped support for it. Its usage will vary by market. If the fix required 10 minutes of your time, wouldn't you do that for 1% or more of your potential customers?
If you have any Chinese users for instance, you'd want to make it work. Or if you're targeting people who are less tech-savvy or older in age, your IE6 usage numbers are bound to be higher. I agree that for most sites, it's probably not a huge issue. Since I experienced it on our site, I thought I'd mention it. If there is an issue, there is also likely a published fix that would require minimal effort.
-
You do realize that Microsoft has been trying to kill IE6 off, and just recently celebrated IE6 usage in the US dropping below 1%, right?
I wouldn't consider IE6 in your business plans.
-
Once you implement it, one thing I'd check is whether Internet Explorer 6 likes it. I can't remember the details, but when we added compression on our site, there were instances where IE6 didn't handle it well.
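If you want to see for yourself what older clients get once compression is on, here is a rough sketch (Python standard library only; the URL is a hypothetical placeholder) that requests a page the way an IE6-era browser would, and then again with no Accept-Encoding at all, and prints the Content-Encoding, Vary, and Content-Length headers. A missing "Vary: Accept-Encoding" is the classic way compressed pages ended up cached and served to clients that couldn't handle them.

```python
import urllib.request

URL = "http://www.example.com/"  # hypothetical placeholder; use one of your own pages

def probe(accept_encoding, user_agent):
    """GET the page with the given request headers and return the response headers we care about."""
    headers = {"User-Agent": user_agent}
    if accept_encoding is not None:
        headers["Accept-Encoding"] = accept_encoding
    req = urllib.request.Request(URL, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return {name: resp.headers.get(name, "(not sent)")
                for name in ("Content-Encoding", "Vary", "Content-Length")}

# An IE6-era request: old User-Agent string, ordinary Accept-Encoding header.
ie6 = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"
print("IE6-style request: ", probe("gzip, deflate", ie6))

# A client that advertises no compression support: the body should come back uncompressed,
# and the responses should carry "Vary: Accept-Encoding" so caches keep the two apart.
print("No Accept-Encoding:", probe(None, ie6))
```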
-
According to Google's Webmaster blog, Googlebot supports gzip and deflate:
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).
An incompatible compression scheme would be the only downside to turning on compression.
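To confirm your own server handles those values the way the quote describes, here is a minimal sketch (Python standard library only; the URL is a hypothetical placeholder, and the Googlebot user-agent string is only used to label the request) that asks for the same page with each Accept-Encoding value and prints what comes back.

```python
import urllib.request

URL = "http://www.example.com/"  # hypothetical placeholder; substitute one of your own pages
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

for encoding in ("gzip", "x-gzip", "deflate", "identity"):
    req = urllib.request.Request(URL, headers={"User-Agent": UA, "Accept-Encoding": encoding})
    with urllib.request.urlopen(req) as resp:
        # A well-behaved server answers with an encoding it was offered, or sends an
        # uncompressed body (no Content-Encoding header) when only "identity" is acceptable.
        print(f"{encoding:>8} -> Content-Encoding: {resp.headers.get('Content-Encoding', '(none)')}")
```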
Related Questions
-
Fetch as Google issues
Hi all, Recently (well, a couple of months back) I finally got around to switching our sites over to HTTPS. In terms of rankings, all looks fine and we have not moved about much, only the usual fluctuations of a place or two on a daily basis in a competitive niche. All links have been updated, redirects are in place, the usual HTTPS domain migration stuff. I am, however, troubled by one thing! I cannot for love nor money get Google to fetch my site in GSC. No matter what I have tried, it continues to display "Temporarily unreachable". I have checked the robots.txt, and it is on a new https:// profile in GSC. Has anyone got a clue, as I am stumped! Have I simply become blinded by looking too much??? Site in Q: caravanguard co uk. Cheers and looking forward to your comments.... Tim
Technical SEO | TimHolmes
-
2 sites using 1 CMS... issues?
Hi, We are working with a client that has 2 sites in the same sector. They are currently on separate servers, with separate blogs, image galleries, etc. Both sites combined rank for over 200 terms. If we were to "combine" the sites on one CMS, with one IP, two separate front ends, one blog stream and one image gallery, what do you think the SEO impact would be? We had an issue with another client whose sites were too close, and we had to separate them in order to get them both to rank. Further to this, we want both sites to now have their own HTTPS certificate; however, this wouldn't be possible if they were combined. Interested to hear thoughts on this. Thanks
Technical SEO | lauratagdigital
-
All images are noindex will opening this at once be an issue?
Hi, All of our images are noindex; will opening this up all at once be an issue? I'm not sure how, but a few months ago all my images were set to noindex, which I only realized last week. We have 20K images that were indexed fine, but now when I check site:sitename it shows 10 or 12, and when I inspect the element via Chrome I see that noindex is set for all images. We have been renaming the images and adding ALT tags for most of them. Would it be an issue if we removed the noindex in one shot, or should we do them a few at a time? Thanks
Technical SEO | mtthompsons
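For the image question above, one thing worth confirming before flipping anything back is where the noindex is actually coming from: for image files it is commonly applied via an X-Robots-Tag response header rather than markup on a page. Here is a small, hedged sketch (Python standard library only; the image URLs are hypothetical placeholders) that checks a sample of images for that header.

```python
import urllib.request

# Hypothetical image URLs; swap in a handful of your real ones.
IMAGE_URLS = [
    "http://www.example.com/images/product-1.jpg",
    "http://www.example.com/images/product-2.jpg",
]

for url in IMAGE_URLS:
    req = urllib.request.Request(url, method="HEAD")  # headers only, no image download
    with urllib.request.urlopen(req) as resp:
        tag = resp.headers.get("X-Robots-Tag", "(no X-Robots-Tag header)")
        print(f"{url} -> {tag}")
```
-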
Issue: Duplicate Page Content
Hi All, I am getting warnings about duplicate page content. The pages are normally 'tag' pages. I have some blog posts tagged with multiple 'tags'. Does it really affect my site? I am using WordPress and the Yoast SEO plugin. Thanks
Technical SEO | KLLC
-
Product Duplicate Content Issue with Google Shopping
I have a site with approx. 20,000 products. These products are resold to hundreds of other companies and are fed from one database, so the content is duplicated many, many times. To overcome this, we are launching the site with noindex meta tags on all product pages. (In phase 2 we will begin adding unique content for every product, eek!) However, we still want them to appear in Google Shopping. Will this happen, or will it have to wait until we remove the noindex tags?
Technical SEO | FGroup
-
Drupal issue
Hi SEOmozzers again, One of my clients uses Drupal (CMS) and I have an issue when editing any pages. I access the edit section of the page, try to insert meta description tags, save, and view the page source, and no meta description tags appear! Why is that? Is there a specific setting that I need to implement? Under Meta Tags, Drupal apparently adds canonical tags by default (unless I can tweak one of the settings), and I would like to remove them. The weird thing is that even though there are no canonical tags set, when viewing the page source I can still locate a canonical tag. Is there a setting that allows me to remove the canonical by default? Thank you, mozzers :)
Technical SEO | Ideas-Money-Art
-
SEO-MOZ bar question on root vs subdomain / canonicalization issues
When I look at the SEO-MOZ bar for our site and click next to "subdomain" (# links from # domains), it shows my main incoming links, etc., but when I click on "root domain" it shows mydomain/default.asp and 4 incoming links, as well as a message that says this URL redirects to another URL. Does this imply canonicalization issues, or is there a 301 redirect to my non-/default.asp URL correcting this issue? Thanks kindly, Howard
Technical SEO | mrkingsley
-
Multiple URLs in CMS - duplicate content issue?
So about a month ago, we finally ported our site over to a content management system called Umbraco. Overall, it's okay, and certainly better than what we had before (i.e. nothing - just static pages). However, I did discover a problem with the URL management within the system. We had a number of pages that existed as follows: sparkenergy.com/state/name. However, they now exist within certain folders, like so: sparkenergy.com/about-us/service-map/name. So we had an aliasing system set up whereby you could call the URL basically whatever you want, which allowed us to retain the old URL structure. However, we have found that the alias does not override the new URL, but just adds another option for finding a page, which means the same pages can open under at least two different URLs, such as http://www.sparkenergy.com/state/texas and http://www.sparkenergy.com/about-us/service-map/texas. I've tried pointing to the aliased URL in other parts of the site with the rel canonical tag, without success. How much of a problem is this with respect to duplicate content? Should we bite the bullet, remove the aliased URLs, and do 301s to the new folder structure?
Technical SEO | ufmedia
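For the Umbraco question above, it can help to verify what each URL variant actually serves before deciding between rel=canonical and 301s. Below is a rough diagnostic sketch (Python standard library only; the URLs are the two variants quoted in the question and may have changed since it was asked) that prints the final URL after any redirect, the status code, and any rel="canonical" tag found in the returned HTML.

```python
import re
import urllib.request

# The two variants mentioned in the question; the site may have changed since it was asked.
VARIANTS = [
    "http://www.sparkenergy.com/state/texas",
    "http://www.sparkenergy.com/about-us/service-map/texas",
]

for url in VARIANTS:
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
        canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
        print(url)
        print("  final URL:", resp.geturl())   # differs from the requested URL if a redirect fired
        print("  status:   ", resp.status)
        print("  canonical:", canonical.group(0) if canonical else "(no rel=canonical found)")
```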