HTTP Compression -- Any potential issues with doing this?
-
We are thinking about turning on IIS 6 HTTP compression to help with page load times. Has anyone had any issues with doing this, particularly from an SEO or site functionality standpoint? We just want to double-check before taking this step to see whether there are any potential pitfalls we may not be aware of. Everything we've read seems to indicate it can only yield positive results.
Any thoughts, advice, or comments would be appreciated.
Thank you,
Matt & Keith
-
Thanks.
-
Thanks.
-
I am aware that IE6 is old and many sites have dropped support for it. Its usage will vary by market. If the fix required 10 minutes of your time, wouldn't you do that for 1% or more of your potential customers?
If you have any Chinese users, for instance, you'd want to make it work. Or if you're targeting people who are less tech-savvy or older in age, your IE6 usage numbers are bound to be higher. I agree that for most sites, it's probably not a huge issue. Since I experienced it on our site, I thought I'd mention it. If there is an issue, there is also likely a published fix that would require minimal effort.
-
You do realize that Microsoft has been trying to kill IE6 off, and just recently celebrated IE6 usage in the US dropping below 1%, right?
I wouldn't consider IE6 in your business plans.
-
Once you implement it, one thing I'd check is whether Internet Explorer 6 handles it properly. I can't remember the details, but when we added compression on our site, there were instances where IE6 didn't like it.
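If you want to sanity-check this once compression is on, a rough standard-library Python sketch like the one below shows what an IE6-style client is actually sent; the URL is a placeholder for one of your own pages, and it only inspects the response headers and confirms the gzipped body decompresses, so it is a quick spot check rather than a substitute for testing in the browser itself.

```python
import gzip
import urllib.request

# Placeholder URL: substitute a page from your own site.
URL = "http://www.example.com/"

# Send an IE6-style User-Agent together with an explicit Accept-Encoding
# header, so we can see exactly what the server returns to such a client.
request = urllib.request.Request(URL, headers={
    "User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    "Accept-Encoding": "gzip, deflate",
})

with urllib.request.urlopen(request) as response:
    encoding = response.headers.get("Content-Encoding", "identity")
    body = response.read()

print(f"Content-Encoding returned: {encoding}")

# If the server gzipped the response, confirm the payload decompresses
# cleanly; a truncated or mislabelled body is the sort of thing that
# tripped up old browsers.
if encoding == "gzip":
    html = gzip.decompress(body)
    print(f"Decompressed OK: {len(html)} bytes")
```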
-
According to Google's Webmaster blog, Googlebot supports gzip and deflate:
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).
An incompatible compression scheme would be the only real downside to turning compression on.
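If you are curious what your own server returns for each of those Accept-Encoding values, a short standard-library Python sketch along these lines will report it; the URL is a placeholder, and the list of encodings simply mirrors the ones quoted above.

```python
import urllib.request

# Placeholder URL: point this at one of your own pages.
URL = "http://www.example.com/"

# The Accept-Encoding values mentioned above: gzip, its x-gzip alias,
# deflate, and identity (no compression at all).
for accept in ("gzip", "x-gzip", "deflate", "identity"):
    request = urllib.request.Request(URL, headers={"Accept-Encoding": accept})
    with urllib.request.urlopen(request) as response:
        sent = response.headers.get("Content-Encoding", "identity")
        size = len(response.read())  # raw bytes as sent, before any decompression
    print(f"Accept-Encoding: {accept:<8} -> Content-Encoding: {sent}, {size} bytes on the wire")
```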
Related Questions
-
Can you have an SSL cert but still have http?
I was under the impression that if you got an SSL cert for your site, the site would change to https. I ran this site: http://thekinigroup.com/ through an SSL checker and it said it had one... but it's http. 1. Why didn't it change to https? Is there an extra step that needs to be done? 2. Is there a reason someone would choose to get an SSL cert but not have https? Thanks, Ruben
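Installing a certificate does not by itself move visitors onto https; the server still needs to redirect them. A quick way to see whether that redirect exists is to request the http URL and check where you end up, as in this rough standard-library Python sketch (it uses the URL from the question and assumes nothing about the server setup):

```python
import urllib.request

# The URL from the question; swap in any site you want to test.
URL = "http://thekinigroup.com/"

with urllib.request.urlopen(URL) as response:
    final_url = response.geturl()  # where we ended up after any redirects

if final_url.startswith("https://"):
    print(f"http redirects to https: {final_url}")
else:
    print(f"A cert may be installed, but the site still answers over plain http: {final_url}")
```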
Technical SEO | | KempRugeLawGroup0 -
Some URLs were not accessible to Googlebot due to an HTTP status error.
Hello, I'm an SEO newbie and some help from the community here would be greatly appreciated. I have submitted the sitemap of my website in Google Webmaster Tools and got this warning: "When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted." How do I fix this? What should I do? Many thanks in advance.
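One way to narrow this down is to fetch every URL listed in the sitemap and log anything that does not come back with a 2xx status. Below is a rough standard-library Python sketch with the sitemap URL as a placeholder; redirects are followed silently, so it surfaces hard errors such as 404s and 500s rather than redirect chains.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder: use your own sitemap URL here.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull the sitemap and collect every <loc> entry.
with urllib.request.urlopen(SITEMAP_URL) as response:
    root = ET.fromstring(response.read())
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

# Re-request each URL and report anything outside the 2xx range;
# these are the pages the "HTTP status error" warning refers to.
for url in urls:
    try:
        with urllib.request.urlopen(url) as page:
            status = page.status
    except urllib.error.HTTPError as err:
        status = err.code
    if not 200 <= status < 300:
        print(status, url)
```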
Technical SEO | | GoldenRanking140 -
Can I have an http AND a https site on Google Webmaster tools
My website is https, but the default property configured in Google WMT was http and wasn't showing me any information because of that. I added an https property, but my question is: do I need to delete the original http property, or can I leave both?
Technical SEO | | Onboard.com0 -
Duplicate content issue with WordPress tags?
Would Google really discount duplicate content created by WordPress tags? I find it hard to believe, considering tags are on and indexed by default, and the vast majority of users would not know to deindex them...
Technical SEO | | BlueLinkERP0 -
Interesting indexing issue - any input would be greatly appreciated!
A few months ago we did SEO for a website, just like any other website. However, we did not see the crawl/indexing results that we see with all of our other SEO projects - Google Webmaster Tools was indicating that only 1 page of the site (out of only 20 pages) was indexed. The site was older and originally developed in Dreamweaver, so although that shouldn't have been an issue, we were desperate to solve the problem and ended up rebuilding the site in WordPress. While this helped increase the number of pages that Google indexed (now all 20), we are still seeing strange things in the search results. For example, when we check rankings manually for a particular term, the new description is showing, but it is displaying the old title text. Does anyone know what the problem could be? Thank you so much!
Technical SEO | | ZAG0 -
Odd Google Indexing Issue
I have encountered something odd with Google indexing. According to the Google cache, my site was last updated on April 6. I made a series of changes on April 7th, and none of them show up in the cached version of the site (naturally). Then, on the 8th, my rankings seem to have dropped about 6 places, and the main SERP is showing text that isn't even on the website. The cached version has the correct page title from the page that was indexed on the 6th. How do I learn where Google is picking this up from? There is a clean page title tag on my website. I've checked the server, etc., to see what's going on. The text isn't completely unrelated, but it definitely impacted my ranking. Does Google ever have these hiccups when indexing?
Technical SEO | | VERBInteractive0 -
What's the best way to solve this sites duplicate content issues?
Hi, The site is www.expressgolf.co.uk and is an e-commerce website with lots of categories and brands. I'm trying to achieve one single unique URL for each category/brand page to avoid duplicate content and to get the correct URLs indexed. Currently it looks like this...
Main URL: http://www.expressgolf.co.uk/shop/clothing/galvin-green
Different versions:
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/
Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed: a canonical tag, NOINDEX, or blocking them in robots.txt? Secondly, do I just need to 301 the trailing-slash (/) URLs to the non-trailing-slash URLs? I'm sure this question has been answered, but I was having trouble coming to a solution for this one site. Cheers, Paul
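Purely to illustrate the redirect logic being asked about (not the site's actual stack), here is a small Python sketch that maps each of the listed variants onto a single 301 target; the canonical path and variant suffixes are taken from the URLs above, the function name is just for this sketch, and in practice the same rule would be expressed in the server or CMS's own redirect configuration.

```python
# A sketch of the redirect logic only: given a requested path, return the
# canonical URL it should 301 to, or None if it should be served as-is.
CANONICAL = "/shop/clothing/galvin-green"
VARIANT_SUFFIXES = {"1", "2", "3", "4", "all"}

def canonical_redirect(path):
    if path == CANONICAL:
        return None                       # already the main URL, serve it
    trimmed = path.rstrip("/")            # fold trailing-slash duplicates
    if trimmed == CANONICAL:
        return CANONICAL                  # .../galvin-green/ gets a 301
    head, _, tail = trimmed.rpartition("/")
    if head == CANONICAL and tail in VARIANT_SUFFIXES:
        return CANONICAL                  # .../1, .../all, .../2/ etc. get a 301
    return None                           # some other page, leave it alone

# Quick check against the variants from the question:
for p in ("/shop/clothing/galvin-green",
          "/shop/clothing/galvin-green/",
          "/shop/clothing/galvin-green/2",
          "/shop/clothing/galvin-green/all/"):
    print(p, "->", canonical_redirect(p) or "serve as-is")
```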
Technical SEO | | paulmalin0 -
Duplicate content issues with australian and us version of website
Good afternoon. I've tried searching for an answer to the following question, but I believe my circumstance is a little different from what has been asked in the past. I currently run an Australian website targeted at a specific demographic (50-75) and we produce a LARGE number of articles on a wide variety of lifestyle segments. All of our focus up until now has been on Australia, and our SEO and language are dedicated towards this. The next logical step in my mind is to launch a mirror website targeted at the US market. This website would be a simple mirror of a large number of articles (1000+) on subjects such as Food, Health, Travel, Money and Technology. Our current CMS has no problem duplicating the specific items over and sharing everything; the problem is that we currently use a .com.au domain and the .com domain is unavailable and not for sale, which would mean we have to create a new name for the US-targeted domain. The question is, how will mirroring this information, targeted towards the US, affect us on Google, and would we be better off getting a large number of these articles 're-written' by a company on freelancer.com etc.? Thanks,
Drew
Technical SEO | | Geelong0