HTTP Compression -- Any potential issues with doing this?
-
We are thinking about turning on IIS 6 HTTP compression to help with page load times. Has anyone had any issues with doing this, particularly from an SEO or site-functionality standpoint? We just want to double-check before we take this step and see if there are any potential pitfalls we may not be aware of. Everything we've read seems to indicate it can only yield positive results.
Any thoughts, advice, comments would be appreciated.
Thank-you,
Matt & Keith
-
Thanks.
-
Thanks.
-
I am aware that IE6 is old and many sites have dropped support for it. Its usage will vary by market. If the fix required 10 minutes of your time, wouldn't you do it for 1% or more of your potential customers?
If you have any Chinese users for instance, you'd want to make it work. Or if you're targeting people who are less tech-savvy or older in age, your IE6 usage numbers are bound to be higher. I agree that for most sites, it's probably not a huge issue. Since I experienced it on our site, I thought I'd mention it. If there is an issue, there is also likely a published fix that would require minimal effort.
-
You do realize that Microsoft has been trying to kill IE6 off, and just recently celebrated IE6 usage in the US dropping below 1%, right?
I wouldn't consider IE6 in your business plans.
-
One thing I'd check once you implement it is that Internet Explorer 6 likes it. I can't remember the details, but when we added compression on our site, there were instances where IE6 didn't like it.
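For reference, the classic workaround when an old browser chokes on compressed responses is simply to skip compression for that user agent. Here's a minimal sketch of that check in Python; the user-agent heuristic ("SV1" indicating XP SP2 or later) is illustrative, not an exact list of affected IE6 builds:

```python
def safe_to_gzip(user_agent: str, accept_encoding: str) -> bool:
    """Heuristic: only gzip when the client advertises it and isn't an
    early IE6 build (pre-XP-SP2 builds had known bugs with compressed
    responses; "SV1" in the UA string indicates SP2 or later)."""
    if "gzip" not in accept_encoding.lower():
        return False
    if "MSIE 6" in user_agent and "SV1" not in user_agent:
        return False
    return True

if __name__ == "__main__":
    old_ie6 = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
    chrome = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"
    print(safe_to_gzip(old_ie6, "gzip, deflate"))  # False -> serve uncompressed
    print(safe_to_gzip(chrome, "gzip, deflate"))   # True  -> safe to compress
```

How (or whether) you can hook a check like this into IIS 6 itself is a separate question; the point is just that the fix, where needed, is conditional rather than all-or-nothing.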
-
According to Google's Webmaster blog, Googlebot supports gzip and deflate:
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).

An incompatible compression method would be the only downside to turning on compression.
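If you want to verify the negotiation yourself once compression is switched on, a quick check is to request the same page with and without an Accept-Encoding header and compare the Content-Encoding that comes back. A minimal sketch using only Python's standard library (the URL is a placeholder for a page on your own site):

```python
import gzip
import urllib.request

URL = "http://www.example.com/"  # placeholder -- use a page on your own site

def fetch(accept_encoding=None):
    req = urllib.request.Request(URL)
    if accept_encoding:
        req.add_header("Accept-Encoding", accept_encoding)
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Content-Encoding", "identity"), resp.read()

# A client that advertises gzip should get a compressed body back...
enc, body = fetch("gzip")
print("With 'Accept-Encoding: gzip' ->", enc, f"({len(body)} bytes on the wire)")
if enc == "gzip":
    print("  decompresses to", len(gzip.decompress(body)), "bytes")

# ...while a client that doesn't advertise it should still get a plain response.
enc, body = fetch()
print("Without an Accept-Encoding header ->", enc, f"({len(body)} bytes)")
```

If the second request comes back gzip-encoded anyway, that's the kind of incompatibility worth fixing before rollout.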
Related Questions
-
Historic issue with incomplete indexing
Hi there, We run quite a big site in the UK in the commercial real-estate space. Historically we have always had a challenge getting our "primary" landing pages indexed, which are location-based property result pages, e.g. https://realla.co/to-rent/commercial-property/oxford. For example, for the "towns" category we have 8,549 URLs submitted in our XML sitemap, with only 3,171 indexed. This is a general issue across all our sitemaps: 120k submitted, 80k indexed. Our pages are linked through breadcrumbs and nearby links. In the new Search Console these pages are reported as "crawled - currently not indexed". These all sit under the folders site:https://realla.co/to-rent/commercial-property/* and site:https://realla.co/to-rent/office/*. We have done extensive work to optimise performance, including AMP pages. Each location page has many details pages for individual properties, e.g. https://realla.co/to-rent/details/0ffbbd0a1a1147edb8847c5ce6179509. One action we have remaining is to nest the details under the location pages, which may help. These details pages are indexed fully. Any feedback much appreciated.
Technical SEO | | ianparryuk0 -
HTTP -> HTTPS redirections / 301 the right way
Dear mozers, Thank you for your time reading this message and wanting to help! We have moved our WordPress site to https and redirected all the content successfully via the .htaccess file. We also use a simple 301 redirect plugin to redirect old URLs to the new ones. The problem today is that the plugin's redirections are not working for the http version. Here is an example: the htaccess redirect goes http --> https, and the plugin redirect goes domain.com/old --> domain.com/new, but the URL http://domain.com/old is not redirecting to https://domain.com/new, while https://domain.com/old does redirect to https://domain.com/new. What can you suggest as a solution? Thank you in advance! P.S. I don't think having 2 redirects for each version of the URL is the smartest solution. Best wishes, Dusan
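One way to see exactly what is happening is to trace each hop of the redirect chain yourself; that shows whether the .htaccess rule or the plugin fires first for each variant. A rough sketch using Python's standard library, with the hypothetical domain.com/old and /new URLs from the question:

```python
import urllib.error
import urllib.request
from urllib.parse import urljoin

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # don't let urllib follow redirects automatically

def trace(url, max_hops=10):
    """Print each hop of a redirect chain, one line per hop."""
    opener = urllib.request.build_opener(NoRedirect)
    for _ in range(max_hops):
        try:
            resp = opener.open(url)
            print(resp.status, url)  # final, non-redirect response
            return
        except urllib.error.HTTPError as err:
            if err.code in (301, 302, 307, 308) and "Location" in err.headers:
                target = urljoin(url, err.headers["Location"])
                print(err.code, url, "->", target)
                url = target
            else:
                print(err.code, url)
                return
    print("Stopped after", max_hops, "hops (possible redirect loop)")

# Hypothetical URLs -- substitute the real old/new paths:
trace("http://domain.com/old")
trace("https://domain.com/old")
```

If the http URL 301s straight to https://domain.com/old and stops there, the plugin is only matching the https host, which points at where to fix the rule.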
Technical SEO | | Chemometec0 -
I am having an issue with my rankings
I am having an issue with my rankings. I am not sure if there are issues with on-page duplicate content or with the way WordPress is behaving, but based on the site's backlink profile there is no reason the site shouldn't be ranking well. The site is mesocare.org. If anyone can help it would be appreciated.
Technical SEO | | weitzluxenberg0 -
Duplicate Content Issue
My issue with duplicate content is this: there are two versions of my website showing up, http://www.example.com/ and http://example.com/. What are the best practices for fixing this? Thanks!
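The usual fix is a site-wide 301 from one hostname to the other (plus consistent internal links and canonical tags). A quick way to confirm which hostname currently "wins" is to request both and see where each ends up; a minimal sketch, with example.com standing in for your own domain:

```python
import urllib.request

def final_destination(url):
    # urllib follows redirects by default, so geturl() reports the final URL
    with urllib.request.urlopen(url) as resp:
        print(f"{url} -> {resp.geturl()} ({resp.status})")

# If both hostnames answer 200 at their own URL, both versions are live and
# you have the duplicate-content problem; ideally one 301s to the other.
final_destination("http://www.example.com/")
final_destination("http://example.com/")
```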
Technical SEO | | OOMDODigital0 -
Is the If-Modified-Since HTTP Header still relevant?
I'm relatively new to the technical side of SEO and have been trying to brush up my skills by going through Google's online Webmaster Academy, which suggests that you need an If-Modified-Since HTTP header on your site. I checked and apparently our web server doesn't support this. I've been told by a good colleague that the If-Modified-Since header is no longer relevant, as the spiders will frequently revisit a site as long as you regularly update and refresh the content (which we do). However, our site doesn't seem to have been reindexed for a while, as the cached versions are still showing the pages from over a month ago. So two questions really - is the If-Modified-Since HTTP header still relevant and should I make sure this is included? And is there anything else I should be doing to make sure the spiders crawl our pages? (apart from keeping them nice, fresh and useful)
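For what it's worth, If-Modified-Since is still part of HTTP: a crawler can send it on a conditional GET, and a server that supports it answers 304 Not Modified instead of resending the whole page. You can test whether your server honours it with two requests; a minimal sketch, with the URL as a placeholder for one of your own pages:

```python
import urllib.error
import urllib.request

URL = "http://www.example.com/some-page"  # placeholder -- use one of your own pages

# First request: note the Last-Modified header, if the server sends one at all.
with urllib.request.urlopen(URL) as resp:
    last_modified = resp.headers.get("Last-Modified")
    print("Last-Modified:", last_modified)

# Second request: send that date back as If-Modified-Since. A server that
# supports conditional GETs should answer 304 Not Modified with no body.
if last_modified:
    req = urllib.request.Request(URL, headers={"If-Modified-Since": last_modified})
    try:
        with urllib.request.urlopen(req) as resp:
            print("Got", resp.status, "- the server resent the full page")
    except urllib.error.HTTPError as err:
        print("Got", err.code, "- 304 means conditional GETs are supported")
```

If the first request shows no Last-Modified at all, that is the thing to fix before worrying about If-Modified-Since.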
Technical SEO | | annieplaskett0 -
Drupal 1.5 Issue: Taxonomy
Hi there, I have a domain which is built in Drupal 1.5. We managed to redirect all nodes to the actual SEF URL. The one issue we have now is redirecting the taxonomy URLs to the SEF URL. The obvious answer is to do a manual 301 redirect in the htaccess file, but this will be a long process as there are over 500 URLs affected. Is there a better way to do this automatically within Drupal? Your thoughts and ideas are welcome.
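If a manual list turns out to be the only option, you can at least generate the 500+ rules instead of typing them. A hedged sketch: it assumes you can export the old taxonomy paths and their SEF equivalents into a two-column CSV (a hypothetical redirects.csv with no header row), and it emits standard Apache `Redirect 301` lines you could include from .htaccess:

```python
import csv

# redirects.csv is a hypothetical export with two columns: old_path,new_path
# e.g.  /taxonomy/term/123,/office-space/london
with open("redirects.csv", newline="") as f:
    rows = [row for row in csv.reader(f) if row]

with open("redirects.conf", "w") as out:
    for old_path, new_path in rows:
        # One mod_alias rule per row; paste or Include the result in .htaccess.
        out.write(f"Redirect 301 {old_path} {new_path}\n")

print(f"Wrote {len(rows)} redirect rules to redirects.conf")
```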
Technical SEO | | stefanok0 -
Aspx filters causing duplicate content issues
A client has a URL which is duplicated by filters on the page; for example, http://www.example.co.uk/Home/example.aspx is duplicated by http://www.example.co.uk/Home/example.aspx?filter=3. The client is moving to a new website later this year and is using an out-of-date Kentico CMS which would need some development work in order to implement rel canonical tags in the header. I don't have access to the server, and they have to pay through the nose every time they want the slightest thing altered. I am trying to resolve this duplicate content issue though and am wondering what is the best way to handle it in the short term. The client is happy to remove the filter links from the page, but that still leaves the filter URLs in Google. I am concerned that a 301 redirect will cause a loop and don't understand the behaviour of this type of code enough. I hope this makes sense; any advice appreciated.
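In the meantime, it can help to audit whether the filtered URLs declare any canonical at all, so you know the scale of the problem before the CMS work is scoped. A small sketch using only the standard library; the URLs are the hypothetical ones from the question, and the desired outcome is a canonical pointing at the unfiltered .aspx page:

```python
import re
import urllib.request

# Hypothetical URLs from the question -- one "clean" page and one filtered variant.
urls = [
    "http://www.example.co.uk/Home/example.aspx",
    "http://www.example.co.uk/Home/example.aspx?filter=3",
]

for url in urls:
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.I)
    print(url)
    print("  canonical tag:", match.group(0) if match else "none found")
```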
Technical SEO | | travelinnovations0 -
Website Ranking Issue
Hi, We have been performing our own onsite and offsite SEO along with external assistance, and have ranked well over the years with minimal impact from Google updates. However, the last so-called Panda update has affected us heavily, pushing our main phrase 'web design melbourne' from 2nd to 7th on Google.com.au, where we have remained for almost 2 months now irrespective of onsite or offsite work. We have been trying to find signs of any onsite, IP, duplicate content, title or other issues that may be holding us back, to no avail. The only flag that Google Webmaster Tools is showing is a number of bad internal site links, which I think is a glitch with the CMS we are using. Even the SEOmoz tool gives us a higher ranking compared to most competitors on page 1 of Google.com.au for our main phrase. The biggest difference between us and competitors is we chose to target an internal page specific to the topic rather than our homepage. With this said, we have also reduced our keyword density and content quantity in line with the other sites' homepages. Can anyone help shed some light on this, and perhaps something obvious that we have missed, or where we should be looking? Thanks.
Technical SEO | | paulsid0