HTTP Compression -- Any potential issues with doing this?
-
We are thinking about turning on the IIS-6 HTTP Compression to help with page load times. Has anyone had any issues with doing this, particularly from an SEO or site functionality standpoint? We just want to double check before we take this step and see if there are any potential pitfalls we may not be aware of. Everything we've read seems to indicate it can only yield positive results.
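For context, here's roughly how we're planning to estimate what compression would save on one of our pages before flipping the switch (a minimal Python sketch; the URL is a placeholder, and compressing the body locally with gzip only approximates what IIS 6 would actually send over the wire):

```python
import gzip
import requests

URL = "http://www.example.com/"  # placeholder -- swap in one of your own pages

# Fetch the page uncompressed to get a baseline size.
plain = requests.get(URL, headers={"Accept-Encoding": "identity"})
uncompressed_size = len(plain.content)

# Compressing the same body locally approximates the transfer size with gzip on.
compressed_size = len(gzip.compress(plain.content))

savings = 100 * (1 - compressed_size / uncompressed_size)
print(f"Uncompressed: {uncompressed_size:,} bytes")
print(f"Gzipped:      {compressed_size:,} bytes ({savings:.0f}% smaller)")
```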
Any thoughts, advice, comments would be appreciated.
Thank you,
Matt & Keith
-
Thanks.
-
Thanks.
-
I am aware that IE6 is old and many sites have dropped support for it. Its usage will vary by market. If the fix required 10 minutes of your time, wouldn't you do it for 1% or more of your potential customers?
If you have any Chinese users, for instance, you'd want to make it work. Or if you're targeting people who are less tech-savvy or older, your IE6 usage numbers are bound to be higher. I agree that for most sites it's probably not a huge issue; since I experienced it on our site, I thought I'd mention it. If there is an issue, there is also likely a published fix that would require minimal effort.
-
You do realize that Microsoft has been trying to kill IE6 off, and just recently celebrated IE6 usage in the US dropping below 1%, right?
I wouldn't consider IE6 in your business plans.
-
One thing I'd check once you implement it is whether Internet Explorer 6 likes it. I can't remember the details, but when we added compression on our site, there were instances where IE6 didn't handle it well.
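Once it's on, a rough smoke test is to compare what encoding the server hands to an IE6 user agent versus a modern one (a minimal Python sketch with a placeholder URL; it only checks the negotiated Content-Encoding, so it's no substitute for testing in the real browser):

```python
import requests

URL = "http://www.example.com/"  # placeholder -- use your own page

USER_AGENTS = {
    "IE6":    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    "Modern": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={
        "User-Agent": ua,
        "Accept-Encoding": "gzip, deflate",
    })
    encoding = resp.headers.get("Content-Encoding", "identity")
    print(f"{name}: HTTP {resp.status_code}, Content-Encoding: {encoding}")
```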
-
According to Google's Webmaster blog, Googlebot supports gzip and deflate:
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).
An incompatible compression method would be the only downside to turning on compression.
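If you want to confirm how your server responds to each of those Accept-Encoding values once compression is enabled, something like this works (a minimal Python sketch; the URL is a placeholder):

```python
import requests

URL = "http://www.example.com/"  # placeholder -- use one of your own pages

# The encodings mentioned in the Googlebot quote above.
for accept in ("gzip", "x-gzip", "deflate", "identity"):
    resp = requests.get(URL, headers={"Accept-Encoding": accept})
    served = resp.headers.get("Content-Encoding", "identity")
    print(f"Accept-Encoding: {accept:<8} -> served as: {served}")
```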
Related Questions
-
Internal Linking issue
So I am working with a review company and I am having a hard time with something. We have created a category which lists and categorizes every one of our properties. For example, a specific property in the category "restaurant" would look like this: /restaurant/mcdonalds, /restaurant/panda-express, and so on. What I am noticing, however, is that our more obscure properties are not being linked to by any page. If I visit the page myurl.com/restaurant, I see 100+ pages of properties; however, it seems like only the properties on the first few pages are being counted as having links. So far the only way I have been able to work around this is by creating a page called "all restaurants" and hiding it in our footer. This page lists and links to every one of our properties. However, it isn't exactly user-friendly, and I would prefer that scrapers not be able to grab all of our properties at once! Anyway, any suggestions would be greatly appreciated.
Technical SEO | HashtagHustler0
-
Google Cache issue
Hi, We’ve got a really specific issue – we have an SEO team in-house, and have had numerous agencies look at this – but no one can get to the bottom of this. We’re a UK travel company with a number of great positions on the search engines – our brand is www.jet2holidays.com. If you try ‘Majorca holidays’, ‘tenerife holidays’, ‘gran canaria holidays’ etc you’ll see us in the top few positions on Google when searching from the UK. However, none of our destination pages (and it’s only the destination pages), show a ‘cached’ option next to them. Example: https://www.google.com/search?q=majorca+holidays&oq=majorca+holidays&aqs=chrome..69i57j69i60l3.2151j0j9&sourceid=chrome&ie=UTF-8 This isn’t affecting our rankings, but we’re fairly certain it is affecting our ability to be included in the Featured Snippets. Checked and there aren’t any noarchive tags on the pages, example: https://www.jet2holidays.com/destinations/balearics/majorca Anyone have any ideas?
Technical SEO | fredgray0
-
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just had a Moz crawl run and it's flagging a lot of duplicate page content issues, mostly for used car pages. How can I get around this, as the site stocks many cars with the same make, model, colour, age, mileage, etc.? The only unique thing about them is the reg plate. How do I get past this duplicate issue if all the info is relatively the same? Has anyone experienced this when working on a car dealership website? Thank you.
Technical SEO | karl621
-
Switching to HTTPS from HTTP
Hi, I have switched my website from HTTP to HTTPS, but now I am facing a problem: when I type the URL with www (e.g. https://www.example.com) in the browser it works fine, but when I type the URL without www, like https://example.com, a "This connection is untrusted" error is shown. All I want is to redirect https://example.com to https://www.example.com. Please let me know how I can set up this redirect. Thanks
Technical SEO | Alick3000
-
I have 300 rel canonical issues with my Magento Go store irisscottprints.com.
I can't find an answer in the Magento Go blogs. What, if anything, should I specifically do about these issues? I have control over meta tags, but it's not easy to edit HTML in the admin. I've read some posts here but am still confused. Answers would be great, but if there is a primer on the subject that relates to what is editable in a Magento Go store, I'd love to figure it out myself. :) Help.
Technical SEO | RedTrout0
-
Duplicate Content Issue: Google/Moz Crawler recognize Chinese?
Hi! I am using WordPress Multisite, and the Chinese version of my website is at www.mysite.com/cn. Problem: I keep getting duplicate content errors within www.mysite.com/cn (NOT between www.mysite.com and www.mysite.com/cn). I have downloaded and checked the SEOmoz report and the duplicate_page_content list in the CSV file. I have no idea why it says the pages have the same content; they have nothing in common. www.mysite.com is the English version of the website, and the structure is the same for www.mysite.com/cn. Note that I don't have any duplicate content issues within www.mysite.com itself. Question: Does Google's crawler properly recognize Chinese content?
Technical SEO | joony20080
-
Database Driven Websites: Crawling and Indexing Issues
Hi all - I'm working on an SEO project, dealing with my first database-driven website that is built on a custom CMS. Almost all of the pages are created by the admin user in the CMS, pulling info from a database. What are the best practices here regarding SEO? I know that overall static is good, and as much static as possible is best, but how does Google treat a site like this? For instance, let's say the user creates a new page in the CMS and then posts it live. The page is rendered and navigable after putting together the user-entered info (the content on the page) and the info pulled from the database (like the info used to create the title tag and H1 tags, etc.). Is this page now going to be crawled successfully and indexed as a static page in Google's eyes, and thus OK to start working on ranking for? Any help is appreciated - thanks!
Technical SEO | Bandicoot0
-
Rel = author display issue
I want to enter some products as blog posts. I don't want users to see the post info, but I do want search engines to see rel="author". I can do this by setting display to "none" in a CSS style. The post info does not appear in the browser but is still in the page source. Will search engines be able to see the post info?
Technical SEO | waynekolenchuk0