Pitfalls when implementing the “VARY User-Agent” server response
-
We serve different desktop- and mobile-optimized HTML on the same URL, based on a visitor's device type.
While Google continues to recommend the HTTP Vary: User-Agent header for mobile-specific versions of a page (http://www.youtube.com/watch?v=va6qtaiZRHg), we're also aware of issues raised around CDN caching: http://searchengineland.com/mobile-site-configuration-the-varies-header-for-enterprise-seo-163004 / http://searchenginewatch.com/article/2249533/How-Googles-Mobile-Best-Practices-Can-Slow-Your-Site-Down / http://orcaman.blogspot.com/2013/08/cdn-caching-problems-vary-user-agent.html
As this is primarily for Google's benefit, it's been proposed that we only return the Vary: User-Agent header when a Google user agent is detected (Googlebot/MobileBot/AdBot).
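A minimal sketch of that conditional header, assuming a simple Python handler (the crawler-name pattern just echoes the list above; check Google's published crawler documentation for the real user-agent strings):

```python
import re

# Crawler names taken from the proposal above; treat this list as an
# assumption, not Google's authoritative set of user-agent tokens.
GOOGLE_UA_PATTERN = re.compile(r"Googlebot|MobileBot|AdBot", re.IGNORECASE)

def response_headers(user_agent):
    """Build extra response headers, adding 'Vary: User-Agent' only when
    the request appears to come from a Google crawler."""
    headers = {"Cache-Control": "public, max-age=300"}
    if GOOGLE_UA_PATTERN.search(user_agent or ""):
        headers["Vary"] = "User-Agent"
    return headers
```

Note that a CDN sitting in front of the server would cache whichever variant it saw first, headers included, which is exactly the concern raised in the answer below.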
So here's the thing: as the server header response is not “content” per se, I think this could be an acceptable solution, but I wanted to throw it out to the esteemed Moz community and get some additional feedback.
Do you see any issues or problems with implementing this solution?
Cheers!
linklater
-
So, there are lots of 'ifs' here, but the primary problem I see with your plan is that the CDN will return cached content to Googlebot without the request ever hitting your server, so you won't have the option to serve different headers to Googlebot.
Remember that every page consists of the main HTML content (which may be static or dynamically generated per request) plus a whole bunch of other resources (JavaScript and CSS files, images, font files, etc.). These other resources are typically static and lend themselves far better to caching.
Are your pages static or dynamic? If they are dynamic, then you are probably not benefiting from caching anyway, so you could use the 'Vary' header on just those pages and not on any static resources. This would ensure your static resources are cached by your CDN, giving you much of the CDN's benefit, while only the dynamic HTML content is served directly from your server.
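That split can be sketched as a simple per-path decision (the extension list is illustrative, not exhaustive):

```python
# Extensions treated as static, CDN-cacheable assets (an assumed list;
# adjust to match what your site actually serves).
STATIC_EXTENSIONS = (".js", ".css", ".png", ".jpg", ".gif", ".woff", ".svg")

def vary_header_for(path):
    """Send 'Vary: User-Agent' only on dynamic HTML responses, leaving
    static assets fully cacheable by the CDN."""
    if path.lower().endswith(STATIC_EXTENSIONS):
        return {}  # static asset: cache freely, no Vary
    return {"Vary": "User-Agent"}  # dynamic page: varies by device type
```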
If most of your pages are static you could still use this approach, but just without the full benefit of the CDN, which sucks.
Some CDNs are already working on this (see http://www.computerworld.com/s/article/9225343/Akamai_eyes_acceleration_boost_for_mobile_content and http://orcaman.blogspot.co.uk/2013/08/cdn-caching-problems-vary-user-agent.html) to try to find better solutions.
I hope some of this helps!