How can I tell Google that a page has not changed?
-
Hello,
We have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating far too much traffic: half of our page views come from Googlebot.
We would like to tell Googlebot to stop crawling pages that never change. This one, for instance:
http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html
As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back.
The following header fields might be relevant. Currently, our web server responds with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache and must-revalidate, drop the Pragma: no-cache header, and set Expires to, say, 30 days in the future?
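Something like this, perhaps (2592000 seconds = 30 days; the angle-bracket placeholders would be filled with real dates):
Cache-Control: public, max-age=2592000
Expires: <a date 30 days in the future>
Last-Modified: <the date the page was last generated>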
I also read that a web page that has not changed should answer with 304 Not Modified instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us.
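If we did implement it, I imagine the core would be something like this minimal sketch (Python, for illustration only; the fixed Last-Modified value is a made-up placeholder for pages that never change):

# Minimal sketch of conditional-GET handling, for illustration only.
# Assumes pages that never change can share one fixed Last-Modified date
# (the date below is a made-up placeholder).
from wsgiref.simple_server import make_server

LAST_MODIFIED = 'Mon, 15 Nov 2010 12:00:00 GMT'

def app(environ, start_response):
    # Googlebot sends If-Modified-Since on revisits; if the page has not
    # changed since then, answer 304 with an empty body.
    if environ.get('HTTP_IF_MODIFIED_SINCE') == LAST_MODIFIED:
        start_response('304 Not Modified', [('Last-Modified', LAST_MODIFIED)])
        return [b'']
    body = b'<html><body>party picture page</body></html>'
    start_response('200 OK', [
        ('Content-Type', 'text/html'),
        ('Content-Length', str(len(body))),
        ('Last-Modified', LAST_MODIFIED),
    ])
    return [body]

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()

A real implementation would parse and compare dates per page, but the principle is the same.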
Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged ones.
Do you have any other suggestions for how we can reduce Googlebot traffic on irrelevant pages?
Thanks for your help
Cord
-
Unfortunately, I don't think there are many reliable options, in the sense that Google will always honor them. I don't think they gauge crawl frequency by the "expires" field - or, at least, it carries very little weight. As John and Rob mentioned, you can set the "changefreq" in the XML sitemap, but again, that's just a hint to Google. They seem to frequently ignore it.
If it's really critical, a 304 is probably a stronger signal, but I suspect even that's hit or miss. I've never seen a site implement it on a large scale (hundreds or thousands of pages), so I can't speak to that.
Three broader questions/comments:
(1) If you currently list all of these pages in your XML sitemap, consider taking them out. The XML sitemap doesn't have to contain every page on your site, and in many cases, I think it shouldn't. If you list these pages, you're basically telling Google to re-crawl them (regardless of the changefreq setting).
(2) You may have overly complex crawl paths. In other words, it may not be the quantity of pages that's at issue, but how Google accesses those pages. They could be getting stuck in a loop, etc. It's going to take some research on a large site, but it'd be worth running a desktop crawler like Xenu or Screaming Frog. This could represent a site architecture problem (from an SEO standpoint).
(3) Should all of these pages be indexed at all, especially as time passes? Increasingly (especially post-Panda), more indexed pages is often worse. If Googlebot is really hitting you that hard, it might be time to canonicalize some older content or 301-redirect it to newer, more relevant content. If it's not active at all, you could even NOINDEX or 404 it.
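If you go the NOINDEX route, it's just a robots meta tag in the page head, or an X-Robots-Tag response header if that's easier to set server-side:
<meta name="robots" content="noindex">
X-Robots-Tag: noindex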
-
Thanks for the answers so far. The tips don't really solve my problem yet, though: I don't want to lower the general crawl rate in Webmaster Tools, because pages that change frequently should also be crawled frequently. We do have XML sitemaps, although we did not include these picture pages, like the one in my example. There are tens, maybe hundreds, of thousands of these pages. If everyone agrees on this, we can of course include these pages in our XML sitemaps. Using "meta refresh" to indicate that the page never changed seems a bit odd to me, but I'll look into it.
But what about the HTTP headers I asked about? Does anyone have any ideas on that?
-
Your best bet is to build an Excel report using a crawl tool (like Xenu, Screaming Frog, Moz, etc.) and export that data. Then map out the pages you want to log and mark as "not changing."
Make sure to build (or already have) a functioning XML sitemap file for the site and, as John said, state which URLs NEVER change. Over time, this will tell Googlebot that it isn't necessary to crawl those URLs, as they never change.
You could also place a META REFRESH tag on those individual pages, and set that to never as well.
Hope some of this helps! Cheers
-
If you have Google Webmaster Tools set up, go to Site configuration > Settings, and you can set a custom crawl rate for your site. That will change it site-wide, though, so if you have other pages that change frequently, that might not be so great for you.
Another thing you could try is generating a sitemap and setting a change frequency of "never" (or "yearly") for all of the pages you don't expect to change. That might also slow Google's crawl rate for those pages.
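For example, a sitemap entry for the picture page from the original question could look like this (the lastmod date and priority are made-up values):
<url>
  <loc>http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html</loc>
  <lastmod>2010-11-15</lastmod>
  <changefreq>never</changefreq>
  <priority>0.1</priority>
</url>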