How can I tell Google that a page has not changed?
-
Hello,
we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating far too much traffic: half of our page views come from Googlebot.
We would like to tell Googlebot to stop crawling pages that never change. This one, for instance:
http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html
As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back.
The following header fields might be relevant. Currently our web server answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to a date e.g. 30 days in the future?
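For illustration, this is the kind of response we are considering serving instead — the dates are placeholders, and 2592000 seconds is 30 days:

```http
Cache-Control: public, max-age=2592000
Expires: Thu, 08 Dec 2011 16:00:00 GMT
Last-Modified: Sat, 01 Jan 2011 00:00:00 GMT
```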
I also read that a web page that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us.
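To make the 304 idea concrete, here is a rough sketch in Python of the conditional-GET logic a server would need — the last-modified date and the function name are made up for the example:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

# Hypothetical last-modified date for a page that never changes.
PAGE_LAST_MODIFIED = datetime(2011, 1, 1, tzinfo=timezone.utc)

def respond(if_modified_since):
    """Return (status, headers) for a conditional GET.

    If the client's If-Modified-Since date is at or after our last
    modification, answer 304 with no body; otherwise answer 200.
    """
    headers = {"Last-Modified": format_datetime(PAGE_LAST_MODIFIED, usegmt=True)}
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200, headers  # unparseable date: serve the full page
        if since.tzinfo is None:
            since = since.replace(tzinfo=timezone.utc)  # RFC dates are GMT
        if since >= PAGE_LAST_MODIFIED:
            return 304, headers
    return 200, headers
```

The hard part in practice is not this comparison but knowing, cheaply, when each of thousands of pages last changed.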
Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged ones.
Do you have any other suggestions for how we can reduce Googlebot traffic on irrelevant pages?
Thanks for your help
Cord
-
Unfortunately, I don't think there are many reliable options, in the sense that Google will always honor them. I don't think they gauge crawl frequency by the "expires" field - or, at least, it carries very little weight. As John and Rob mentioned, you can set the "changefreq" in the XML sitemap, but again, that's just a hint to Google. They seem to frequently ignore it.
If it's really critical, a 304 probably is a stronger signal, but I suspect even that's hit or miss. I've never seen a site implement it on a large scale (100s or 1000s of pages), so I can't speak to that.
A few broader questions/comments:
(1) If you currently list all of these pages in your XML sitemap, consider taking them out. The XML sitemap doesn't have to contain every page on your site, and in many cases, I think it shouldn't. If you list these pages, you're basically telling Google to re-crawl them (regardless of the changefreq setting).
(2) You may have overly complex crawl paths. In other words, it may not be the quantity of pages that's at issue, but how Google accesses those pages. They could be getting stuck in a loop, etc. It's going to take some research on a large site, but it'd be worth running a desktop crawler like Xenu or Screaming Frog. This could represent a site architecture problem (from an SEO standpoint).
(3) Should all of these pages even be indexed at all, especially as time passes? Increasingly (especially post-Panda), more indexed pages often means worse results. If Googlebot is really hitting you that hard, it might be time to canonicalize some older content or 301-redirect it to newer, more relevant content. If it's not active at all, you could even NOINDEX or 404 it.
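If you do go the NOINDEX route, it's just a meta tag in the head of each page you want dropped (or the equivalent X-Robots-Tag HTTP response header):

```html
<!-- on pages that should drop out of Google's index -->
<meta name="robots" content="noindex">
```

Just note that Googlebot still has to crawl a page to see this tag, so it reduces index bloat more than it reduces crawl traffic.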
-
Thanks for the answers so far. The tips don't really solve my problem yet, though: I don't want to lower the general crawl rate in Webmaster Tools, because pages that change frequently should also be crawled frequently. We do have XML sitemaps, although we did not include these picture pages, as in our example. There are tens, maybe hundreds, of thousands of these pages. If everyone agrees on this, we can of course include them in our XML sitemaps. Using "meta refresh" to indicate that the page never changed seems a bit odd to me, but I'll look into it.
But what about the HTTP headers I asked about? Does anyone have any ideas on those?
-
Your best bet is to build an Excel report using a crawl tool (like Xenu, Screaming Frog, Moz, etc.), export that data, and then map out the pages you want to mark as 'not changing'.
Make sure to build (or have) a functioning XML sitemap file for the site and, as John said, state which URLs NEVER change. Over time, this will tell Googlebot that it isn't necessary to crawl those URLs, since they never change.
You could also place a META REFRESH tag on those individual pages, and set that to never as well.
Hope some of this helps! Cheers
-
If you have Google Webmaster Tools set up, go to Site configuration > Settings, where you can set a custom crawl rate for your site. That changes it site-wide, so if you have other pages that change frequently, that might not be so great for you.
Another thing you could try is generating a sitemap and setting a change frequency of never (or yearly) for all of the pages you don't expect to change. That might also slow down Google's crawl rate for those pages.
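For what it's worth, a minimal sketch of such a sitemap entry, using the example page from the question (the lastmod date is just a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html</loc>
    <lastmod>2011-01-01</lastmod>
    <changefreq>never</changefreq>
    <priority>0.1</priority>
  </url>
</urlset>
```

"never" is a valid changefreq value in the sitemaps.org protocol, but like everything discussed here, it's only a hint that Google is free to ignore.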