Correct usage of expired pages: 410 or not?
-
Hi Mozzers,
We're running a property portal that carries around 200,000 listings in two languages. All listings are updated several times per day, and when one of our ads expires we return a "410 Gone" and show users a link: "This ad has expired, click here to search for similar properties."
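For context, here's a stripped-down sketch of what our expired-listing handler roughly does (Flask and all the names here are purely illustrative, not our actual stack):

```python
from flask import Flask, abort, render_template_string

app = Flask(__name__)

# Hypothetical lookup; in reality this queries our listings database.
def find_listing(listing_id):
    return {"id": listing_id, "expired": True, "search_url": "/search?area=stockholm"}

EXPIRED_PAGE = """
<h1>This ad has expired</h1>
<p><a href="{{ search_url }}">Click here to search for similar properties.</a></p>
"""

@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    ad = find_listing(listing_id)
    if ad is None:
        abort(404)
    if ad["expired"]:
        # Serve the friendly page, but with a 410 status so crawlers
        # know the listing is permanently gone.
        return render_template_string(EXPIRED_PAGE, search_url=ad["search_url"]), 410
    return "...live listing page..."
```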
Looking at our competition, it seems there are many different ways to deal with this, one popular approach being a 301 to the corresponding search result.
We've tried to get direction from Google on which method they prefer, but, as usual, dead silence.
Any advice is most welcome.
-
Matthew,
How would you go about tracking user vs. bot traffic on pages that return a 410 header? We can see via AWStats that we get plenty of hits on these pages, but we have no way to measure what sort of traffic those hits really are.
Best
Johan
-
Thanks a lot for that, Matthew.
I will look into it, but my gut tells me we don't get a lot of user traffic on these pages. Google visits them tons, though, so hopefully the 301s will bring us some nice link juice.
Right after posting I ran into this great post on the subject too: http://www.seomoz.org/blog/how-should-you-handle-expired-content
However, it says very little about the 410.
Thanks
Johan
-
Hi Johan,
A 410 response code is perfectly acceptable for expired pages. With a 410 you are communicating that the page is "gone", and expired content usually is "gone", so it fits. However, with a 410 the page will fall out of the index and lose its traffic (granted, expired content often gets little traffic anyway, since it is no longer timely) and, more importantly, lose its link value (if you had any links pointing to those pages).
As for 301 redirects, I'd start by tracking visits to the expired 410 pages and links to those pages. How much traffic are you getting? How engaged is that traffic? How many links are there, and are they good quality? Links are easy enough to track in OSE, and for tracking traffic you can use Google Analytics events (http://antezeta.com/news/404-errors-google-analytics).
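On the user-vs-bot side, if your analytics can't separate the two, a rough server-side alternative is to classify hits on your 410 URLs by user agent. A minimal Python sketch, assuming a standard combined access-log format and an illustrative (incomplete) bot list:

```python
import re
from collections import Counter

# Illustrative bot signatures; extend to match what you see in your logs.
BOT_PATTERNS = re.compile(r"googlebot|bingbot|slurp|baiduspider|crawler|spider", re.I)

# Assumes the Apache/nginx "combined" log format; adjust to your setup.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def tally_410_hits(log_path):
    """Count user vs. bot hits on requests that returned a 410."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m or m.group("status") != "410":
                continue
            kind = "bot" if BOT_PATTERNS.search(m.group("agent")) else "user"
            counts[kind] += 1
    return counts

if __name__ == "__main__":
    print(tally_410_hits("access.log"))  # e.g. Counter({'bot': 1240, 'user': 37})
```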
When I see a lot of links or a lot of traffic (especially traffic that bounces away), I've converted a 410 page into a 301 redirect that points to our best (programmatic) guess. For instance, 301 redirect the user to a search for properties in a similar location or a similar price range (etc.). What I've often found is that when I get the user redirected to the best page, I'm more likely to see them engage and use the site. Along with the user benefit, I've also seen that help overall organic performance when there are a lot of links back to these pages.
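As a rough sketch of that decision logic, extending the handler above (the threshold, the helpers, and the search parameters are all placeholders you'd tune to your own data):

```python
from urllib.parse import urlencode

from flask import Flask, redirect

app = Flask(__name__)

LINK_THRESHOLD = 5  # placeholder: "enough inbound links to be worth preserving"

# Hypothetical helpers standing in for real data access.
def find_listing(listing_id):
    return {"id": listing_id, "expired": True,
            "location": "stockholm", "price": 250000}

def inbound_link_count(listing_id):
    return 12  # e.g. pulled from an OSE / link-data export

def similar_search_url(ad):
    # Best programmatic guess: same location, similar price band.
    params = {"location": ad["location"],
              "min_price": int(ad["price"] * 0.8),
              "max_price": int(ad["price"] * 1.2)}
    return "/search?" + urlencode(params)

@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    ad = find_listing(listing_id)
    if ad["expired"]:
        if inbound_link_count(ad["id"]) >= LINK_THRESHOLD:
            # Worth preserving: pass users and link value to the closest match.
            return redirect(similar_search_url(ad), code=301)
        return "This ad has expired.", 410
    return "...live listing page..."
```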
Hope that helps. Thanks,
Matthew