Correct handling of expired pages: 410 or not?
-
Hi Mozzers,
We're running a property portal that carries around 200,000 listings in two languages. All listings are updated several times per day, and when one of our ads expires we return a "410 Gone" and show the user a link: "This ad has expired, click here to search for similar properties."
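Concretely, the handling looks roughly like this. This is a minimal sketch assuming Flask; the in-memory listing store, inline HTML, and search URL are simplified placeholders, not our actual stack:

from flask import Flask

app = Flask(__name__)

# Placeholder store: listing id -> details, including an expired flag.
LISTINGS = {
    1: {"title": "Seafront flat", "expired": False},
    2: {"title": "Old farmhouse", "expired": True},
}

@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    ad = LISTINGS.get(listing_id)
    if ad and not ad["expired"]:
        return "<h1>%s</h1>" % ad["title"]
    # Expired (or unknown) ad: answer "410 Gone", but still give the
    # visitor a way forward with a link to a similar-properties search.
    html = ('<p>This ad has expired. '
            '<a href="/search?similar_to=%d">'
            'Click here to search for similar properties</a></p>' % listing_id)
    return html, 410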
Looking at our competition, it seems there are many different ways to deal with this, one popular approach being a 301 to the corresponding search result.
We've tried to get direction from Google on which method they prefer, but, as usual, dead silence.
Any advice is most welcome.
-
Matthew,
How would you go about tracking user vs. bot traffic on pages that return a 410 header? We can see in AWStats that we get plenty of hits on these pages, but we have no reliable way to tell what kind of traffic those hits really are.
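The best we've come up with so far is a crude user-agent split over the raw access log, along the lines of this sketch (the log path and bot list are placeholders, and user-agent matching is only a heuristic, which is exactly why I'm asking whether there is a better way):

import re
from collections import Counter

# Matches the request, status, and user-agent fields of a combined-format
# (Apache/nginx style) access log line.
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<url>\S+)[^"]*" (?P<status>\d{3}) '
                  r'\S+ "[^"]*" "(?P<ua>[^"]*)"')
BOT_HINTS = ("googlebot", "bingbot", "slurp", "spider", "crawler", "bot")

counts = Counter()
with open("access.log") as log:                # placeholder path
    for line in log:
        m = LINE.search(line)
        if not m or m.group("status") != "410":
            continue                           # only count our expired pages
        ua = m.group("ua").lower()
        counts["bot" if any(h in ua for h in BOT_HINTS) else "human"] += 1

print(counts)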
Best
Johan
-
Thanks a lot for that, Matthew.
I will look into it, but my gut tells me that we don't get much user traffic from these pages. Google visits, on the other hand: tons. So hopefully the 301s will pass along some nice link juice.
Right after posting I also ran into this great post on the subject: http://www.seomoz.org/blog/how-should-you-handle-expired-content
However, it says very little about the 410.
Thanks
Johan
-
Hi Johan,
A 410 response code is perfectly acceptable for expired pages. With a 410 you are communicating that the page is "gone", and expired content usually is "gone", so it fits. However, with a 410 that page will fall out of the index, losing traffic (assuming it gets any; expired content often doesn't, since it is no longer timely) and, more importantly, losing link value (if you had any links pointing to it).
As for 301 redirects, I'd start by tracking visits to the expired 410 pages and links pointing to them. How much traffic are you getting? How engaged is that traffic? How many links are there, and are they good quality? Links are easy enough to track in OSE, and for traffic you can use Google Analytics events (http://antezeta.com/news/404-errors-google-analytics).
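The linked article does this with a client-side Google Analytics event on the error page. A server-side equivalent, sketched here on the assumption of a Universal Analytics property (the "UA-XXXXX-Y" id is a placeholder), would be to fire a Measurement Protocol event each time a 410 is served:

import uuid
import urllib.parse
import urllib.request

def track_410(page_path):
    # Record the hit as a GA event so 410s show up in your event reports.
    params = urllib.parse.urlencode({
        "v": "1",                    # Measurement Protocol version
        "tid": "UA-XXXXX-Y",         # placeholder GA property id
        "cid": str(uuid.uuid4()),    # anonymous client id
        "t": "event",
        "ec": "Expired pages",       # event category
        "ea": "410 served",          # event action
        "el": page_path,             # event label: which URL expired
    }).encode()
    urllib.request.urlopen("https://www.google-analytics.com/collect",
                           data=params, timeout=2)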
When I see a lot of links or a lot of traffic (especially traffic that leaves), I've converted the 410 page into a 301 redirect to our best programmatic guess. For instance, 301 redirect the user to a search for properties in a similar location or price range. What I've often found is that when the user is redirected to the best page, they are more likely to engage and use the site. Along with the user benefit, I've also seen it help overall organic performance when there are a lot of links pointing back to these pages.
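As a sketch of that programmatic guess (the field names, search URL scheme, and the plus-or-minus 20% price band are illustrative assumptions, not a recommendation):

from flask import Flask, redirect

app = Flask(__name__)

# Placeholder record of what an expired listing looked like.
EXPIRED = {42: {"city": "Malmo", "price": 250000}}

@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    ad = EXPIRED.get(listing_id)
    if ad:
        # Best guess: a search scoped to the same city and a similar
        # price band, sent as a permanent (301) redirect.
        target = "/search?city=%s&min_price=%d&max_price=%d" % (
            ad["city"],
            int(ad["price"] * 0.8),
            int(ad["price"] * 1.2),
        )
        return redirect(target, code=301)
    return "Live listing page goes here", 200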
Hope that helps. Thanks,
Matthew