Best Way To Handle Expired Content
-
Hi,
I have a client's site that posts job openings. There is a main list of available jobs, and each job has an individual page linked from that list. At some point, though, the job is no longer available. Currently, the job page goes away and returns a 404 status once the job is no longer available.
The good thing is that the job pages attract inbound links. The bad thing is that as soon as the job is no longer available, those links point to a 404 page. Ouch. Currently Google Webmaster Tools shows 100+ 404 job URLs that have links (maybe 1-3 external links each).
The question is what to do with the job page instead of returning a 404. For business purposes, the client cannot display the content after the job is no longer available. To avoid duplicate content issues, the old job page should have some kind of unique content saying the job is no longer available.
Any thoughts on what to do with those old job pages? Or would you argue that it is appropriate to return a 404 header plus an error page, since the job truly is no longer a valid page on the site?
Thanks for any insights you can offer.
Matthew -
Hey Sebastian -
We already do something similar to detect expiration (instead of the IF condition in MySQL, we query for records where job_closing_date >= CURDATE()). Thankfully they programmed that in to pull old jobs off the list and out of the job search results. (Though up until yesterday the old jobs were still in the XML sitemap... whoops. Guess what I fixed yesterday!)
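For reference, that listing query is roughly this (a simplified sketch; the table name and extra columns are assumptions):

-- Only jobs still open today appear on the main list and in search results.
SELECT job_id, title
FROM jobs
WHERE job_closing_date >= CURDATE()
ORDER BY job_closing_date;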
I do like your idea, though, of keeping the content active and the page alive, but with some kind of message at the top. That would definitely keep the page unique. I'm not positive that will fly on the business side, but I'll definitely propose it.
Thanks for the reply!
-
I like that idea of 301 redirecting the page back to the job search page. The search page would certainly be a good introduction and would probably satisfy visitors looking for the job. These pages aren't high-ranking pages in the SERPs; the traffic is referral traffic from other websites. Given that, Utah Tiger's question about keywords and search engines wouldn't apply in this website's case. Thanks for the idea!
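For example, a single expired job could be 301'd with one htaccess line like this (a hypothetical rule; the URLs are made up):

# Once job 1234 closes, send its inbound links to the job search page.
Redirect 301 /jobs/1234-account-manager http://www.mysite.com/job-search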
-
Hi Matthew,
What I would do is keep the page accessible through a direct link, but not through the list of jobs displayed on the main site. I would also include a note at the top of the page saying something like 'This job offer has already expired'.
This way you still have a page that is unique, does not show on the main jobs list, and indicates that it has expired.
I'm not sure how much programming knowledge you have or what technology the site is built in, but a simple IF condition in your SQL statement to add a flag to each record indicating whether it has expired would look something like this (this example uses MySQL syntax):
IF(
    CURDATE() BETWEEN date_from AND date_to,
    0,
    1
) AS expired
Then, when you pull up a specific job, you simply check whether the 'expired' field equals 1, and if so, display the message above the job.
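Put together, the full query might look something like this (a sketch only; the table and the remaining column names are assumptions):

-- Fetch one job and flag whether it has expired, so the page can stay
-- live and show an 'expired' notice instead of disappearing.
SELECT
    job_id,
    title,
    description,
    IF(CURDATE() BETWEEN date_from AND date_to, 0, 1) AS expired
FROM jobs
WHERE job_id = 123;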
I hope this helps.
-
EGOL, your technical response is way above me... could you restate it in tyro terms?
Is the expired data hidden? Does the 301 redirect go to the homepage or the job search page, or either? What value does it add? Keywords? I guess the pages would still be indexed in order for value to be created, or does a 301 redirect just pass all the value to the page it redirects to? I will also go look up 301 redirects right now.
Utah Tiger
-
I have expiring content on one of my sites.
I place all of the postings into folders according to date such as...
mysite.com/postings/2012/02/job-at-mcds/
Then on certain dates I add an htaccess file to the /2012/02/ folder that will 301 redirect all items in that folder to the homepage.
You could 301 the old posts to a job search page or some other type of page that will introduce the visitor to your site.
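If it helps, a minimal sketch of such an htaccess file might be (assuming the folder layout above; swap the target URL for a job search page if you prefer):

# Placed at /postings/2012/02/.htaccess once that month's postings expire.
# 301-redirects every URL in this folder to the homepage.
RedirectMatch 301 .* http://mysite.com/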