Best practices for temporary articles
-
Hello,
I would like some expert input on the best way to manage temporary content.
In my case, I have a page (e.g. mydomain.com/agenda) that lists temporary articles, with lifetimes ranging from 1 month to 6 months.
Each article also has its own URL, e.g. mydomain.com/agenda/12-02-2011/thenameofmyarticle/
As you can guess, I end up with hundreds of 404s.
I'm already using the canonical tag; should I use a meta robots noindex tag on the listing page? I'm a bit lost here...
-
Thank you, Egol.
-
Thanks, Richard.
I'm going to try this.
-
Thanks Aran!
-
// Test whether the page URL has expired; the condition below is a
// placeholder ($pageHasExpired is hypothetical) — swap in your own check.
if ($pageHasExpired) {
    $location = "http://www.YourSite.com/";
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: {$location}");
    exit;
}
-
We have temporary content and evergreen content.
When a page of temporary content is created it is filed in a folder according to its "expiration date". On that date the folder is 301 redirected to an appropriate destination. However, before the redirect is done we run analytics on the folder to see if any files are pulling traffic from SERPs or links from other websites. We then try to create evergreen content on the same topic that will capture that traffic and redirect the specific files to the new evergreen content.
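The expire-then-redirect workflow described above could be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the helper name, date format, and evergreen URL are all assumptions.

```php
<?php
// Hypothetical sketch: decide whether an article past its
// "expiration date" should be 301-redirected to evergreen content.
function redirect_target(string $expires_at, string $evergreen_url): ?string {
    // Past the expiry date? Return the evergreen destination.
    if (strtotime($expires_at) < time()) {
        return $evergreen_url;
    }
    return null; // still live, serve the article normally
}

$target = redirect_target('2011-03-12', 'http://www.example.com/agenda/festivals/');
if ($target !== null) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $target);
}
```

In practice the expiry date and destination would come from the CMS or database rather than being hard-coded, and the analytics check described above happens before the redirect is put in place.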
-
It seems so unnatural to want to actually remove content when we spend so long striving to create awesome content!
-
You can use the meta robots tag as you mentioned in your question; this will prevent search engines from indexing the pages. Unfortunately, we also need to tackle the human side of the issue: if anyone links to the article, the link will eventually result in a 404 page.
There is nothing wrong with a 404 page; they serve an important purpose. Since your articles are not around very long and are not being indexed by search engines, I see no reason not to simply leave the 404s in place.
Ensure you have a custom 404 page that is an informative and helpful resource rather than a simple "Page not found" message. Use the 404 to direct the visitor to a category-level page related to the topic of the article, and offer a simple list of links to parts of the site that may be of interest.
Check out these SEOmoz articles:
www.seomoz.org/blog/personalizing-your-404-error-pages
www.seomoz.org/blog/are-404-pages-always-bad-for-seo
Hope this helps.
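The helpful-404 idea above could be sketched like this. The /agenda/ path prefix and the section mapping are assumptions for illustration, not part of the original answer.

```php
<?php
// Hypothetical sketch: keep the genuine 404 status, but point the
// visitor at a related category page instead of a dead end.
function suggest_section(string $dead_path): string {
    // Expired agenda articles lead back to the agenda listing.
    return (strpos($dead_path, '/agenda/') === 0) ? '/agenda/' : '/';
}

http_response_code(404); // still a real 404 as far as crawlers are concerned
$section = suggest_section($_SERVER['REQUEST_URI'] ?? '/');
echo "Sorry, that article has expired. Try <a href=\"$section\">here</a> instead.";
```

The key point is that the status code stays 404 — only the body of the page becomes more useful.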
-
I agree with Aran: set up an archive system that keeps the articles at the same URL but does not show them live on the website.
Alternatively, you could set up a dumping "archive" folder where you drop all old articles, and use that URL as your rel=canonical link.
-
Hello Arcanis,
Yes, we have a destination URL for this content; I just don't know how to manage it once it disappears...
-
Hello Aran,
Thanks for your answers!
Unfortunately no. Since the content is very "dated" (e.g. a 3-day music festival), we don't keep an archive of this kind of content.
-
If you are using the canonical tag, what is the context for it? Do you already have a destination URL for these temporary articles?
-
Would it be possible to "archive" articles after the 1-6 month period?
The archive could just be a database flag that keeps articles from appearing in the article index, thus keeping the same URL while not clogging up the main site with hundreds of links to expired articles.
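The database-flag idea could look something like this. The table and column names are assumptions, sketched here only to show the shape of the approach:

```php
<?php
// Hypothetical sketch of the "archive flag": articles keep their URL,
// but archived rows are excluded from the listing page.
function listing_query(): string {
    // The /agenda listing only shows live articles.
    return 'SELECT title, url FROM articles WHERE archived = 0 ORDER BY event_date DESC';
}

function archive_expired_query(): string {
    // A nightly job flags anything past its expiry date.
    return 'UPDATE articles SET archived = 1 WHERE expires_at < CURDATE()';
}
```

Because the article row still exists, its URL keeps resolving (no 404), while the listing page stays limited to current content.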