Best practices for temporary articles
-
Hello,
I would like some expert input on the best way to manage temporary content.
In my case, I have a page (e.g. mydomain.com/agenda) that lists temporary articles, some with a lifetime of only 1 to 6 months.
Each article also has its own URL, e.g. mydomain.com/agenda/12-02-2011/thenameofmyarticle/.
As you can guess, I get hundreds of 404s.
I'm already using the canonical tag; should I use a meta robots noindex tag on the listing page? I'm a bit lost here...
-
Thank you, Egol.
-
Thanks, Richard.
I'm going to try this.
-
Thanks Aran!
-
{script to test page URL}
if ($pageHasExpired) {
    $location = "http://www.YourSite.com/";
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: {$location}");
    exit;
}
-
We have temporary content and evergreen content.
When a page of temporary content is created it is filed in a folder according to its "expiration date". On that date the folder is 301 redirected to an appropriate destination. However, before the redirect is done we run analytics on the folder to see if any files are pulling traffic from SERPs or links from other websites. We then try to create evergreen content on the same topic that will capture that traffic and redirect the specific files to the new evergreen content.
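Egol's folder-by-expiration-date scheme could be automated along these lines (a minimal Python sketch; the "expires-YYYY-MM" folder naming is a hypothetical convention, not something Egol specified):

```python
from datetime import date

def folders_to_redirect(folders, today=None):
    """Given folder names like 'expires-2011-03', return those whose
    expiration month has arrived or passed and which should now get
    their 301 redirect (after checking analytics on their contents)."""
    today = today or date.today()
    expired = []
    for name in folders:
        # Folder names encode the expiration month: expires-YYYY-MM
        _, year, month = name.split("-")
        if (int(year), int(month)) <= (today.year, today.month):
            expired.append(name)
    return expired

print(folders_to_redirect(
    ["expires-2011-01", "expires-2011-06"],
    today=date(2011, 2, 15),
))  # ['expires-2011-01']
```

A cron job could run a check like this daily and flag the expired folders for the analytics review Egol describes before the redirects go live.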
-
It seems so unnatural to want to actually remove content when we spend so long striving to create awesome content!
-
You can use the meta robots tag, as you mentioned in your question; this will prevent search engines from indexing the pages. Unfortunately, we also need to tackle the human side of the issue: if anyone links to the article, the link will eventually result in a 404 page.
There is nothing wrong with a 404 page; they serve an important purpose. Since your articles are not around very long and are not being indexed by search engines, I see no reason not to simply leave the 404s in place.
Ensure you have a custom 404 page that is an informative and helpful resource rather than a simple "404 Page not found" message. Use the 404 to direct the visitor to a category-level page related to the topic of the article, and offer a simple list of links to parts of the site that may be of interest.
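For reference, the meta robots tag discussed above is a single line in the head of each temporary article page:

```html
<meta name="robots" content="noindex, follow">
```

"noindex, follow" keeps the page out of the index while still letting crawlers follow its links; use plain "noindex" if you want links ignored too.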
Check out the SEOmoz articles
www.seomoz.org/blog/personalizing-your-404-error-pages
www.seomoz.org/blog/are-404-pages-always-bad-for-seo
Hope this helps.
-
I agree with Aran: set up an archive system that keeps the articles at the same URL but does not show them live on the website.
Alternatively, you could set up a catch-all "archive" folder where you drop all old articles, and use that URL as your rel=canonical link.
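In markup terms, that alternative would mean every expired article pointing its canonical at the archive page (URL here is hypothetical):

```html
<link rel="canonical" href="http://mydomain.com/agenda/archive/">
```

Note that search engines treat rel=canonical as a hint, and the archive page should be a reasonable stand-in for the expired articles pointing at it.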
-
Hello Arcanis,
Yes, we have a destination URL for this content; I just don't know how to manage it once the article disappears...
-
Hello Aran,
Thanks for your answers!
Unfortunately no; since the content is very "dated" (e.g. a 3-day music festival), we don't keep an archive of this kind of content.
-
If you are using the canonical tag, what is the context for it? Do you already have a destination URL for these temporary articles?
-
Would it be possible to "archive" articles after the 1-6 month period?
The archive could just be a database flag that keeps the articles from appearing in the article index, thus keeping the same URL while not clogging up the main site with hundreds of links to expired articles.
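That archive flag could look something like this (a minimal sketch using SQLite; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE articles (
    slug TEXT PRIMARY KEY,
    title TEXT,
    archived INTEGER DEFAULT 0  -- flag: hidden from listings, URL still live
)""")
conn.execute("INSERT INTO articles VALUES ('festival-2011', '3-Day Festival', 0)")
conn.execute("INSERT INTO articles VALUES ('old-event', 'Old Event', 1)")

# Listing page: only live articles, so expired ones stop clogging the index
listing = [r[0] for r in conn.execute(
    "SELECT slug FROM articles WHERE archived = 0")]

def fetch(slug):
    """Article page: resolves by slug, archived or not, so the URL never 404s."""
    row = conn.execute(
        "SELECT title FROM articles WHERE slug = ?", (slug,)).fetchone()
    return row[0] if row else None

print(listing)             # ['festival-2011']
print(fetch("old-event"))  # Old Event
```

Expiring an article is then a one-column UPDATE rather than a deletion, which sidesteps the 404 problem entirely for anything that was ever linked.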