Update old article or publish new content and redirect old post?
-
Hi all,
I'm targeting a keyword that we used to rank quite well for, but over the last couple of months traffic for that keyword (and its variations) has been declining.
I've written an extensive new post on the same topic, much more in depth, growing it from 600 to 1,800 words.
Is it better to update the old article and note that it was recently updated, or to publish the new post and redirect the old post to it?
-
I have lots of articles that I update when I get new info. Some started out as very short pages with a photo; over time they have grown to 2,000 words with lots of photos, graphs, data, and sometimes video.
It is much more efficient to update an article than to create a new page and set up redirects. Those redirects also place a small load on the server that can add up over time if you accumulate a lot of them.
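If you do decide a redirect is the right call, a single 301 rule is usually all it takes. Here is a minimal sketch for an Apache .htaccess file (the paths and domain are placeholders, not your actual URLs; the syntax assumes mod_alias is enabled, and WordPress or other CMS sites may prefer a redirect plugin instead):

```apache
# Permanently (301) redirect the old post to its expanded replacement.
# Replace both paths with your real slugs before using.
Redirect 301 /old-post-slug/ https://www.example.com/new-post-slug/
```

A 301 tells search engines the move is permanent, so ranking signals pointing at the old URL are consolidated onto the new one over time.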
-
Thanks a lot!
-
There is absolutely no harm in updating those old articles, Joris. The old URL is probably still indexed by Google, and updating it lets you easily add more up-to-date data.
There are some nice tips in this article.
-Andy