Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (here I mean changing the publish date from the original publish date to today's date - not publishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites, where we have several thousand articles going all the way back to the late '90s. Here is our process (I am not including how to select articles here, just what to do once they are selected).
1) Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2) Republish with a new date on the page. Sometimes add an editor's note on how this is an updated version of the older article.
3) Keep the same URL to preserve link equity etc., or 301 to a new URL if needed (a minimal redirect sketch follows this list).
4) Mix these in with new articles as part of our publication schedule.
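If you do 301 a post to a new URL, the redirect can be as simple as a small lookup near the top of the site's front controller. Below is a minimal PHP sketch under that assumption; the domain and the paths in the map are hypothetical placeholders, not our actual URLs:

<?php
// Hypothetical map of retired URLs to their updated replacements.
$redirects = [
    '/blog/2012-keyword-research-guide' => '/blog/keyword-research-guide',
];

$path = parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );

if ( isset( $redirects[ $path ] ) ) {
    // A 301 tells Google the move is permanent, so the old URL's
    // link equity is consolidated onto the new one.
    header( 'Location: https://www.example.com' . $redirects[ $path ], true, 301 );
    exit;
}

On Apache you can get the same effect with a Redirect 301 rule in .htaccess instead of touching application code.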
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John M. and Gary I. have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all; our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but also converted better. Doing this will only benefit your visitors, which usually translates into Google liking the result.
I would ask: why take a few months where you only recycle content instead of mixing it up all year long? If you were going to designate three months of the year to just updating content, why not take the third week of every month, or every Wednesday, and do the same thing instead? You accomplish the same thing but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have something new in our process. Previously, we only marked up the publication date in schema, so when we republished, we would change the schema publication date to the new pub date as well. Now that Google requires both a publication date and a last-modified date in schema, we have changed our process: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark up the date of the republish as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
We are also displaying the last-modified date to the user as the primary date, with the publication date made secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this to work properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end, I think we are giving better signals to Google and to users about the status of our articles.
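For illustration, here is a minimal sketch of what that can look like in a generic PHP template, assuming the CMS exposes both dates; the $post field names are made up for the example:

<?php
// Hypothetical CMS record; real field names will vary by platform.
$post = [
    'title'     => 'How We Update Old Content',
    'published' => '2014-03-02', // original publication date, never changed
    'modified'  => '2024-01-15', // date of the latest republish/update
];
?>
<!-- The update date is shown as the primary, user-facing date. -->
<p>Updated <?php echo htmlspecialchars( $post['modified'] ); ?>
(originally published <?php echo htmlspecialchars( $post['published'] ); ?>)</p>

<script type="application/ld+json">
<?php
// datePublished keeps the original date; dateModified carries the update,
// mirroring the visible dates above.
echo json_encode( [
    '@context'      => 'https://schema.org',
    '@type'         => 'Article',
    'headline'      => $post['title'],
    'datePublished' => $post['published'],
    'dateModified'  => $post['modified'],
] );
?>
</script>

Keeping the visible dates and the structured-data dates in agreement also lines up with Google's guidance that dates in markup should match what the page shows.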
-
You'll probably experience a dip from not publishing new content, but I don't believe there will be any other issues.
Updating old content (drip-fed or in bulk) won't trigger any spam/manipulation flags.
Related Questions
-
To update or not to update news URLs?
We manage a huge daily news website in my small country - keeping this a bit mysterious in case competitors are reading 🙂 Our URL structure is www.companyname.com/news/categoryofnews/title-of-article?id=articleid
In this hyperreactive news world, article titles change frequently (maybe ten times a day for the main stories). The question we debate is: should we reflect the modification of the title in the URL or not?
Example: "Trump says he wants to ban search engines" would have the URL http://www.companyname.com/news/entertainment/Trump-says-he-wants-to-ban-search-engines?id=12345678 Later in the day the title becomes "Trump denies he suggested banning search engines". Should the URL be modified to http://www.companyname.com/news/entertainment/Trump-denies-he-suggested-banning-search-engines?id=12345678 (option A) or not (option B)?
In Google News it makes no difference because of the sitemap, but in Google organic things are different. At present (option B in place), Google apparently doesn't see that the article has been updated, and shows the initial timestamp, which is visually (and presumably SEO-wise) not good: our new news looks like old news. Modifying the URL would solve that issue but could maybe create another one: the new URL, being considered a new article, would lose the acquired weight of the previous one in terms of referrals, social traffic and so on. Or not? What do you think is the best option? Thanks for your expertise, Yves
On-Page Optimization | yves678901
-
Help With Duplicated Content
Hi Moz Community, I am having some issues with duplicated content. I recently removed the .html from all of our links, and Moz has reported it as being duplicated. I have been reading up about canonicalization and would like to verify some details: when using the canonical tag, would it be placed in /mywebpage.html or /mywebpage? I am having a hard time sorting this out, so any help from you SEO experts would be great 🙂 I have also updated my htaccess file with the following. Thanks in advance
On-Page Optimization | finelinewebsolutions0
-
Google cache tool help
This link is for the eBay Google cache - http://webcache.googleusercontent.com/search?q=cache:www.ebay.com&strip=1 I wanted to do the same for my homepage, so I switched out the URLs and it worked. When I try to get a different link in there, such as mysite.com/category, it won't work. I know my pages are indexed. Any ideas why it won't work for other pages?
On-Page Optimization | EcommerceSite0
-
Posting content from our books to our website
Hello, I am the newly appointed in-house SEO person for a small business. The founders of our company have written several books, which we sell, but book sales are a small part of our business. We are considering posting to our website some or all of the content of the books. This content is directly relevant to the existing content of our website and would be available for free to all visitors.
1. Is it likely that the traffic and links to the new book pages would improve the search engine rankings of our existing pages?
2. We already have PDF versions of each book we could post, which are formatted nicely. Should we convert these to HTML to make them more friendly to search engines?
3. Of course, we would have to split each book into multiple web pages, perhaps one chapter per page. How much content could each new page optimally accommodate?
4. Would it be more valuable from an SEO perspective to post pieces of the books over time in a blog format?
Thank you very much for your thoughts!
On-Page Optimization | nyc-seo0
-
Duplicate content issue
Hello, I have a duplicate content issue on my home page: examplesite.com and examplesite.com/index.html are duplicate-content URLs. If in index.html I use a 301 redirect like this:

<?php
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://examplesite.com" );
exit;
?>

would I lose any page authority? Sorry for the newbie question.
On-Page Optimization | digitalkiddie
-
Removing syndicated duplicate content from website - what steps do I need to take to make sure Google knows?
Hey all, So I've made the decision to cancel the service that provides my blog with regular content/posts, since it seems that having duplicate content on my site isn't doing me any favors. I'm on a WordPress system - I'll be exporting the posts so I have them for reference, and then deleting the posts. There are 150 or so. What steps should I take to ensure that Google learns of the changes I've made, or do I not need to do anything at all in that department? Also, I've assumed that the best decision would be to remove the content from my blog. Is that the best way to go, or should I leave it in place and start adding unique content? (My guess is that I need to remove it...) Thanks for your help, Kurt
On-Page Optimization | KurtBullock0
-
Footer Content
We currently have footer content contained in a single PHP include file that is included on every page and contains the following:
- The most recent 3 tweets from our Twitter feed
- Snippets of our 3 most recent blog posts
- Navigation links to our main pages (essentially the same as our main navigation in the header)
Is this good/bad?
On-Page Optimization | NeilD0
-
Removing old URLs from Google
Hello, I am sure that this question has been asked many times, but I am still not sure what to do about the following: our site's URL structure has changed a few times in the past few months. Recently, we changed our URLs to become more SEO friendly. However, Google has indexed the old URLs as well. To give an example, the page "Confúcio e Seus Ensinamentos" on our website shows the following URLs in Google Webmaster Tools:
/artigo/68_38/2/as_religioes_iv_confucio_e_seus_ensinamentos/
/aula/14_6132/vestibular/confucio_e_seus_ensinamentos/
/aula/1_14_6132/vestibular/confucio_e_seus_ensinamentos/
/aula/_14_6132/Vestibular/confucio_e_seus_ensinamentos/
/aula/ensino/confucio_e_seus_ensinamentos/
The correct URL is the last one. What should I do about the other ones? Almost all the pages on our website have this problem. We have redirected the old URLs to the new ones, but is there anything else we should do? We were asking Google to remove them, but Google has informed us that it has reached the limit. Please advise us on what we should do. We have removed the old sitemap with the old URLs. What else must we do? Thank you very much.
On-Page Optimization | Tev0