Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (here I mean changing the publish date from the original date to today's date - not publishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1) Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2) Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of an older article.
3) Keep the same URL to preserve link equity etc., or 301 to a new URL if needed.
4) Mix these in with new articles as part of our publication schedule.
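On step 3, if the URL really does have to change, a server-level 301 is the safest way to carry over link equity. For a site on Apache, it can be a one-liner in the site config or .htaccess (the paths here are hypothetical examples, not from the original post):

```apache
# Permanently redirect the old article URL to its updated home.
# Requires mod_alias; place in .htaccess or the vhost config.
Redirect 301 /blog/2009/old-widget-guide /blog/widget-guide
```

Whenever possible, though, keeping the original URL is simpler and avoids any redirect dilution.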
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John M. and Gary I. have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all. Our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but also converted better. Doing this will benefit your visitors, which usually translates into Google liking the result.
I would ask: rather than taking a few months where you only recycle content, why not mix it up all year long? If you were going to designate three months of the year to updating content, you could instead take the third week of every month, or every Wednesday, and do the same thing. You accomplish the same amount of work but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have recently changed something in our process here. Previously, we only marked up the publication date in schema, so when we republished, we would change that schema date to the new publish date. Now that Google wants both a publication date and a last-modified date in schema, we have changed our approach: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark the date the update goes live as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
We are also displaying the last-modified date to the user as the primary date, with the publication date made secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this working properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end, I think we are giving better signals to Google and users about the status of our articles.
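The markup described above might look roughly like this (a minimal JSON-LD sketch; the headline, URL, and dates are placeholders, and `datePublished`/`dateModified` are the standard schema.org Article properties):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example: Our Updated Guide",
  "mainEntityOfPage": "https://www.example.com/updated-guide/",
  "datePublished": "1999-06-15",
  "dateModified": "2024-03-27"
}
</script>
```

The key point is that `datePublished` stays fixed at the original publication date across republishes, while `dateModified` moves forward with each update.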
-
You'll probably experience a dip from not publishing new content, but I don't believe there will be any other issues.
Updating old content (drip-fed or in bulk) won't trigger any spam/manipulation flags.