Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing it (by which I mean changing the publish date from the original date to today's date, not publishing on other sites).
I'm wondering whether there is any danger in doing this at scale: designating a few months out of the year where we don't publish brand-new content and instead focus on taking our old blog posts, updating them, and changing the publish date (~15 posts/month). We have a huge archive of old posts we believe we can add value to and republish to benefit our community and organic visitors.
It seems like we could add a lot of value for readers by doing this, but I'm a little worried that Google might somehow see it as manipulative or spammy, or that it could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites, where we have several thousand articles going all the way back to the late '90s. Here is our process (I am not covering how to select articles here, just what to do once they are selected).
1) Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2) Republish with a new date on the page. Sometimes we add an editor's note explaining that this is an updated version of an older article.
3) Keep the same URL to preserve link equity, or 301 to a new URL if needed.
4) Mix these in with new articles as part of our publication schedule.
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John Mueller and Gary Illyes have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all; our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but also converted better. Doing this will only benefit your visitors, which usually translates into Google liking the result.
I would ask: why designate a few months where you only recycle content, instead of mixing it up all year long? If you were going to devote three months of the year to updating content, why not instead take the third week of each month, or every Wednesday, and do the same thing? You accomplish the same amount of work, but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have added something new to our process. Previously, we only marked up the publication date in schema, so when we republished, we changed that schema date to the new publish date. Now that Google looks for both a publication date and a last-modified date in schema, we have changed our approach: when we republish content, we leave the original publication date marked up as the publication date (datePublished) and mark the date of republication as the last-modified date (dateModified). This is a much clearer and more accurate representation to Google of what you are doing with the article.
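For illustration, here is a minimal JSON-LD sketch of that approach; the headline, URL, and dates are placeholder values, not a prescription for your own markup:

```html
<!-- datePublished keeps the original date; dateModified carries the republish date -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "mainEntityOfPage": "https://www.example.com/blog/example-article/",
  "datePublished": "2012-06-14",
  "dateModified": "2023-03-01"
}
</script>
```

Each time the article is meaningfully updated and republished, only dateModified changes; datePublished stays fixed at the original date.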
We also display the last-modified date to the user as the primary date, with the publication date shown as secondary. The intent is to show users that the article has been recently updated, so they know the information is current.
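As a rough sketch of how the two dates might be presented on the page (the class names, wording, and dates are placeholders):

```html
<!-- The updated date leads so readers see the content is current;
     the original publish date is kept but de-emphasized -->
<p class="article-dates">
  <time datetime="2023-03-01">Updated March 1, 2023</time>
  <small>Originally published <time datetime="2012-06-14">June 14, 2012</time></small>
</p>
```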
To get this working properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end, I think we are giving Google and users better signals about the status of our articles.
-
You'll probably experience a dip from not publishing new content, but I don't believe there will be any other issues.
Updating old content (drip-fed or in bulk) won't trigger any spam or manipulation flags.