Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value) and republishing (by which I mean changing the publish date from the original date to today's date, not publishing the content on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1. Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2. Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of an older article.
3. Keep the same URL to preserve link equity, or 301-redirect to a new URL if needed.
4. Mix these in with new articles as part of our publication schedule.
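For step 3, when a URL does have to change, the link equity is typically preserved with a server-level 301. As a minimal sketch (assuming an Apache server; the slugs here are hypothetical examples):

```apache
# Permanently (301) redirect an old article URL to its updated home,
# preserving link equity. Requires mod_alias; slugs are made-up examples.
Redirect permanent /blog/old-article-slug /blog/updated-article-slug
```

On nginx or another server the directive differs, but the principle is the same: one permanent redirect per moved article, old URL to new.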
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. Google's John Mueller and Gary Illyes have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all. Our content is better overall, and in some cases where we really focused on conversion, we not only got more traffic but converted better. Doing this will only benefit your visitors, which usually translates into Google liking the result.
I would ask: rather than designating a few months where you only recycle content, why not mix it up all year long? If you were going to dedicate three months of the year to updating content, you could instead take the third week of every month, or every Wednesday, and do the same thing. You accomplish the same amount of work but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have recently changed something in our process here. Previously, we only marked up the publication date in schema, so when we republished, we would change that schema publication date to the new date. Now that Google wants both a publication date and a last-modified date in schema, we have changed our approach: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark the date the article is being republished as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
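The markup change described above can be sketched in JSON-LD like this (the headline and dates are made-up examples; the key point is that `datePublished` stays fixed at the original date while `dateModified` moves forward with each republish):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example: An Updated Evergreen Post",
  "datePublished": "2012-06-14",
  "dateModified": "2021-03-02"
}
```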
We also display the last-modified date to the user as the primary date, with the publication date shown secondarily. The intent is to show users that the article has been recently updated, so they know the information is current.
To get this working properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end I think we are giving better signals to both Google and users about the status of our articles.
-
You'll probably experience a dip from not publishing new content but I don't believe there will be any other issues.
Updating old content (drip fed or in bulk) won't trigger any spam/manipulation flags.