Content relaunch without content duplication
-
We write content for blogs and websites (or at least we try), especially blogs. Sometimes a few posts don't get a good response or reach. The cause could be uninteresting content, a weak title, bad timing, or even the language used.
My question for discussion: what would you do if you found content that deserved the audience's attention but missed it during its original launch? Is it fine to improve the text and context and relaunch it? For example:
1. Rechristen the blog post: change the title to make it more attractive
2. Add images
3. Check spelling
4. Do any necessary rewriting and proofreading
5. Bring the timeline up to date: add more recent statistics and references to recent write-ups (external and internal blogs, for example), and change anything that seems outdated

Also, change the title and set rel=canonical / 301 permanent redirects to the new URL.
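For reference, the rel=canonical mentioned in the last step is a single tag in the `<head>` of the old page pointing at the relaunched version (the URL here is a hypothetical example):

```html
<!-- On the OLD post: tells search engines the relaunched URL is the preferred version -->
<link rel="canonical" href="https://example.com/blog/relaunched-post-title">
```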
Will the above make the blog post new? Any ideas or tips? Basically, we'd like to refurbish (:-)) content that didn't succeed in the past and relaunch it for another try. If we do so, will there be any issues with Google's bots? (I hope the redirect would solve this, but I still want to make sure.)
Thanks,
-
Hi there
It sounds like you've got all the bases covered. To avoid any form of duplication, it's important to redirect the old version to the new URL as soon as possible, ideally at the same time the new post is published. That ensures the two won't be indexed together, and it makes sense that an updated post would be redirected.
There are other methods as well: you could simply remove the old blog post page and return a 404 (which isn't a bad thing), update the canonicals as you say, or add a robots noindex instruction to keep the page out of the index. But I think a 301 redirect would be best here: it passes on any link equity (no matter how small) the previous page earned, it's one of the quickest ways to get old pages removed from Google's index, and it makes sense for continuity's sake too.
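As a concrete example, on an Apache server the 301 option can be a one-line rule in the site's `.htaccess` file (both paths here are hypothetical):

```apache
# Send visitors and crawlers from the old post URL to the relaunched one
Redirect 301 /blog/old-post-title /blog/relaunched-post-title
```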
The fact that you're changing the old post in a substantial way might mean you could keep both versions on your site, but you're the best judge of how unique the new version is. To remove any doubt, I'd use the 301.
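If you end up relaunching several posts at once, a small script can generate the redirect rules from a mapping of old to new paths rather than writing them by hand. This is just a sketch; the paths and file name are hypothetical examples:

```python
# Sketch: generate Apache "Redirect 301" rules from a mapping of
# old blog paths to their relaunched counterparts (hypothetical paths).
redirects = {
    "/blog/old-post-title": "/blog/relaunched-post-title",
    "/blog/another-stale-post": "/blog/another-refreshed-post",
}

def htaccess_rules(mapping):
    """Return one 'Redirect 301 old new' line per entry in the mapping."""
    return "\n".join(f"Redirect 301 {old} {new}" for old, new in mapping.items())

print(htaccess_rules(redirects))
```

The output can be pasted into `.htaccess` (or adapted for nginx `rewrite` rules), which keeps the old-to-new mapping in one reviewable place.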
Hope this helps.