Website Redesign - Duplicate Content?
-
I hired a company to redesign our website. There are many pages, like the example below, where we are downsizing content by 80% (believe me, not my decision).
Current page: https://servicechampions.com/air-conditioning/
New page (on test server): https://servicechampions.mymwpdesign.com/air-conditioning/
My question to you is: that 80% of content I am losing in the redesign, can I republish it as a blog? I know that Google has it indexed. The old page has been live for 5 years, but now 80% of it will no longer be live. So can it become a blog and gain new (or keep) SEO value? What should I do with the 80% of content I am losing?
-
Hi Camilo,
Thanks for the clarification. As this content will no longer be available on the "old" pages and will only exist on the newly created blog pages, there will be no duplicate content issues. These newly created blog pages, built from the content removed from the "old" pages, will start from scratch for ranking, and the "old" pages could lose some ranking because you reduced their on-page content.
This is not necessarily bad, as it is part of your strategy to improve conversions (which should be the most important KPI anyway).
You could help these new blog pages a bit by linking to them from the "old" page.
-
Hello Ramon and James,
Sorry for the confusion.
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> current site (accessible directly from the homepage through the main navigation)
will also be available on the new site, with the same URL.
The reason I posted this question is that on the new site I will be reducing the content on this page (and others) by 80%. That 80% worth of content from the current pages in question will not exist on the new site. It will not be there because all the content on some pages has been judged too exhausting for clients to read before converting to a lead. We have created the new website, with the same URLs, with leaner content to drive more conversions and be less overwhelming to site visitors.
For example, on this page: https://servicechampions.com/air-conditioning/
there are two segments under subheadings: "What is an air conditioning system?" and "How an air conditioner works."
Both of these segments will not be in the new website. Well, I want to republish the lost content as new blogs. My question was: since the soon-to-be-lost content is currently indexed in Google, if I republish it as a new set of blog posts after the new site goes live, will Google see the new blogs as duplicate content, since it has already been indexed?
-
Hi Camilo,
Now I am really confused: you want to maintain the 80% of pages, and you said they will keep the same URL when you make them accessible in your blog.
so
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> current site (accessible directly from the homepage through the main navigation)
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> new site (accessible through blog)
then why would you need a canonical at all, and why would you lose rank (besides some loss due to the fact that the page is no longer linked directly from the homepage but from the blog)?
Maybe i am missing something
-
Hi Camilo,
Interesting, as Google states "Only include critic reviews that have been directly produced by your site..." at https://developers.google.com/search/docs/data-types/reviews#local-business-reviews.
I can only imagine they didn't realize these reviews were not produced directly by your site because of the way you implemented them in the footer.
-
Thank you Roman for your response.
I hadn't realized that Google will recalculate the rank of the new page. My concern is that the new page (although keeping the same URL) will lose some ranking.
If I create a new blog post with the content that is not used on the new page (same page URL), and I use a canonical tag on the new blog post pointing to the redesigned page, will the new blog post be indexed and possibly outrank the page I am transferring content from?
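For illustration, a canonical tag on such a blog post would sit in its `<head>` and point back to the redesigned page (the blog URL shown here is hypothetical):

```html
<!-- In the <head> of the new blog post (hypothetical blog URL) -->
<link rel="canonical" href="https://servicechampions.com/air-conditioning/" />
```

Worth noting: a page whose canonical points elsewhere is typically consolidated into the target URL, so in general it won't be indexed or ranked separately, which means it is unlikely to outrank the page it canonicalizes to.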
-
Hello Ramon,
Interesting that you mention the schema markup. It will be on the new site as well; just yesterday I asked the redesign company to include it.
Last week I did receive a Google message through Webmaster Tools (Search Console) stating that I had potentially spammy markup. So what I did was add the reviews (from Yelp, Google, Facebook, BBB) to the footer of my website; they were not on the site prior to last week. Once I added them, I filed a reconsideration request explaining what I did and why. Google responded saying they approved it, and they removed the manual action. So once again our website displays star ratings in the SERPs (see attached image). They were showing prior to last week's manual action, they were removed when I received the manual action, and after I added the reviews to the site's footer and filed a reconsideration request, the manual action was lifted and the ratings reappeared in the SERPs.
The new site will keep all of its URLs; they will not change, just the content on a few core pages. So I gather that it is OK to turn the content that will be deleted into a blog.
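For reference, the kind of first-party review markup Google's guidelines describe might look like this JSON-LD sketch (the business type, rating value, and review count here are placeholders, not your real figures):

```json
{
  "@context": "https://schema.org",
  "@type": "HVACBusiness",
  "name": "Service Champions",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "312"
  }
}
```

Per Google's guidelines, the reviews behind numbers like these should be collected and displayed by your own site, rather than aggregated from third parties such as Yelp, Google, Facebook, or BBB.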
-
Hello James,
I appreciate your two cents, greatly. I too am not a huge fan of the new site, but I am giving it a shot. Our current site is content-heavy and ranks well for our terms. The change in design is geared toward converting more leads (calls and forms) and is aimed at a less technical audience: people looking to fix their AC. I just hope the new site keeps its rankings. All URLs will remain the same.
-
If a page is useful to your users or audience, you don't delete it.
In your case it is not your decision, so you have two alternatives:
1. Redirect those pages to another page with similar content (the target has to be better content than the original).
2. Add a canonical tag; this will essentially transfer the authority of the old page to the new one.
But there is one factor you need to keep in mind: the URL. If your new page will use the URL of the old page, there is no reason to keep the old page live, because from Google's perspective you have replaced it.
Example:
https://servicechampions.com/air-conditioning/ ----> Old page with old content
https://servicechampions.com/air-conditioning/ ----> New page with new content
From Google's perspective your new page is replacing the old one, so Google needs to recalculate the rank of the page (links, content, UX, etc.). That holds no matter whether you republish the content on your blog and add canonical tags.
Example:
https://servicechampions.com/blog/air-conditioning/ ----> Old page with old content
To Google the constant parameter is the URL; if you change it, you change the equation. My advice: don't change the URL structure. Keep the same URLs and add improvements.
If you do have to change a URL, for example:
https://servicechampions.com/air-conditioning/ ----> Old page with old content
https://servicechampions.com/service-air-conditioning/ ----> New page with new content
then to avoid duplicate content issues, add the canonical tag to the older page; that way you will transfer the authority from the old page to the new one.
This article will help you a lot:
A Step-by-Step Guide to Updating Your Website Without Destroying Your SEO
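As a sketch of the redirect option above, a 301 in .htaccess could look like this (the retired path shown is a placeholder, not one of your actual URLs):

```apache
# Example .htaccess rule (Apache mod_alias): permanently redirect a
# retired page to the closest matching live page. Paths are placeholders.
Redirect 301 /air-conditioning/old-subpage/ https://servicechampions.com/air-conditioning/
```

A 301 passes most of the old page's link equity to the target and removes the old URL from the index over time, which is why it suits pages that are going away entirely.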
-
Hi,
As a general rule, it's fundamental to maintain pages that are relevant to your audience and generate organic traffic, so I would say yes, it's a good idea to republish them as a blog. Furthermore, a big part of these pages (though I don't know exactly which 80% you will lose) are a perfect fit for a blog, such as how-to and informational articles.
It would be good to maintain the same URLs to avoid redirects, but depending on the CMS being used that might prove more difficult. At the very least, maintain the metadata and redirect the old URLs with 301s.
I also saw you were using third-party reviews in schema markup on your current site (but not, yet, on your new site). This is not a good idea, as it is against Google's guidelines (more on this here: http://searchengineland.com/google-updates-local-reviews-schema-guidelines-257745).
Success with your new site!