Website Redesign - Duplicate Content?
-
I hired a company to redesign our website. There are many pages, like the example below, where we are downsizing content by 80% (believe me, not my decision).
Current page: https://servicechampions.com/air-conditioning/
New page (on test server): https://servicechampions.mymwpdesign.com/air-conditioning/
My question to you is: that 80% of content I am losing in the redesign, can I republish it as a blog? I know that Google has it indexed. The old page has been live for 5 years, but now 80% of it will no longer be live. So can it become a blog and gain new (or keep) SEO value? What should I do with the 80% of content I am losing?
-
Hi Camilo,
Thanks for the clarification. Since this content will no longer be available on the "old" pages and will only exist on the newly created blog pages, there will be no duplicate content issues. These new blog pages, built from the content removed from the "old" pages, will start from scratch for ranking, and the "old" pages could lose some ranking because you reduced their on-page content.
This is not necessarily bad, as it is part of your strategy to improve conversions (which should be the most important KPI anyway).
You could help these new blog pages a bit by linking to them from the "old" page.
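For example, a contextual link on the trimmed page could look like the snippet below (the blog URL is just a placeholder for wherever the removed content ends up):

<!-- On the trimmed https://servicechampions.com/air-conditioning/ page -->
<a href="https://servicechampions.com/blog/how-an-air-conditioner-works/">Read more: how an air conditioner works</a>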
-
Hello Ramon and James,
Sorry for the confusion.
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> current site (accessible directly from the homepage through the main navigation)
will also be available on the new site, with the same URL.
The reason I posted this question is that, on the new site, I will be reducing the content on this page (and others) by 80%. That 80% worth of content from the current pages in question will not exist on the new site, because all the content on some pages has been judged too exhausting for clients to read through and still convert to a lead. We have created the new website, with the same URLs, with leaner content to drive more conversions and be less overwhelming to site visitors.
For example, on this page: https://servicechampions.com/air-conditioning/
there are two segments under subheadings: "What is an air conditioning system?" and "How an air conditioner works."
Neither of these segments will be on the new website. Well, I want to republish the lost content as new blog posts. My question was: since the soon-to-be-lost content is currently indexed by Google, if I republish it as a new set of blog posts after the new site goes live, will Google see the new blog posts as duplicate content, since it has already indexed that content?
-
Hi Camilo,
Now I am really confused. You want to keep the content of those 80% pages, and you said they will keep the same URL when you make them accessible through your blog, so:
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> current site (accessible directly from the homepage through the main navigation)
https://servicechampions.com/air-conditioning/air-conditioning-installation-and-replacement/ --> new site (accessible through the blog)
Then why would you need a canonical at all, and why would you lose rank (besides some loss due to the fact that the page is no longer linked directly from the homepage but from the blog)?
Maybe I am missing something.
-
Hi Camilo,
Interesting, as Google states "Only include critic reviews that have been directly produced by your site..." on https://developers.google.com/search/docs/data-types/reviews#local-business-reviews.
I can only imagine they didn't realize these reviews were not produced directly by your site because of the way you implemented them in the footer.
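For reference, markup that stays within that guideline would only describe reviews collected on your own site. A minimal sketch (the business name, reviewer, and rating figures below are made-up placeholders):

<!-- Illustrative values only; the reviews themselves must be gathered first-party, not pulled from Yelp, Google, Facebook or BBB -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Service Champions",
  "review": {
    "@type": "Review",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" },
    "reviewBody": "Review text submitted directly on your own website."
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "120"
  }
}
</script>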
-
Thank you Roman for your response.
I hadn't realized that Google will recalculate the rank of the new page. My concern is that the new page (although keeping the same URL) will lose some ranking.
If I create a new blog post with the content that is not used on the new page (same page URL), and I use a canonical tag on the new blog post pointing to the redesigned page, will the new blog post be indexed and possibly outrank the page I am transferring content from?
-
Hello Ramon,
Interesting that you mention the schema markup. It will be on the new site as well; just yesterday I instructed the redesign company to include it. Last week I did receive a Google message through Webmaster Tools (Search Console) stating that I had potentially spammy markup. So what I did was add the reviews (from Yelp, Google, Facebook, BBB) to the footer of my website; they were not on the site prior to last week. Once I added them, I filed a reconsideration request explaining what I did and why, and Google responded saying they approved it and removed the manual action. So once again our website displays star ratings in the SERPs (see attached image). The ratings were showing before the manual action, disappeared when I received it, and reappeared after I added the reviews to the site's footer and the reconsideration request was accepted.
The new site will keep all of its URLs; they will not change, just the content on a few core pages. So I gather that it is OK to turn the content that will be deleted into a blog.
-
Hello James,
I appreciate your two cents, greatly. I too am not a huge fan of the new site, but I am giving it a shot. Our current site is content heavy and ranks well for our terms. The change in design is geared toward converting more leads (calls and forms) and is aimed at a less technical audience: people looking to fix their AC. I just hope that the new site keeps its rankings. All URLs will remain the same.
-
If a page is useful for your users or audience, you don't delete that page.
In your case it is not your decision, so you have two alternatives:
1 - Redirect those pages to another page with similar content (the target has to be better content than the original).
2 - Add a canonical tag; basically you will transfer the authority of the old page to the new one.
But there is one factor you need to keep in mind, and that factor is the URL. If your new page will use the URL of the old page, there is no reason to keep the old pages live, because from Google's perspective you have simply replaced them. Example:
https://servicechampions.com/air-conditioning/ ----> old page with old content
https://servicechampions.com/air-conditioning/ ----> new page with new content
From Google's perspective your new page is replacing the old one, so Google needs to recalculate the rank of the page (links, content, UX, etc.). That happens no matter what, even if you republish the content on your blog and add some canonical tags. Example:
https://servicechampions.com/blog/air-conditioning/ ----> old page with old content
To Google the constant parameter is the URL; if you change it, you change the equation. My advice: don't change the URL structure. Keep the same URL structure and add some improvements. Example:
https://servicechampions.com/air-conditioning/ ----> old page with old content
https://servicechampions.com/service-air-conditioning/ ----> new page with new content
And then, to avoid duplicate content issues, add the canonical tag to the older page; that way you will transfer the authority from the old page to the new one, as in the sketch below.
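For illustration, with the example URLs above, the canonical goes in the <head> of the old page and points to the new one (a sketch only; adjust to your real URLs):

<!-- On https://servicechampions.com/air-conditioning/ (old page) -->
<!-- Tells Google that the preferred version of this content lives at the new URL -->
<link rel="canonical" href="https://servicechampions.com/service-air-conditioning/" />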
Reading this article will help you a lot:
A Step-by-Step Guide to Updating Your Website Without Destroying Your SEO
-
Hi,
As a general rule it is fundamental to maintain pages that are relevant for your audience and generate organic traffic, so I would say yes, it is a good idea to republish them as a blog. Furthermore, from what I can see (though I don't know exactly which 80% you will lose), a big part of these pages are a perfect fit for a blog: how-to and informational articles.
It would be good to maintain the same URLs to avoid redirects, but depending on the CMS being used that might prove more difficult. At the very least, maintain the meta data and set up 301 redirects.
I also saw you were using third-party reviews in schema markup on your current site (but not yet on your new site). This is not a good idea, as it is against Google's guidelines (more on this here: http://searchengineland.com/google-updates-local-reviews-schema-guidelines-257745).
Success with your new site