Error msg 'Duplicate Page Content', how to fix?
-
Hey guys,
I'm new to SEO and have the following error msg 'Duplicate Page Content'. Of course I know what it means, but my question is: how do you delete the old pages that have duplicate content?
I used to run my website on Joomla! but have since moved to Shopify. I can see that the duplicated content is still from the old Joomla! site, and I would like to learn how to delete this content (or the best practice in this situation).
Any advice would be very helpful!
Cheers,
Peter
-
I don't think the problem is https vs. http here.
Maybe you could share one of the errors with me?
Fredrik
-
Hi Fredrik,
Thanks for the follow-up! I got the error msg from my SEOmoz dashboard. I just got it a couple of days ago, and the first crawl has been completed. I can also see a similar error msg in Google Webmaster Tools. The old pages are on an older SSL-secured version of the site, but the domain is still the same.
For example, my old site was https://www.kellanapparel.com. Now I don't use the SSL certificate and the website is hosted on a new platform (Shopify), but with the same domain minus the 's' (so now it's http://www.kellanapparel.com). Maybe the old SSL certificate is still valid, and because the old content is still associated with it, it's still being crawled? Would this make a difference?
Cheers,
Peter
-
Hi Peter
When you say you got the message 'Duplicate Page Content', is this from Google Webmaster Tools?
You mention an old page? Is that still active under a different domain?
To avoid the duplicate content error you need to make sure that there is only ONE URL for each piece of content. There are many ways of achieving this; here are a few:
Canonical tags
Try to use canonical tags wherever you can. They can be implemented on both versions; the original version should have a tag pointing to itself. More info on exactly how to do this here:
http://www.seomoz.org/learn-seo/canonicalization
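As a minimal sketch (the product path is hypothetical, only the domain comes from this thread), the canonical tag is a single line in the `<head>` of every duplicate version, pointing at the one URL you want indexed:

```html
<!-- Placed in the <head> of each duplicate page, and on the canonical page itself -->
<link rel="canonical" href="http://www.kellanapparel.com/products/example-product" />
```

The same absolute URL goes on both the http and https versions, which tells search engines to consolidate the two into one indexed page.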
Remove old site from index
Make sure the old system is no longer indexed. Do this using either robots.txt or a noindex, nofollow meta tag.
More on robots.txt here:
http://www.seomoz.org/learn-seo/robots.txt
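A hedged sketch of both options (the paths are illustrative, not taken from this thread). A robots.txt at the site root can block crawling of the old Joomla! URL patterns:

```text
# robots.txt -- blocks crawling of typical Joomla! paths (adjust to your actual old URLs)
User-agent: *
Disallow: /index.php
Disallow: /component/
```

Alternatively, a meta tag on each old page keeps it out of the index:

```html
<meta name="robots" content="noindex, nofollow">
```

One caveat: pick one approach per page. If robots.txt blocks crawling, search engines can never see the noindex tag on that page, so already-indexed URLs may linger in the index.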
301 redirects
Try to use 301 redirects pointing any old page to the new page. You can also use them to point non-www pages to www, or the other way around. If you have old pages on a different system, make sure these are not online. If they are still active on a different domain, set up 301 redirects to the new pages.
More on this here:
http://www.seomoz.org/learn-seo/redirection
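As a sketch of what this looks like on a self-hosted Apache server (Shopify manages redirects through its own admin rather than .htaccess, so this would only apply to a server you control), an .htaccess file can send all https traffic to http and all non-www traffic to www in one place:

```apache
# .htaccess -- 301-redirect https to http, and bare domain to www
# Domain from this thread; rules assume Apache with mod_rewrite enabled
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.kellanapparel.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} ^kellanapparel\.com$ [NC]
RewriteRule ^(.*)$ http://www.kellanapparel.com/$1 [R=301,L]
```

The 301 status code tells search engines the move is permanent, so they transfer the old URL's ranking signals to the new one.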
You can find more info on how to avoid this error on this post:
http://www.seomoz.org/learn-seo/duplicate-content
Good luck and hope these tips can help you get started.
Fredrik