Duplicate page content errors
-
My site was just crawled, and the report shows many duplicate pages but doesn't tell me which ones are duplicates of each other. For you experienced duplicate-page experts: do you have a subscription with Copyscape and pay $0.05 per test? What's the best way to clear these errors? Thanks in advance.
-
Yes, the new Moz Analytics doesn't show you which page is conflicting the way the old system did. I hope they fix this soon.
-
Yes, that works, but it's time-consuming for me to sort through. I'm still wondering why I can't get the report to show the duplicate pages. Anyone else have this problem?
-
Nope, none of the associated links show which pages are duplicates of each other. Any suggestions? I had my programmer test this, and he got the same results.
-
Also, when you click into the report you're able to export a CSV: "See the full list of issues by downloading your Crawl Diagnostics export file (.csv)." That will help you line things up pretty quickly.
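Once you have that export, a few lines of scripting can line the duplicates up for you. Here's a rough sketch that groups URLs sharing an identical page title; the column names (`URL`, `Page Title`) are assumptions for illustration, so check them against the headers in your actual Crawl Diagnostics CSV:

```python
import csv
import io
from collections import defaultdict

def group_duplicates(csv_text, key_field="Page Title", url_field="URL"):
    """Group crawled URLs by a shared field (e.g. identical titles)
    so duplicate clusters can be lined up side by side."""
    groups = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        groups[row[key_field]].append(row[url_field])
    # Keep only clusters with more than one URL -- those are the duplicates.
    return {key: urls for key, urls in groups.items() if len(urls) > 1}

# Toy export with the assumed columns:
sample = """URL,Page Title
http://example.com/a,Widgets
http://example.com/b,Widgets
http://example.com/c,Gadgets
"""
print(group_duplicates(sample))
```

The same idea works with a content-hash column or any other field the export uses to flag duplication.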
-
I'm pretty sure that if you click on the duplicate error, it lists the pages causing the issue.
Edit: yeah, just checked, and it does!
Related Questions
-
Hi, how do you get rid of duplicate content that was accidentally created on a tag URL? For example, when I published a new article, the content was duplicated at /posts/tag/lead-generation/,
while the original article was created at /posts/shippers-looking-for-freight-brokers/. How can I fix this so a new URL isn't created every time I add a tag to a new post?
On-Page Optimization | treetopgrowthstrategy
-
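Two common fixes for duplicated tag archives are a canonical link pointing at the original article, or keeping tag pages out of the index entirely. A sketch of what either could look like in the tag page template, assuming you can edit it (the paths are from the question; `example.com` stands in for the actual domain):

```html
<!-- Option 1: on /posts/tag/lead-generation/, declare the original post canonical -->
<link rel="canonical" href="http://example.com/posts/shippers-looking-for-freight-brokers/" />

<!-- Option 2: let tag archives be crawled but keep them out of the index -->
<meta name="robots" content="noindex,follow" />
```

Option 2 is usually the safer default, since a tag archive can list many posts and canonicalizing it to a single one only makes sense when the pages really are near-identical.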
Do permanent redirects solve the issue of duplicate content?
Hi, I have a product page on my site, shown below. www.mysite.com/Main-category/SubCatagory/product-page.html This page was accessible both ways: 1. www.mysite.com/Main-category/SubCatagory/product-page.html 2. www.mysite.com/Main-category/product-page.html This was causing a duplicate title issue, so I permanently redirected one to the other. But after more than a month and many crawls, the Webmaster Tools HTML Improvements report still shows the duplicate title issue. My question is: does a permanent redirect solve the duplicate content issue, or am I missing something?
On-Page Optimization | Kashif-Amin
-
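A 301 is the right tool for consolidating the two URLs, but the Webmaster Tools HTML Improvements report is known to refresh slowly, so a stale duplicate-title warning can linger well after the redirect is live. For reference, a minimal sketch of the redirect itself, assuming Apache and an .htaccess file (URLs taken from the question):

```apache
# .htaccess: send the shorter duplicate URL to the canonical one (301 = permanent)
Redirect 301 /Main-category/product-page.html http://www.mysite.com/Main-category/SubCatagory/product-page.html
```

Adding a rel=canonical tag on the surviving page is a common belt-and-braces addition while waiting for the report to catch up.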
Content with changing URLs and duplicate content
Hi everyone, I have a question regarding content (user reviews) that changes URL all the time. We get a lot of reviews from users who have been dining at our partner restaurants, which get posted on our site under (new) "reviews". My worry, however, is that the URL for these reviews is changing all the time. The reason for this is that they start on page 1 and then get pushed down to page 2, and so on, when new reviews come in. http://www.r2n.dk/restaurant-anmeldelser I'm guessing that this could cause serious indexing problems? I can see in Google that some reviews are indexed multiple times with different URLs, and some are not indexed at all. We furthermore have the specific reviews under each restaurant profile. I'm not sure if this could be considered duplicate content? Maybe we should tell Google not to index the "new reviews" section by using robots.txt. We don't get much traffic on these URLs anyway, and all reviews are still under each restaurant profile. Or maybe the canonical tag can be used? I look forward to your input. Cheers, Christian
On-Page Optimization | Christian_T
-
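If the robots.txt route mentioned in the question is chosen, a sketch is below (the path is from the question). Note the caveat: robots.txt only blocks crawling, so URLs that are already indexed can linger in results; a rel=canonical from each rolling "new reviews" page to the review's permanent home on the restaurant profile is generally the safer way to consolidate.

```
# robots.txt -- keep the rolling "new reviews" listing out of the crawl
User-agent: *
Disallow: /restaurant-anmeldelser
```

Either way, keeping one stable URL per review (the restaurant-profile copy) as the canonical version is the key point.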
Can I add a paragraph of blog content to a product page?
I have a great blog post about one of my products, which means there's some copy in the blog post that I would like on the product page. It would constitute perhaps 5% of the copy on the product page if I copied it.
On-Page Optimization | Brocberry
-
Small Title Differences cause duplicate errors
My site titles have three features in them. Just one varies, and I am getting duplicate title errors. I'm thinking of moving the varying feature from the second position to the start of the title. Does anyone think this would help? Any other suggestions for a simple fix? Thank you. Handcrafter The titles look like this: Green Measuring Cups|Pewter Post|By JohnMiller Green Measuring Cups|Cherry Post|By JohnMiller Green Measuring Cups|Pewter Strip|By JohnMiller
On-Page Optimization | stephenfishman
-
Duplicate page
Just getting started, and I had a question regarding one of the reports. It's telling me that I have duplicate pages, but I'm not sure how to resolve that.
On-Page Optimization | KeylimeSocial
-
Best Way to check for duplicate pages
With Google's updates, we know they want to clean out duplicate content. I have been seeing the same content spit out, even word for word, on different sites. Anyway, how do you experienced SEO people test for duplicates on your own site as well as other sites? The only thing I can come up with is paying Copyscape 5 cents a test. There have to be other ways. Advice?
On-Page Optimization | joemas99
-
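For checking your own site, exact or near-exact duplicates can be caught without a paid service by hashing the visible text of each page. A minimal sketch (offline here, with the fetched HTML passed in as a dict; in practice you'd crawl the pages first, and the crude regex tag-stripping would be replaced by a proper HTML parser):

```python
import hashlib
import re
from collections import defaultdict

def find_duplicates(pages):
    """pages: {url: html}. Returns clusters of URLs whose visible text
    is identical after stripping tags and normalizing whitespace/case."""
    clusters = defaultdict(list)
    for url, html in pages.items():
        text = re.sub(r"<[^>]+>", " ", html)   # crude tag strip
        text = " ".join(text.split()).lower()  # collapse whitespace, lowercase
        digest = hashlib.sha256(text.encode()).hexdigest()
        clusters[digest].append(url)
    return [urls for urls in clusters.values() if len(urls) > 1]

pages = {
    "http://example.com/a": "<p>Same   body text</p>",
    "http://example.com/b": "<div>same body TEXT</div>",
    "http://example.com/c": "<p>Different text</p>",
}
print(find_duplicates(pages))
```

Exact hashing only catches word-for-word copies; for the near-duplicates Google also dislikes, a similarity measure (e.g. shingling) would be needed instead. Checking *other* sites' copies of your content is where services like Copyscape still earn their fee.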
Duplicate content issue with dynamically generated URLs
Hi, for those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented, and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are pages for each individual listing. What is my solution for this? Nofollow these pages? Block them via robots.txt?
On-Page Optimization | multilang
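One common pattern for faceted listings like this is to let the filtered, parameterized URLs exist for users but have each declare the clean base page as canonical, so they consolidate instead of indexing separately. A sketch, assuming the listing template's `<head>` can be edited (URL from the question); whether to canonicalize paginated pages to the base page is a judgment call, and rel=canonical is generally preferable to robots.txt here because blocked pages can't pass their signals anywhere:

```html
<!-- In the <head> of every filtered version of listing.html
     (e.g. listing.html?pageNo=1&selCity=...), point back to the base page: -->
<link rel="canonical" href="http://www.selectcaribbean.com/listing.html" />
```

The individual listing pages stay indexable as the real content; only the on-the-fly filter combinations get consolidated.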