Best Way to Fix Dupe Content
-
We have discovered that some of our internal pages may be causing a duplicate content problem. Does anyone have a recommendation on the best way to fix this?
Main page:
Dupe pages:
http://bit.ly/116uzXe
http://bit.ly/WxyyoW
http://bit.ly/TNxPVm
http://bit.ly/VMnbuY

Thanks in advance!
-
You could also make use of the robots.txt file to resolve it:
Disallow: /*?sort=
NOTE: Be very careful when blocking search engines. Test and test again!
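For illustration, here is a minimal robots.txt sketch built around the rule above; the ?sort= parameter is an assumption about how the duplicate URLs are generated, so swap in whichever query parameters your site actually appends:

```
User-agent: *
# Block crawling of any URL that contains the ?sort= query parameter
# (assumed here to be what generates the duplicate pages - verify first)
Disallow: /*?sort=
```

Keep in mind that robots.txt only stops crawling; URLs that are already indexed or linked to can still show up in results, which is one reason the canonical approach below is often preferred.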
-
Add canonical tags to the <head> of your pages:
This helps tell Google which version of your content is the original. You can also tell Google to ignore those query parameters in Google Webmaster Tools:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
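To illustrate, the tag sits in the <head> of each duplicate page and points at the main page; the URL below is only a placeholder for whatever your main page actually is:

```
<head>
  <!-- Tells search engines which URL is the preferred version of this content -->
  <link rel="canonical" href="http://www.example.com/main-page/" />
</head>
```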
Related Questions
-
Redirection loop. Best way to resolve...
Hi Guys, I got a warning on a crawl today: "Your page is redirecting to a page that is redirecting to a page that is redirecting to a page... and so on." In GWMT the preferred domain is set to www, and it is also set that way on the back end of my server. I also have an SSL certificate deployed, and in my .htaccess a rule is added to ensure all pages go to SSL. Do any of you have advice on the best route to take, or should I "IGNORE" this warning, as all other aspects are scoring 95%+? Thanks in advance, Daren
Technical SEO | Daren-WebSupportLab
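Since the question above mentions an .htaccess rule forcing SSL, here is a minimal loop-safe sketch, assuming Apache with mod_rewrite and SSL terminating on the same server; it sends any non-HTTPS or non-www request to the https://www version in a single 301 hop instead of chaining redirects:

```
RewriteEngine On
# Redirect only when the request is not already on HTTPS or not already on www
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
# Capture the hostname without any leading www. into %1
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```
-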
Best Way to Handle Near-Duplicate Content?
Hello Dear MOZers, I'm having duplicate content issues and I'd like some opinions on how best to deal with this problem.

Background: I run a website for a cosmetic surgeon in which the most valuable content area is the section of before/after photos of our patients. We have 200+ pages (one patient per page) and each page has a 'description' block of text and a handful of before and after photos. Photos are labeled with very similar labels patient-to-patient ("before surgery", "after surgery", "during surgery", etc.). Currently, each page has a unique rel=canonical tag, but Moz Crawl Diagnostics has found these pages to be duplicate content of each other. For example, using a 'similar page checker', two of these pages were found to be 97% similar.

As far as I understand, there are a few ways to deal with this, and I'd like to get your opinions on the best course:

1. Add 150+ more words to each description text block
2. Prevent indexing of patient pages with robots.txt
3. Set the rel=canonical for each patient page to the main gallery page

Any other options or suggestions? Please keep in mind that this is our most valuable content, so I would be reluctant to make major structural changes, or changes that would result in any decrease in traffic to these pages. Thank you folks, Ethan
Technical SEO | BernsteinMedicalNYC
-
No JavaScript, No Content...?
Hello Mozers! 🙂 I have a question for you: I am working on a site, and while doing an audit I disabled JavaScript via the Web Developer plugin for Chrome. The result is that instead of seeing the page content, I see the typical "loading circle" but nothing else. I imagine this is not a good thing, but what does it imply technically from a crawler's perspective? Thanks
Technical SEO | Pipistrella
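One quick way to see what a non-JavaScript client receives is to fetch the raw HTML and check whether the visible content is actually in it; this is just a sketch with a placeholder URL and phrase:

```
# Fetch the served HTML without executing any JavaScript,
# then check whether a phrase that should be on the page appears in it
curl -s https://www.example.com/some-page/ | grep -i "phrase from the page"
```

If the phrase is missing, crawlers that do not execute JavaScript will see an effectively empty page.
-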
Fixing up a poorly performing eCommerce site with content? First things first...
I had a small site that was performing well enough for me for 6 years, a nice source of extra family income. It was a very out-of-date platform (1990s) and I moved it to BigCommerce in December. Since then, visits and sales have just plummeted. It has died and I am sad.

I want to remain positive. It's a viable niche product line, I have tons of quality inventory, and my site platform is now current. I am a good writer and am willing to add content if it will help. From what I can determine, apart from technical issues I may not know about, I think I suffer from THIN CONTENT. I want to pretend I'm opening a new business and start over in today's environment.

I have had bad experiences with two SEO consultants (one ripped me off financially, the other gave me advice that hurt me), so I have no choice but to educate myself. I got started in ecommerce back in 1998 with the help of many wonderful, kind people on forums, so I'm turning to this resource once again. What would you do first if you were me, especially to get some revenue flowing ASAP? I need help. I need to know if there is hope or if I should liquidate at the nearest swap meet 😉 Thanks! decorativedishes.net
Technical SEO | ddktt
-
Duplicate content - Quickest way to recover?
We've recently been approached by a new client who's had a 60%+ drop in organic traffic. One of the major issues we found was around 60k+ pages of content duplicated across 3 separate domains. After much discussion and negotiation with them, we 301'd all the pages across to the best domain, but traffic is increasing very slowly. Given that the old sites are 60k+ pages each and don't get crawled very often, is it best to notify Google of the domain change through Google Webmaster Tools to try and give Google a 'nudge' to deindex the old pages and recover from the traffic loss as quickly and as fully as possible?
Technical SEO | Nathan.Smith
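For reference on the 301 step described above, here is a minimal .htaccess sketch for sending an entire old domain to the equivalent URLs on the domain that was kept, assuming Apache with mod_rewrite; both domain names are placeholders:

```
RewriteEngine On
# Permanently redirect every URL on the old domain to the same path
# on the domain that was kept (both domain names are placeholders)
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.kept-domain.com/$1 [R=301,L]
```
-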
Squarespace Duplicate Content Issues
My site is built on Squarespace, and when I ran the campaign in SEOmoz it came up with all these errors saying duplicate content and duplicate page title for my blog portion. I've heard that canonical tags help with this, but with Squarespace it's hard to add code at the page level; only site-wide is possible. I was curious if there's someone experienced in Squarespace and SEO out there who can give some suggestions on how to resolve this problem? Thanks
Technical SEO | cmjolley
-
I am trying to correct an error report of duplicate page content, but in over 100 blogs I am unable to find the page which contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page containing duplicate content, not the page it duplicates. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | wianno168
-
Duplicate Content Issue
Very strange issue I noticed today. In my SEOmoz campaigns I noticed thousands of warnings and errors! Any page on my website ending in .php can be duplicated by adding anything you want to the end of the URL, which seems to be causing these issues.

Ex:
Normal URL - www.example.com/testing.php
Duplicate URL - www.example.com/testing.php/helloworld

The duplicate URL displays the page without the images, but all the text and information is present, duplicating the normal page. I also found that many of my PDFs seem to be getting duplicated, buried in directories after directories which I never put in place.

Ex: www.example.com/catalog/pdfs/testing.pdf/pdfs/another.pdf/pdfs/more.pdfs/pdfs/ ...

...when the PDFs are only located in a single pdfs directory! I am very confused about how to fix this problem. Maybe with some sort of redirect?
Technical SEO | hfranz
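For the .php behaviour described in the question above, one possible fix (a sketch, assuming an Apache server that allows .htaccess overrides) is to stop Apache from serving a script when extra path segments are appended after the file name, so the appended-path duplicates return 404 instead of rendering:

```
# Do not accept trailing path info after a real file,
# e.g. /testing.php/helloworld will 404 instead of duplicating /testing.php
AcceptPathInfo Off
```

A rel=canonical tag on each page, as discussed in the answers above, would also consolidate any variants that have already been indexed.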