Best way to remove duplicate content with categories?
-
I have duplicate content for all of the products I sell on my website due to categories and subcategories.
Ex: http://www.shopgearinc.com/products/product/stockfeeder-af38.php
http://www.shopgearinc.com/products/co-matic-power-feeders/stockfeeder-af38.php
http://www.shopgearinc.com/products/co-matic-power-feeders/heavy-duty-feeders/stockfeeder-af38.php
Above are 3 URLs with the same title and content. I use a third-party developer's backend system, so setting up canonicalization seems difficult as I don't have full access.
What is the best way to get rid of this duplicate content? Can I do it through Webmaster Tools, or should I pay the developer to set up canonicalization or a 301 redirect? Any suggestions?
Thanks
-
I agree with Nathan on the canonical tag. You could also work with your developer on the back-end system's configuration and see if there's a way for the application to always produce a consistent product-level URL. It's the same product being exposed in multiple categories, or at multiple levels of categories; there's no reason for it to have several unique product URLs. See if you can work with them to get rid of the root cause.
If not, the canonical tag is definitely the way to go. Unfortunately, there isn't really an alternative, such as robots.txt exclusions, that should be considered here.
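If the developer can give you even a small hook in the product template's head section, the tag itself is simple. A minimal PHP sketch, assuming the /products/product/ version is the one you keep and that the backend knows the product slug (both are assumptions, not details of your actual system):

<?php
// Sketch only: one preferred URL per product, printed as the canonical
// tag on every duplicate version of the page ($productSlug is hypothetical).
$productSlug  = 'stockfeeder-af38';
$canonicalUrl = 'http://www.shopgearinc.com/products/product/' . $productSlug . '.php';
echo '<link rel="canonical" href="' . htmlspecialchars($canonicalUrl) . '">';
?>

The same tag goes on all three versions of the page, each pointing at that single preferred URL.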
-
We had the same problem with our site. Identify the best-performing or preferred page of the 3 versions and have your developer place the canonical there. Then the other two pages should 301 to that page.
When we did this on our site, it cleaned things up and helped in rankings.
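If the backend exposes the request path, the 301 piece can be a small check before any output is sent. A rough PHP sketch, with the URL patterns assumed from the examples in the question rather than taken from the actual setup:

<?php
// Sketch only: redirect category-nested product URLs to the preferred one.
// Assumes URLs look like /products/<category...>/<slug>.php
$requestUri    = $_SERVER['REQUEST_URI'];
$preferredBase = '/products/product/';

if (preg_match('#^/products/(?!product/).+/([^/]+\.php)$#', $requestUri, $m)) {
    header('Location: ' . $preferredBase . $m[1], true, 301); // permanent redirect
    exit;
}
?>

Run against the second and third example URLs above, this would send visitors and crawlers to the /products/product/ version while leaving that preferred URL itself untouched.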
Related Questions
-
Beta Site Removal best practices
Hi everyone. We are doing a CMS migration and site redesign with some structural changes. Our temporary beta site (one of the staging environments, and the only one not behind a firewall) started appearing in search. The site got indexed before we added robots.txt due to a dev error (at that time all pages were index,follow given the nature of the beta site; it is a final stage that mirrors the live site). As a remedy, we implemented robots.txt for the beta version as:
User-Agent: *
Disallow: /
We removed the beta from search for 90 days and also changed all pages to noindex/nofollow. Those blockers will be changed once the code for the beta gets pushed into production. However, we already have all links redirected (301) from the old site to the new one; this will go into effect once migration starts (we will go live with the completely redesigned site that is now in beta, in a few days). After that, the beta will be deleted completely and become a 404 or 410. So the question is: should we delete the beta site and simply return a 404/410 without any redirects (the site as is existed for only a few days)? What is the best thing to do? We don't want to hurt our SEO equity. Please let me know if you need more clarification. Thank you!
Intermediate & Advanced SEO | | bgvsiteadmin
-
Old Sub domain removal and deletion of content
There are two questions here. I have waited for over 2-3 weeks now and they are still not resolved.
1. An old sub-domain (blog.nirogam.com) is still indexed on Google, even though all of its pages have been redirected or 404'd to the main domain. There is no Webmaster Tools profile and no authority for this old sub-domain; hosting for it might still exist (the sub-domain itself has been deleted and does not exist - we own the main domain only). How do I de-index and remove these pages for good? (Around ~1,000 pages.) I am trying this public tool - any better approaches? Even after removing pages and submitting them through the tool, 600 pages are still indexed after 2-3 weeks!
2. We deleted a lot of thin-content/duplicate pages from the main domain (nirogam.com) in WordPress. All these pages are still in Google's index; they sit in the Trash folder now, and this is causing an increase in 404s in Webmaster Tools. I have served a 410 header (using a WordPress plugin) on all these pages, as they should not redirect to anything. However, Google does not always fully understand a 410, and the pages still show up in Webmaster Tools, as explained in this detailed post. All these pages are still indexed. How do I de-index these pages? Is there any other approach to stop the 404s and remove these pages for good?
Any feedback/approach will be highly appreciated.
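For reference, the 410 the plugin serves boils down to a status header sent before any output; a bare-bones PHP sketch, with a purely hypothetical list of removed paths:

<?php
// Hypothetical list of permanently removed paths.
$gonePaths = array('/old-thin-page/', '/duplicate-post/');

if (in_array($_SERVER['REQUEST_URI'], $gonePaths, true)) {
    http_response_code(410); // Gone: a stronger removal signal than 404
    exit('This page has been permanently removed.');
}
?>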
Intermediate & Advanced SEO | | pks3330
-
How to avoid duplicate content with e-commerce and multiple stores?
We are currently developing an e-commerce platform that will feed multiple stores. Each store will have its own domain and URL, but all stores will offer products that come from the same centralized database. That means all products will have the same image, description and title across all stores. What would be the best practice to avoid getting stores penalized for duplicate content?
Intermediate & Advanced SEO | | Agence_Bunji0
-
Best to Fix Duplicate Content Issues on Blog If URLs are Set to "No-Index"
Greetings Moz Community: I purchased a SEMrush subscription recently and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "no-index", these issues do not need to be corrected. My instinct would be to avoid any risk of duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "no-index", they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would flag these as errors if in fact they are not errors. So my question is, do we need to do anything with the error pages if they are already set to "no-index"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10
-
Big problem with duplicate page content
Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop made with Prestashop. The Moz crawl report shows that I have over 4000 duplicate page content issues; two weeks ago I had 1400. The majority of links that show duplicate content look like this:
http://www.sitename.com/category-name/filter1
http://www.sitename.com/category-name/filter1/filter2
Firstly, I thought the filters didn't work. But when I browse the site and test it, I see that the filters are working and generate links like this:
http://www.sitename.com/category-name#/filter1
http://www.sitename.com/category-name#/filter1/filter2
The links without the # do not work; they mess up the filters.
Why are the pages indexed without the #, thus generating duplicate content?
How can I fix these issues?
Thank you very much!
Intermediate & Advanced SEO | | ana_g
-
Best way to handle page filters and sorts
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets, and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets on the top and 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each of these as a separate URL. Right now we don't do anything special for Google, but I have noticed in the SERPs that sometimes, if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages both rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about Widgets. Ideally, though, I'd just want to rank for my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's maybe one or more of the following, but I'd love any advice: (1) put a rel canonical tag on all of the pages with parameters and point it to the "root"; (2) use the Google parameter tool and have it not crawl any URLs with my parameters; (3) put a meta robots noindex on the parameter pages. Thanks!
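To make options (1) and (3) concrete, a rough PHP sketch; the color and sort parameter names and the example.com URL are placeholders, not the real setup:

<?php
// Placeholder parameter names; the real filter/sort keys may differ.
$hasParams = isset($_GET['color']) || isset($_GET['sort']);

if ($hasParams) {
    // Option (1): canonical back to the clean "root" widgets page.
    echo '<link rel="canonical" href="http://www.example.com/widgets/">';

    // Option (3), as an alternative rather than in combination:
    // echo '<meta name="robots" content="noindex, follow">';
}
?>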
Intermediate & Advanced SEO | | jcgoodrich0
-
Duplicate content reported on WMT for 301 redirected content
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and the 301-redirected new URL as the sources of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the 301 from the old to the new URL. Question: Why is Google Webmaster Tools reporting duplicate content for these pages?
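As a side note, one quick way to double-check the redirect from code is to fetch only the response headers for an old URL in plain PHP (the URL here is a placeholder):

<?php
// Sketch: the first header line should read "301 Moved Permanently"
// if the redirect is set up correctly.
$headers = get_headers('http://www.example.com/old-page.php');
if ($headers !== false) {
    echo $headers[0];
}
?>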
Intermediate & Advanced SEO | | SEOAccount320
-
Duplicate content for swatches
My site is showing a lot of duplicate content on SEOmoz. I have discovered it is because the site has a lot of swatches (colors for laminate) within iframes. Those iframes all have the same content except for the actual swatch image and the title of the swatch. For example, these are two of the links that are showing up with duplicate content:
http://www.formica.com/en/home/dna.aspx?color=3691&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs=
http://www.formica.com/en/home/dna.aspx?color=204&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs=
I do want each individual swatch to show up in search results, and they currently do if you search for the exact swatch name. Is the fact that they all have duplicate content affecting my individual rankings and my domain authority? What can I do about it? I can't really afford to put unique content on each swatch page, so is there another way to get around it? Thanks!
Intermediate & Advanced SEO | | AlightAnalytics0