Similar category names result in similar URLs and duplicate anchor text
-
Hi all,
I'm working on an e-commerce website about car tuning and car parts.
There are main categories ( Aerodynamics, Power tuning, Interior, Wheels, Tires, etc. ), and the products are organized into sub-categories representing the product manufacturer, the car manufacturer, and the car model + modification. Unfortunately, this structure creates duplicate sub-category names. For example, we can have parts for the Audi A4 8K under both Aerodynamics and ABT, and at the same time power tuning from the same manufacturer for the same car, or sport brakes for the same car from different manufacturers.
Here is how some of the links look:
/alfa-romeo-147-c1070-en
/alfa-romeo-147-c234-en
/alfa-romeo-147-c399-en
These are completely different categories with the same anchor text and almost identical URLs ( the only difference in the URLs is the category ID ).
Could this be affecting the site's indexation, and what would be a better way to build the internal link structure?
-
Hi Aran,
thanks for the fast response.
Here's more detailed information about the sub-categories:
1st Category
Performance > Chip Tuning & Power Box > Power Box - Diesel Engines > Alfa Romeo 147
URL - /alfa-romeo-147-c1070-en
2nd Category
Aerodynamics > Rieger Tuning > Alfa Romeo 147
URL - /alfa-romeo-147-c234-en
3rd Category
Lighting > Tail Lights > Alfa Romeo 147
URL - /alfa-romeo-147-c399-en
The URL is made up of the sub-category name, its category ID, and the language.
I was thinking of changing only the URLs, but they would become much, much longer, and that still would not solve the duplicate anchor text and keyword cannibalisation problems...
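One option I am considering is to build both the slug and the link text from the immediate parent category plus the car model, rather than the whole breadcrumb, so the URLs stay reasonably short but unique. Here is a rough sketch of the idea, purely for illustration ( Python only for readability, this is not our platform's code, and all of the names below are made up ):

```python
# Rough illustration only: derive a unique slug and anchor text from the
# immediate parent category plus the car model, falling back to the full
# breadcrumb path on a collision. All names here are invented for the example.

import re


def slugify(text: str) -> str:
    """Lowercase the text and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")


def anchor_text(path: list[str]) -> str:
    """Link text such as 'Alfa Romeo 147 Tail Lights' instead of a bare 'Alfa Romeo 147'."""
    return f"{path[-1]} {path[-2]}"


def category_slug(path: list[str], lang: str, used: set[str]) -> str:
    """path is the breadcrumb, e.g. ['Lighting', 'Tail Lights', 'Alfa Romeo 147']."""
    candidate = slugify(f"{path[-2]} {path[-1]}") + f"-{lang}"
    if candidate in used:  # same parent + model combination already taken
        candidate = slugify(" ".join(path)) + f"-{lang}"
    used.add(candidate)
    return candidate


used: set[str] = set()
paths = [
    ["Performance", "Chip Tuning & Power Box", "Power Box - Diesel Engines", "Alfa Romeo 147"],
    ["Aerodynamics", "Rieger Tuning", "Alfa Romeo 147"],
    ["Lighting", "Tail Lights", "Alfa Romeo 147"],
]
for p in paths:
    print(anchor_text(p), "->", "/" + category_slug(p, "en", used))
# Alfa Romeo 147 Power Box - Diesel Engines -> /power-box-diesel-engines-alfa-romeo-147-en
# Alfa Romeo 147 Rieger Tuning -> /rieger-tuning-alfa-romeo-147-en
# Alfa Romeo 147 Tail Lights -> /tail-lights-alfa-romeo-147-en
```

The fallback to the full breadcrumb would only kick in if two categories produced the same parent + model combination, so most URLs would not get any longer than the examples above.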
-
You'll probably find that you get keyword cannibalisation, with multiple pages all jockeying for the same key phrases.
It's possibly a big and risky job, but could you not rewrite the URLs to include the category name rather than the category ID?
/alfa-romeo-147-sport-brakes-en
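If you do go down that route, it's also worth planning the redirect side, so every old ID-based URL 301s to its new descriptive version and keeps whatever links and rankings it already has. A very rough sketch of that mapping ( Python purely for illustration; the new slugs are invented examples, and the actual 301s would be issued by whatever serves the site ):

```python
# Rough illustration of a one-to-one redirect map from the old ID-based URLs
# to new descriptive ones. The new slugs are invented examples; whatever
# actually serves the site would use a table like this to answer each old
# path with a 301 and the new Location.

from typing import Optional

OLD_TO_NEW = {
    "/alfa-romeo-147-c1070-en": "/power-box-diesel-engines-alfa-romeo-147-en",
    "/alfa-romeo-147-c234-en": "/rieger-tuning-alfa-romeo-147-en",
    "/alfa-romeo-147-c399-en": "/tail-lights-alfa-romeo-147-en",
}


def redirect_target(old_path: str) -> Optional[str]:
    """Return the new URL for an old ID-based path, or None if it is not mapped."""
    return OLD_TO_NEW.get(old_path)


assert redirect_target("/alfa-romeo-147-c234-en") == "/rieger-tuning-alfa-romeo-147-en"
```

Keeping the map one-to-one, rather than pointing everything at a parent category, preserves as much of the existing link equity as possible.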
Without seeing the site and checking out the current structure, it's hard to say exactly how I would structure it. Can you post a link?
Cheers
Aran