Duplicate Content Dilemma for Category and Brand Pages
-
Hi,
I have a online shop with categories such as:
- Trousers
- Shirts
- Shoes
- etc.
But now I'm having a problem with further development.
I'd like to introduce brand pages. In this case I would create new categories for Brand 1, Brand 2, etc. The text on the category and brand pages would be unique, but there will be an overlap in products. How do I deal with this from a duplicate content perspective?
I appreciate your suggestions.
Best, Robin
-
Wow. I did some research. I stand corrected. Thanks, Linda.
As far as your categories go, you could have:
www.domain.com/computers/notebooks/apple-notebooks/
and
www.domain.com/apple-products/
On your category pages, I'd suggest adding unique content at the bottom of the category pages. A paragraph above the fold would help for ranking purposes, but may detract from usability and conversions.
-
Thank you for the time you've invested in this answer. It gives me a good sense of what to do.
I like option 2 best, but I'm not sure whether I've got it right. What do you think of the following scenario (taking the Apple example you provided)?
- I have a category page with notebooks. The title, description and text on this category is focusing on notebooks in general. The products include Dell, HP and Apple.
=> This is basically the setup I have in my shop right now.
- Now I want to create a brand page for Apple. There the title, description and text are focused on Apple in general. The products include Apple notebooks, iPhones, iPads, etc.
Now here's the point: the title, description and text for the notebook category and the Apple brand page will be different (unique content). But products are part of the content too, aren't they? And since there will be an overlap in products, this would result in duplicate content for the featured products.
But I want both pages to rank. One for 'notebook' and the other one for 'apple'.
Is that possible, or are partially overlapping products in those 2 categories a dealbreaker for my SEO?
-
According to Moz: "Another option for dealing with duplicate content is to utilize the rel=canonical tag. The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, and often takes much less development time to implement." http://moz.com/learn/seo/canonicalization
Why do you think it does not pass ranking power?
-
This is a difficult question. I would agree with patrick_g that canonicals are one way to handle duplicate content, but canonicals don't pass link juice to the parent, unless it's through a link.
The canonical tag only tells Google which page to index. It does not transfer link juice the way a 301 redirect does. Read up on this.
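For reference, a canonical tag is a single line in the head of the duplicate page pointing at the version you want indexed. The URLs below are just placeholders based on the notebook example in this thread:

```html
<!-- Placed in the <head> of the overlapping brand page. -->
<!-- Suggests to search engines that the category URL is the
     preferred version to index. -->
<link rel="canonical" href="http://www.domain.com/computers/notebooks/" />
```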
Here are some good choices:
1. If the brands are only for user experience purposes, you could make the pages noindex, follow. This would eliminate the duplicate content issue, and the brand pages could serve as a link juice hub. They would be kept out of Google's index but would still pass link juice.
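In case it helps, option 1 is a one-line meta robots tag in the head of each brand page (a minimal sketch; the tag itself is standard):

```html
<!-- Keeps the brand page out of the index while still letting
     crawlers follow its links and pass link juice through them. -->
<meta name="robots" content="noindex, follow" />
```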
2. Create unique content for the brand pages, and give them a title tag and content that differ from the competing page. For example, if you already have an "Apple" page, make the new page "Certified Apple Products" (or target some other keyword).
3. This one requires some programming skill and is a little controversial. Put the new pages in a parent folder "/hide/" (don't actually use the word "hide") and disallow that parent folder in robots.txt. Any links on your site to these pages would pass link juice to them, and that juice would be lost, which could be a significant drain. Here's the controversial part: put the links to those pages in an iframe, and disallow the iframe folder in robots.txt as well. This prevents bots from crawling those links and passing link juice.
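A rough sketch of option 3's robots.txt, assuming the hidden pages live in /brands-internal/ and the iframe document with the links lives in /widgets/ (both folder names are made up for illustration):

```text
# robots.txt — keep crawlers out of the hidden brand pages
# and out of the iframe document that links to them.
User-agent: *
Disallow: /brands-internal/
Disallow: /widgets/
```

The page itself would then embed the links via something like `<iframe src="/widgets/brand-links.html"></iframe>`, so users see them but bots never crawl the linking document.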
-
Hello soralsokal,
I don't have as many products as you do, but generally I prefer to use the rel=canonical tag to push the link juice to the one category page I'm trying to rank well. So I still build out the various pages I want, but I don't expect the duplicate content to rank.
I suppose you could also use your robots.txt file to block them from search engines. I've done that with a blog that placed the same posts into all sorts of different categories, thus creating duplicate content.