How do I ensure that colour variant products aren't flagged as duplicate content?
-
I have a site with 12 colour variants of one style. How do I ensure that these are not flagged as duplicate content, as they currently have been?
-
If for some reason you need all the colors to have their own page (as opposed to putting all of the colors on one page, as Alick300 helpfully suggested), you can use rel=canonical links to choose one page to be the indexed one (the most popular color, perhaps).
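A minimal sketch (the product URLs here are hypothetical): every colour variant page carries the same canonical tag in its <head>, pointing at the one variant you want indexed.

<!-- on /dress?color=red, /dress?color=blue, and the other variants -->
<!-- each variant page declares the same canonical URL -->
<link rel="canonical" href="https://www.example.com/dress?color=black" />

Google then consolidates the duplicate signals onto the canonical page instead of flagging each variant.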
-
Hi Ashcastle,
The best option is not to create a different page for each variant; you can show all of the colours on one page instead. I do the same on my own website.
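To illustrate, here is a rough, hypothetical sketch of a single product page where the colour is an option rather than a separate URL (assuming the image is swapped client-side):

<!-- one indexable URL for the style; colours are options, not pages -->
<h1>Classic Tee</h1>
<img id="product-image" src="/images/classic-tee-black.jpg" alt="Classic Tee, black">
<label for="colour">Colour</label>
<select id="colour">
  <option value="black">Black</option>
  <option value="red">Red</option>
  <!-- ...the remaining colours... -->
</select>
<script>
  // Swap the product image when the colour changes; no new URL is created,
  // so there are no near-duplicate variant pages for crawlers to flag.
  document.getElementById('colour').addEventListener('change', function () {
    document.getElementById('product-image').src =
      '/images/classic-tee-' + this.value + '.jpg';
  });
</script>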
Hope this helps.
Thanks
-
Related Questions
-
Regarding out of stock product
Hi, my e-commerce site is https://www.giftalove.com/. For SEO, what is the best way to handle out-of-stock products?
Technical SEO | Packersmove
-
Is content in a widget bar less 'SEO important' than main content?
Hi, I wonder if content in a widget bar is less 'SEO important' than main content. I mean, is it better to place content and links in the main content area than in a WordPress widget bar? What are the pros and cons? Thanks!
Technical SEO | Dreamrealemedia
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran fetch and render through Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block its maps from its own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup
-
What was the Google 'update' on 31st March?
Hi all. I looked back and saw that there was an update shown in 'Search Analytics' in Webmaster Tools a few weeks before the mobile algorithm update. I haven't been able to find any mention of it or what it did, so I thought I'd check in here. P.S. Also, this is a 90-day stretch and shows that our rankings have taken a hit since the mobile algorithm update. Interesting stuff (see image below).
Technical SEO | RobFD
-
Why can't I rank for my brand name?
We are soon to launch a new company in New Zealand called Zing. I have been tasked with the challenge of ranking as highly as possible for anything to do with Zing before launch in February. Zing is in the financial industry, so my colleagues thought it would be a good idea to make a small blog (very small, with literally one post) that reviewed other financial lenders. This site stayed online for a couple of months before it was replaced. The official website is still yet to launch, so as an in-between, I asked that we make a splash page with a small competition on it (see zing.co.nz). I would have preferred there were more keywords on the website, but this was not achieved. I am still pushing for this and am hoping to get a few pages on there in the near future. Instead of getting the keywords on the splash page, I was given permission to start a subdomain (blog.zing.co.nz). This contains many more common search terms, and although it's not quite doing the job I would like, the rankings for Zing have started to increase. At the moment, we are ranking number 1 for a few brand-related keywords such as "zing loans". This is why I feel something is wrong: we rank number 1 for over 10 similar terms, yet we DO NOT EVEN APPEAR in the search engines at all for "Zing". Have we been penalized? Do you have any suggestions at all? Do you think we could have been penalized for the first, average blog? Maybe I messed up the swap-over? Any help would be hugely appreciated!
Technical SEO | Startupfactory
-
Dealing with duplicate content
A manufacturer's product website (product.com) has an associated direct online store (buyproduct.com). The online store has much duplicate content: product detail pages and key article pages, such as technical/scientific data, are duplicated on both sites. What are some ways to lessen the duplicate content here? product.com ranks #1 for several key keywords, so any penalties can't be too bad, and buyproduct.com is moving its way up the SERPs for similar terms. Ideally I'd like to combine the sites into one, but that's not in the budget right away. Any thoughts?
Technical SEO | Timmmmy
-
According to one of my PRO campaigns, I have 250+ pages with duplicate content - could my empty 'tag' pages be to blame?
Like I said, one of my Moz reports is showing 250+ pages with duplicate content. Should I just delete the tag pages? Is that worth my time? And how do I alert SEOmoz that the changes have been made, so that they show up in my next report?
Technical SEO | TylerAbernethy
-
Sitemap for pages that aren't on menus
I have a site with a large number of pages, about 3,000, that have static URLs but no internal links and no connection to the menu. The pages are pulled up through a user-initiated selection process that builds the URL as the user makes their selections, but, as I said, the pages already exist with static URLs. The question: should the sitemap for this site include these 3,000 static URLs? There is very little opportunity to optimize the pages in any serious way, if you feel that makes a difference. There is also no chance that a crawler will find its way to these pages through the natural flow of the site. There isn't a single link to any of these pages anywhere on the site. Help?
Technical SEO | RockitSEO