Question about duplicate product listings on our site.
-
We list products on our site and suspect we've been hit by Panda because we are duplicating listings across our own site. It's not intentional; we simply have multiple pages listing the same content because products fall into multiple categories.
Has anyone else had the same issue, and if so, how did you deal with it? Have you seen a change in results/rankings due to the changes you made?
-
Hello Nick,
It sounds like you're talking about category pages that link to product pages. I do recommend putting unique content on category pages, as they target different phases of a buying cycle (and keywords). For example...
"Blue Widgets" is a top-level category search.
"Blue Widgets for Men" is a category search.
"Men's V9 Blue Widget" (note the singular use of "widget") is most likely a product search.
All three page types should have their own unique content, using keywords specific to that stage of the buying cycle and displaying content that is useful at that stage. For example...
"Blue Widgets" ---> Explain how to choose a blue widget. Which sub-category should they check out? Help them decide where to go next. Help them narrow their search.
"Men's Blue Widgets" ---> Explain the difference between brands and perhaps leading models. What are their needs and budget? Which are the high-end, top-of-the-line men's blue widgets, and which are affordable and reliable options, or entry-level widgets? Educate them.
"Men's V9 Blue Widget" ---> Explain the product. What are the features that either set it apart or make it a good value for the money?
The linking strategy can work this way as well. Naturally, you're going to be passing PageRank from category pages into sub-categories and into products; this happens on most sites because that is the natural flow of a purchasing path and a logical taxonomy. However, you should also be passing PageRank back up if you want to help category pages rank. Enhanced product descriptions give you an opportunity to do this.
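For example, here's a minimal sketch of what that upward linking could look like inside an enhanced product description (the URLs and anchor text are hypothetical, just to illustrate the idea):

<!-- Hypothetical product description on the "Men's V9 Blue Widget" page -->
<p>The Men's V9 is our most popular pick among
  <a href="/blue-widgets/mens/">men's blue widgets</a>,
  and a standout in our full range of
  <a href="/blue-widgets/">blue widgets</a>.</p>

Contextual links like these pass a little equity back up to the category pages while also helping shoppers step back to an earlier stage of the buying cycle if they need to.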
Good luck!
-
Thanks for the detailed response, Tom.
We have, for example:
A = a listings page with a brief summary of all products,
linking to B, C, D, E, F, G, etc., which are all specific pages for those products.
Pages B-G rank OK for non-competitive terms on the page, but page A doesn't rank so well.
The problem with noindexing A is that I think the other pages would get lost and have importance taken away from them.
Canonical tags from the second and subsequent pages of A's listings, yes - but what I was also thinking of doing is putting some unique content on page A to make it more unique. Any thoughts on that?
-
Hi there
There are a few options here, all of which revolve around making sure only one version of your product listings is set to be indexed, so that you can promote that version via SEO.
The first solution would be to 301 redirect any duplicates to one version of your product listings. This is the quickest and most absolute solution to duplicate listings, as it ensures only one version of them remains live. However, depending on your CMS, this might not be possible.
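If your site runs on Apache and you can edit your .htaccess file (an assumption - your setup may differ), a minimal sketch of those redirects could look like this, with hypothetical placeholder paths:

# Hypothetical duplicate paths, each 301-redirected to the preferred version
Redirect 301 /category-2/product-page-version-1 http://www.example.com/category-1/product-page-version-1
Redirect 301 /category-3/product-page-version-1 http://www.example.com/category-1/product-page-version-1

Each duplicate URL gets its own rule pointing at the single version you want to keep live.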
The second solution would be to add robots instructions to the duplicate listings that tell Google not to index those pages. You can do this either by adding code to the page itself or by using a robots.txt file. The code you'd need to add within the <head> tag on the page is <meta name="robots" content="noindex"> - however, if your CMS is producing duplicate URLs but only using one physical page, you may want to avoid this, as it could block all versions of the page. In that case, you could add each duplicate URL to your robots.txt file, specifying that you'd like to block them from being crawled, such as:
User-agent: *
Disallow: /example-page-1
Disallow: /example-page-2
And so on. I like this robots.txt tutorial if you want to learn more about how you can do this.
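To illustrate the on-page option mentioned above, here's a minimal sketch of where that meta robots tag would sit on a duplicate page (the title is just a placeholder):

<head>
  <title>Example Page 1</title>
  <!-- Tells search engines not to index this duplicate page -->
  <meta name="robots" content="noindex">
</head>

One caveat: for Google to see this tag, the page must not also be blocked in robots.txt - a blocked page never gets crawled, so the noindex instruction would never be read.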
Finally, you could add canonical tags to each page to specify which version of the URL is the one you want to rank and be indexed, which should prevent the others from being considered as well. For example, for a product listing page like http://www.example.com/category-1/product-page-version-1 - if you want that page to be indexed, you will want to add a canonical tag in the <head> of the code that reads: <link rel="canonical" href="http://www.example.com/category-1/product-page-version-1" />
Now, if that is the only physical web page that exists and your CMS generates additional duplicate URLs as it works, you will only need to add that canonical once. However, if the other web pages actually exist in your CMS and on your server, you will need to add the same tag to each of the duplicate pages. This basically tells Google and Bing: "Hey, this page is a duplicate - ignore it and rank the canonical version instead". Moz has a great guide on canonical usage here.
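As a sketch of that, assuming a duplicate of the product page also exists at a hypothetical second category path, both pages would carry exactly the same canonical tag, pointing at the one version you want indexed:

<!-- On http://www.example.com/category-1/product-page-version-1 (the preferred version) -->
<link rel="canonical" href="http://www.example.com/category-1/product-page-version-1" />

<!-- On http://www.example.com/category-2/product-page-version-1 (a hypothetical duplicate) -->
<link rel="canonical" href="http://www.example.com/category-1/product-page-version-1" />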
Dr. Pete also wrote a guide on canonical usage and how you can use it for products that are pretty much the same but have slight differences, such as sizes, colour, etc.
I hope this helps answer your question. Remember, the key here is to instruct Google that you don't want it to index the duplicates and that you don't have them there to manipulate rankings. You're trying to have only one version indexed and, where others exist outside of redirects, you've instructed which version should be kept and which should be ignored.