Indexing an e-commerce site
-
Hi all,
My client runs babyblingstreet.com. She sells baby and toddler clothing. A lot of the links on her site lead to the same products. For instance, if you go to "What's New" you can find those same products in, say, her "Sale Items" category.
The real problem with this: say my client sells a green dress and someone accesses it through the "Baby and Toddler Dresses" category, and that URL has 10 links pointing to it. Now someone else accesses the same green dress through the "What's New" category, and that URL also has 10 links pointing to it. Instead of 20 links pointing to one URL for the green dress, I now have 10 links pointing to one URL and 10 pointing to another, even though both URLs feature the exact same green dress.
In this example I would want the green dress URL in the "Baby and Toddler Dresses" section to be the canonical URL. That means I would have to add a canonical tag to the green dress URL in the "What's New" category and, say, the "Sale Items" category as well. This could get very tedious if my client has 200+ products, so I am wondering: do I have to place a canonical tag on every URL that displays the green dress?
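To make that concrete, here is roughly the tag I'd be adding to the head of each duplicate page. The paths are made up for illustration, since I haven't settled on the site's final URL structure:

<link rel="canonical" href="http://www.babyblingstreet.com/baby-toddler-dresses/green-dress" />

That same tag would sit on the "What's New" and "Sale Items" versions of the green dress page, pointing back to the one version I want indexed.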
More importantly, I would like to know other people's strategies for indexing e-commerce sites that have the same product featured in multiple categories throughout the site.
I hope this makes sense. Thanks for your time.
-
I think you're right. Again, thanks for the well-informed response. I will take a lot of what you have just said into consideration. I also side with you about the duplicate issues. I may be a bit cynical here, but I have always found it hard to believe that Google will ever give us the complete truth.
-
This is one area where I am 99.99999% confident saying that Google's past statements are incorrect and even irresponsible. Panda is, in many ways, an assault on thin content, and duplicates are worse than thin. I've seen many large-scale sites take massive hits from duplicates (as much as 80% traffic loss).
The "myth" is that duplicate content causes a Capital-P Penalty, but Google uses a very narrow and self-serving definition of that term. Duplicate content does not cause a manual penalty and they probably don't consider Panda to be a penalty internally at Google. However, the consequences are very severe.
Even before Panda, I saw case studies where reducing duplicate content greatly improved rankings. I had a client whose "product" pages (it was an event site) were being filtered out due to massive duplication. Once we fixed the problem, their search traffic tripled over the course of 3 months. This was well before May Day and Panda (2007, if memory serves). Today, it's 10X worse.
When you get into e-commerce, the problem is almost inevitable and needs to be managed. Now, does that mean that you're currently facing ranking issues, Panda, etc.? No, not necessarily. You have less than 2K indexed pages, which is hardly excessive. If each product page has one duplicate, and you know that can't spin out of control, the consequences are limited. Still, you're diluting your ranking ability to some extent. I think it's well worth addressing the problem and being proactive.
-
Thank you for your well-informed response, Peter. You are right: though it is tedious, I still have to do it.
Regarding product duplicates severely harming my client's ability to rank, I am not quite sure that's true. Google has written extensive material about duplicate content and how it's a myth that it affects ranking. I am not sure how truthful that is, but here's a link to one of those articles:
http://www.spottedpanda.com/2011/seo-news/confirmed-seo-facts-matt-cutts/
As for not seeing the duplicate product URLs in action, that's simply because the site is still at the ground floor. I inherited this project about a month and a half ago from a design company that only built her a beautiful site; they did not optimize one thing for her. What's worse is that they used a heavily technical cart package called ProductCart. The cart runs on classic ASP, and I am not sure if you're aware of this, but many servers are no longer set up to handle this legacy format.
The real problem I am facing, per @activitysuper's response, is that the link in the category is the same as the link in the products section. What I am saying is that there's a completely different category that also features that same product, but under a different URL.
You are both right: this is probably something I can remedy on the server side. I was merely throwing this out there to see how other SEOs deal with having the same product in multiple categories.
Thanks for your time.
-
How do you claim to be part of a community when all you offer is criticism and, for that matter, complete ignorance? Do me a favor and never waste my time with such an ignorant response again. You know nothing about me, my client, or the background of the situation.
-
It may be tedious, but you need to do it, one way or another. Theoretically, these product duplicates could be severely harming your client's ranking ability.
Practically, I'm not seeing much evidence, though, of these duplicate paths or duplicate products in the Google index. I am seeing other duplicate pages, like search results and https: versions of your product pages. You have a few canonicalization issues going on.
Ideally, no matter what category path, you'll land on one URL. The very small usability consequences of the path change (in my experience, at least) are far outweighed by the risks of spinning off dozens of duplicates. As @activitysuper said, there should be a way to do this dynamically - you're changing a couple of templates, not individual product pages.
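As a rough sketch (I don't know ProductCart's templates, so every field and variable name here is hypothetical), the product template would build the canonical URL from the product itself and ignore whatever category path the visitor browsed through:

<%
' Hypothetical sketch only - ProductCart's real fields and variables will differ.
' Build one canonical URL per product, independent of the category path in the request.
Dim productSlug, canonicalUrl
productSlug = rsProduct("slug")  ' hypothetical recordset field
canonicalUrl = "http://www.babyblingstreet.com/product/" & productSlug
%>
<link rel="canonical" href="<%= canonicalUrl %>" />

Because that lives in the one product template, every category path that renders the product picks it up automatically.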
I would have to see the duplicate product URLs in action, though. I'm not finding that specific problem.
-
You must be able to dynamically code the canonical tags into those 'new products'.
The real question is why you have 2 pages at all. Surely you could have a link in the category and a link in the new products section both pointing to the same page?
-
How do you manage to pick up clients without an understanding of how to optimise their site? Seems a bit odd.
-
If you are using Magento Commerce, just select the option in Config.
If you are using something else, then you may need a plugin.
Any eCommerce software should have already run into this problem a couple of years ago.
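(If memory serves, the Magento setting is under System > Configuration > Catalog > Search Engine Optimizations - "Use Canonical Link Meta Tag For Products" - but check your version, as the exact path may vary.)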