What to do about similar product pages on major retail site
-
Hi all,
I have a dilemma and I'm hoping the community can guide me in the right direction. We're working with a major retailer on launching a local deals section of their website (what I'll call the "local site"). The company carries 55 million products under one brand and 37 million under another.
The main site (I'll call it the ".com version") is fairly well SEO'd: flat architecture, clean URLs, microdata, canonical tags, good product descriptions, etc.
If you were looking for a refrigerator, you would use the faceted navigation and go from department > category > sub-category > product detail page.
The local site's purpose is to "localize" all of the store inventory and have weekly offers and pricing specials. We will use a similar architecture as .com, except it will be under a /local/city-state/... sub-folder.
Ideally, if you're looking for a refrigerator in San Antonio, Texas, the local page should prove more relevant than the .com generic refrigerator pages. (The local pages list the addresses of all local stores in the footer and use location microdata as well; the main difference will be the prices.)
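For what it's worth, the location markup on those local pages might look something like this sketch (shown as JSON-LD rather than inline microdata purely for brevity; the store details and slugs are invented, but the field names follow schema.org's LocalBusiness/PostalAddress types):

```javascript
// Hypothetical sketch: generate the LocalBusiness structured data embedded
// in the footer of a /local/san-antonio-tx/ page. All store data invented.
const store = {
  name: "Example Retailer - San Antonio #1042",
  street: "100 Example Blvd",
  city: "San Antonio",
  region: "TX",
  postalCode: "78201",
};

function localBusinessJsonLd(s) {
  // Produces the contents of a <script type="application/ld+json"> block.
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    name: s.name,
    address: {
      "@type": "PostalAddress",
      streetAddress: s.street,
      addressLocality: s.city,
      addressRegion: s.region,
      postalCode: s.postalCode,
    },
  });
}

console.log(localBusinessJsonLd(store));
```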
MY QUESTION IS THIS:
If we pull the exact same product pages/descriptions from the .com database for use in the local site, are we creating a duplicate content problem that will hurt the rest of the site?
I don't think I can canonicalize to the .com generic product page - I actually want those local pages to show up at the top. Obviously, we don't want to copy product descriptions across root domains, but how is it handled across the SAME root domain?
Ideally, it would be great if we had a listing from both the .com and the /local pages in the SERPs.
What do you all think?
Ryan
-
Hi Ryan,
I guess the first point here is that Google doesn't treat this sort of filtering as "penalisation"; it's just filtering out two or more versions of the same content because it believes (sometimes mistakenly) that users don't need to see two versions of the same thing. This gets REALLY tricky in fields like real estate, where all the aggregators in the same town have access to pretty much the same feeds of properties.
If Google were perfect, you'd put up the two pieces of identical content for all 55 million products, and Google would serve the right one for the appropriate query, as in the example above ("fridge sale san antonio" brings up the local page; "refrigerator" brings up your main site). And this might happen, because Google is getting better at this sort of query-appropriate result. We still recommend against duplicate content solely because we can't be sure that Google will get it right.
As an aside, it would be great if they worked on a tool for localisation in the same way that they have given us the hreflang tag for internationalisation. rel="city" or similar would be awesome, especially for big countries.
Your idea about serving the content from a shared source will certainly work (an iframe, text hosted on a separate URL, JS, etc.). The pages serving this text clearly won't be credited with that text's content, which of course removes its SEO value.
-
Hi Jane, thanks for the response!
I can't understand why Google or any other search engine would penalize a brand for having the same product detail in more than one location on the same root domain. It's just not feasible to re-write all of the product descriptions for 55 million products. The only difference is going to be the price, and some localized content on the page in terms of store locations and addresses (perhaps multiple in one area).
What if - kind of like your M&S example - the local product pages pulled product descriptions from another location on the site, but displayed them in a modal window - so a JS event displayed the proper descriptions and details for the user experience, but the HTML is devoid of any "duplicate" product description content?
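The click-to-load modal you describe might look roughly like this (the endpoint and markup are hypothetical). One caveat worth flagging: Googlebot now renders JavaScript, so content fetched on page load is often still indexed; content fetched only after a user click is much less likely to be.

```javascript
// Hypothetical sketch of the modal idea: the description is absent from
// the initial HTML and is only fetched when the user clicks a trigger.
function descriptionUrl(productId) {
  // Shared .com source for the description text (invented endpoint).
  return `/api/descriptions/${encodeURIComponent(productId)}`;
}

// Browser-only wiring using standard DOM APIs; assumes markup like
// <button data-product-id="123">Details</button> and
// <dialog id="description-modal"><div class="modal-body"></div></dialog>.
function attachModalLoader(doc) {
  doc.addEventListener("click", async (event) => {
    const trigger = event.target.closest("[data-product-id]");
    if (!trigger) return;
    const res = await fetch(descriptionUrl(trigger.dataset.productId));
    const modal = doc.getElementById("description-modal");
    modal.querySelector(".modal-body").innerHTML = await res.text();
    modal.showModal();
  });
}
```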
-
Hi Ryan,
It's going to be hard to do this without creating duplicates - if they aren't commissioning re-writes of descriptions but just pulling from the database, identical content like this is far from ideal.
One school of thought is that there really isn't any such thing as a "duplicate content penalty" unless you have some huge, gratuitous problem that results in a Panda issue. Google simply chooses the version of the content it favours and drops the other. The local site would still be much more relevant for a query like "fridge sale san antonio".
An example of a big retailer that has a similar(ish) site at the moment is Marks & Spencer Outlet here in the UK (outlet.marksandspencer.com). M&S is probably the most recognisable high street brand in the UK, to give you a perspective on size.
Looking at what they're doing, they're listing pages like this: http://outlet.marksandspencer.com/Limited-Edition-Jacquard-Textured-T69-1604J-S/dp/B00IIP7GY2?field_availability=-1&field_browse=1698309031&id=Limited+Edition+Jacquard+Textured+T69-1604J-S&ie=UTF8&refinementHistory=subjectbin%2Csize_name%2Ccolor_map%2Cbrandtextbin%2Cprice&searchNodeID=1698309031&searchPage=1&searchRank=-product_site_launch_date&searchSize=12
This is the same product as this: http://www.marksandspencer.com/jacquard-textured-coat-with-wool/p/p60056127. I love it that the "outlet" version is more expensive... anyway...
The product details, which are all included in the HTML of the main site, are not included on the Outlet page. The Outlet URL is indexed (which queries it ranks for, or could potentially rank for, is unknown), but I would be keen to hypothesise/experiment with the idea that if that product were on a page about it only being available at M&S Moorgate, and looking for coats at M&S Moorgate were as popular a query as [fridge sale location], the Outlet page would rank.
You will never get an SEO to say that you should "copy and paste" descriptions across domains or within them, but essentially the pages have to provide a service / information that makes them worth ranking for relevant queries.