What to do about similar product pages on major retail site
-
Hi all,
I have a dilemma and I'm hoping the community can guide me in the right direction. We're working with a major retailer on launching a local deals section of their website (what I'll call the "local site"). The company has 55 million products for one brand, and 37 million for another.
The main site (I'll call it the ".com version") is fairly well SEO'd, with flat architecture, clean URLs, microdata, canonical tags, good product descriptions, etc.
If you were looking for a refrigerator, you would use the faceted navigation and go from department > category > sub-category > product detail page.
The local site's purpose is to "localize" all of the store inventory and carry weekly offers and pricing specials. We will use an architecture similar to .com's, except it will sit under a /local/city-state/... sub-folder.
Ideally, if you're looking for a refrigerator in San Antonio, Texas, the local page should prove more relevant than the generic .com refrigerator pages. (The local pages list the addresses of all local stores in the footer and use location microdata as well; the main difference will be the prices.)
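To give a concrete picture, here is a rough sketch of the kind of location markup those pages carry. I've written it as JSON-LD for brevity (the actual pages use microdata syntax), and every name, address, and URL below is a placeholder rather than our real data:

```html
<!-- Sketch only: store markup for a local page. All values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Store",
  "name": "Example Retailer - San Antonio Downtown",
  "url": "https://www.example.com/local/san-antonio-tx/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "San Antonio",
    "addressRegion": "TX",
    "postalCode": "78205",
    "addressCountry": "US"
  }
}
</script>
```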
MY QUESTION IS THIS:
If we pull the exact same product pages/descriptions from the .com database for use in the local site, are we creating a duplicate content problem that will hurt the rest of the site?
I don't think I can canonicalize to the .com generic product page - I actually want those local pages to show up at the top. Obviously, we don't want to copy product descriptions across root domains, but how is it handled across the SAME root domain?
Ideally, it would be great to have listings from both the .com and the /local pages in the SERPs.
What do you all think?
Ryan
-
Hi Ryan,
I guess the first point here is that Google doesn't treat this sort of filtering as "penalisation"; it's just filtering out two or more versions of the same content because it believes (sometimes mistakenly) that users don't need to see two versions of the same thing. This gets REALLY tricky in fields like real estate, where all the aggregators in the same town have access to pretty much the same feeds of properties.
If Google were perfect, you'd put up the two pieces of identical content for all 55 million products, and Google would serve the right one for the appropriate query, like the example above ("fridge sale san antonio" brings up the local page; "refrigerator" brings up your main site). And this might happen, because Google is getting better at this sort of query-appropriate result. We still recommend against duplicate content solely because we can't be sure that Google will get it right.
As an aside, it would be so great if they worked on a tool for localisation in the same way they have given us the hreflang tag for internationalisation. rel="city" or similar would be awesome, especially for big countries.
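For comparison, the existing internationalisation markup looks like this, and an imagined local equivalent could follow the same pattern. The URLs are made up, and the second block is pure wishful thinking - no such attribute exists today:

```html
<!-- Real, existing hreflang annotation (URLs hypothetical): -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/fridges/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/fridges/" />

<!-- Imagined localisation equivalent - this does NOT exist: -->
<link rel="city" href="https://www.example.com/local/san-antonio-tx/fridges/" />
```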
Your idea about serving the content from a shared source will certainly work (an iframe, text hosted on a separate URL, JavaScript, etc.). The pages serving this text won't be credited with that text's content, which of course removes its SEO value.
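To illustrate the iframe flavour, a bare-bones sketch (product, price, and URLs all hypothetical):

```html
<!-- The local page keeps its unique content (price, store details) in its
     own HTML; the shared description is framed in from a separate URL, so
     it is not part of this page's own markup - and earns it no credit. -->
<h1>Example Fridge 3000 - San Antonio, TX</h1>
<p>This week's local price: $899</p>
<iframe src="https://www.example.com/shared/descriptions/fridge-3000.html"
        title="Product description"></iframe>
```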
-
Hi Jane, thanks for the response!
I can't understand why Google or any other search engine would penalize a brand for having the same product detail in more than one location on the same root domain. It's just not feasible to rewrite all of the product descriptions for 55 million products. The only difference is going to be the price, plus some localized content on the page in terms of store locations and addresses (perhaps multiple in one area).
What if - kind of like your M&S example - the local product pages pulled product descriptions from another location on the site, but displayed them in a modal window? A JS event would display the proper descriptions and details for the user experience, but the HTML would be devoid of any "duplicate" product description content. Something like the sketch below:
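A rough sketch of what I mean - the endpoint, element IDs, and product are all made up for illustration:

```html
<!-- The static HTML contains no product description at all; clicking the
     button fetches it from a shared endpoint and shows it in a modal. -->
<button id="show-details">View product details</button>
<div id="details-modal" hidden></div>

<script>
  document.getElementById('show-details').addEventListener('click', async () => {
    const modal = document.getElementById('details-modal');
    // The description lives at a separate URL, never in this page's source.
    const response = await fetch('/shared/descriptions/fridge-3000.json');
    const data = await response.json();
    modal.textContent = data.description;
    modal.hidden = false;
  });
</script>
```

(One caveat I'm aware of: Google renders JavaScript these days, so whether the fetched text really stays out of the index is exactly the kind of thing we'd need to test.)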
-
Hi Ryan,
It's going to be hard to do this without creating duplicates - if they aren't commissioning rewrites of the descriptions but just pulling from the database, identical content like this is far from ideal.
One school of thought is that there really isn't any such thing as a "duplicate content penalty" unless you have some huge, gratuitous problem that results in a Panda issue. Google simply chooses the version of the content it favours and drops the other. The local site would still be much more relevant for a query like "fridge sale san antonio".
An example of a big retailer that has a similar(ish) site at the moment is Marks & Spencer Outlet here in the UK (outlet.marksandspencer.com). M&S is probably the most recognisable high street brand in the UK, to give you a perspective on size.
Looking at what they're doing, they're listing pages like this: http://outlet.marksandspencer.com/Limited-Edition-Jacquard-Textured-T69-1604J-S/dp/B00IIP7GY2?field_availability=-1&field_browse=1698309031&id=Limited+Edition+Jacquard+Textured+T69-1604J-S&ie=UTF8&refinementHistory=subjectbin%2Csize_name%2Ccolor_map%2Cbrandtextbin%2Cprice&searchNodeID=1698309031&searchPage=1&searchRank=-product_site_launch_date&searchSize=12
This is the same product as this: http://www.marksandspencer.com/jacquard-textured-coat-with-wool/p/p60056127. I love it that the "outlet" version is more expensive... anyway...
The product details, which are all included in the HTML of the main site, are not included in the Outlet page. The Outlet URL is indexed (which queries it ranks for, or could potentially rank for, is unknown), but I would be keen to hypothesise and experiment with the idea that if that product were on a page about it only being available at M&S Moorgate, and looking for coats at M&S Moorgate were as popular a query as [fridge sale location], the Outlet page would rank.
You will never get an SEO to say you should "copy and paste" descriptions across domains or within them; essentially, the pages have to provide a service or information that makes them worth ranking for relevant queries.
Related Questions
-
How do we decide which pages to index/de-index? Help for a 250k page site
At Siftery (siftery.com) we have about 250k pages, most of them reflected in our sitemap. Though after submitting a sitemap we started seeing an increase in the number of pages Google indexed, in the past few weeks progress has slowed to a crawl at about 80k pages, and in fact has been coming down very marginally. Due to the nature of the site, a lot of the pages likely look very similar to search engines. We've also broken our sitemap down into an index, so we know that most of the indexation problems come from a particular type of page (company profiles). Given these facts, what do you recommend we do? Should we de-index all of the pages that are not being picked up by the Google index (and are therefore likely seen as low quality)? There seems to be a school of thought that de-indexing "thin" pages improves the ranking potential of the indexed pages. We have plans for enriching and differentiating the pages that are being picked up as thin (Moz itself flags them as 'duplicate' pages even though they're not). Thanks for sharing your thoughts and experiences!
Intermediate & Advanced SEO | ggiaco-siftery
-
Building a product clients will integrate into their sites: What is the best way to utilize my clients' unique domain names?
I'm designing a hosted product my clients will integrate into their websites; their end users would access it via my clients' customer-facing websites. It is a product my clients pay for, which provides a service to their end users, who would have to log in to my product via a link provided by my clients. Most clients would choose to incorporate this link prominently on their home page and site nav. All clients will be in the same vertical market, so their sites will be keyword-rich and related to my site. Many may even be .orgs and .edus. The way I see it, there are three main ways I could set this up within the product, and I want to know which is most beneficial, or if I'm missing anything.
1: They set up a subdomain at their domain that serves content from my domain. product.theirdomain.com would render content from mydomain.com's database and could have footer and/or other nofollow links to mydomain.com with target keywords. The risk I see here is having hundreds of sites with the same target keyword linking back to my domain. This may be the worst option, as I'm not sure the nofollow will help, because I know Google considers this kind of link to be a link scheme: https://support.google.com/webmasters/answer/66356?hl=en
2: They link to a subdomain on mydomain.com from their nav/site. Their nav would include an actual link to product.mydomain.com/theircompanyname. Each client would have a different "theircompanyname" link. They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc.). I would have no control aside from requiring them to link to that URL on my server.
3: They link to a subdirectory on mydomain.com from their nav/site. Their nav would include an actual link to mydomain.com/product/theircompanyname. Each client would have a different "theircompanyname" link. They would decide and/or create their link method as above. I would have no control aside from requiring them to link to that URL on my server.
In all scenarios, my marketing content would be set up around mydomain.com, both as static content and a blog directory, all with SEO-attractive URL slugs. I'm leaning towards option 3, but would like input!
Intermediate & Advanced SEO | emzeegee
-
SEO implication of adding large number of new product pages
If I have an eCommerce website containing 10,000 product pages and then add 10,000 new product pages via a bulk upload (with limited/basic but unique content), does this pose any SEO risk? I am obviously aware of the risks of adding a large amount of low-quality content to the website, which is not the case here; what I am trying to ascertain is whether simply doubling the number of pages in itself poses any risk to our SEO efforts. Does it flag to the search engines that something "spammy" is happening (even if it's not)?
Intermediate & Advanced SEO | DHS_SH
-
Google de-indexed a page on my site
I have a site which is around 9 months old. For most search terms we rank fine (including top-3 rankings for competitive terms). Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared altogether for over a week. As a test I added a similar page to one of my other sites and it ranks fine. I've checked Webmaster Tools and there is nothing of note there. I'm not really sure what to do at this stage. Any advice would be much appreciated!
Intermediate & Advanced SEO | deelo555
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly, so as not to lose everything we've worked on up to this point using the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local pages subdomain set-up: stores.websitename.com. How we plan on adding our new local subfolder set-up: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
Intermediate & Advanced SEO | SEO.CIC
-
Should I use the main keyword in the title tag for the site on all category pages?
I am pretty excited about changing all my title tags (for the most important 7 pages), since I have seen my rankings jump in the SERP just by adding the main keyword for my website to the title tag. To make it easier I will explain my business: I run an online jewelry shop, so the keyword I want to use is "Jewelry online", and for the main categories "Necklace", "Rings" and "Bracelets". What I am unsure about is whether to use all the keywords in the main page's title tag or just the main keyword "Jewelry online" - I don't want to create competition between my own pages, of course.
Jewelry Online - Trendy Fashion Jewelry | Homepage
or
Jewelry Online - Necklace, Rings, Bracelets | Homepage
And the same goes for the main categories - should I include "jewelry online" or not? Like:
Bracelets - Fashion Jewelry Online | Homepage
or
Bracelets - Trendy Bangles and Arm Cuffs | Homepage
Any suggestions on best practice for the title tag on the main page and the main categories? Thanks
Intermediate & Advanced SEO | ikomorin
-
Should "View All Products" be the canonical page?
We currently have "view 12" as the default setting when someone arrives to www.mysite.com/subcategory-page.aspx. We have been advised to change the default to "view all products" and make that the canonical page to ensure all of our products get indexed. My concern is that doing this will increase the page load time and possibly hurt rankings. Does it make sense to change all our our subcategory pages to show all the products when someone visits the page? Most sites seem to have a smaller number of products as the default.
Intermediate & Advanced SEO | pbhatt
-
How do Google Site Search pages rank
We have started using Google Site Search (via an XML feed from Google) to power our site search. So we have a whole load of pages we could link to of the format /search?q=keyword, and we are considering doing away with our more traditional category listing pages (e.g. /biology - not powered by GSS), which account for many of our current natural-search landing pages. My question is: would Googlebot treat these search pages any differently? My fear is that it would somehow see them as duplicate search results and downgrade their links. However, since we are coding the XML from GSS into our own HTML format, it may not even be able to tell.
Intermediate & Advanced SEO | EdwardUpton61