What to do about similar product pages on major retail site
-
Hi all,
I have a dilemma and I'm hoping the community can guide me in the right direction. We're working with a major retailer on launching a local deals section of their website (what I'll call the "local site"). The company has 55 million products for one brand, and 37 million for another.
The main site (I'll call it the ".com version") is fairly well SEO'd: flat architecture, clean URLs, microdata, canonical tags, good product descriptions, etc.
If you were looking for a refrigerator, you would use the faceted navigation and go from department > category > sub-category > product detail page.
The local site's purpose is to "localize" all of the store inventory and have weekly offers and pricing specials. We will use a similar architecture as .com, except it will be under a /local/city-state/... sub-folder.
Ideally, if you're looking for a refrigerator in San Antonio, Texas, the local page should prove more relevant than the generic .com refrigerator pages. (The local pages have the addresses of all local stores in the footer and use location microdata as well; the main difference will be the prices.)
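For reference, the location markup on those local pages is along these lines. This is a simplified, hypothetical sketch (the store name and address below are invented, and I'm showing the JSON-LD form; the microdata-attribute form carries the same information):

```html
<!-- Hypothetical example: schema.org markup for one local store in the footer -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Store",
  "name": "Example Retailer - San Antonio",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Blvd",
    "addressLocality": "San Antonio",
    "addressRegion": "TX",
    "postalCode": "78205"
  }
}
</script>
```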
MY QUESTION IS THIS:
If we pull the exact same product pages/descriptions from the .com database for use in the local site, are we creating a duplicate content problem that will hurt the rest of the site?
I don't think I can canonicalize to the .com generic product page - I actually want those local pages to show up at the top. Obviously, we don't want to copy product descriptions across root domains, but how is it handled across the SAME root domain?
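To illustrate what I mean (URLs invented): rather than pointing the local page's canonical back at the generic .com product page, each local page would carry a self-referencing canonical so it can be indexed and compete in its own right:

```html
<!-- On the hypothetical San Antonio local product page -->
<!-- What I DON'T want: canonicalising away to the generic .com page -->
<!-- <link rel="canonical" href="https://www.example.com/appliances/refrigerators/model-123" /> -->

<!-- What I'm after: a self-referencing canonical on the local URL -->
<link rel="canonical" href="https://www.example.com/local/san-antonio-tx/refrigerators/model-123" />
```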
Ideally, it would be great if we had a listing from both the .com and the /local pages in the SERPs.
What do you all think?
Ryan
-
Hi Ryan,
I guess the first point here is that Google doesn't treat this sort of filtering as "penalisation"; it's just filtering out two or more versions of the same content because it believes (sometimes mistakenly) that users don't need to see two versions of the same thing. This gets REALLY tricky in fields like real estate, where all the aggregators in the same town have access to pretty much the same feeds of properties.
If Google were perfect, you'd put up the two pieces of identical content for all 55 million products, and Google would serve the right one for the appropriate query, as in the example above ("fridge sale san antonio" brings up the local page; "refrigerator" ranks your main site). And this might happen, because Google is getting better at this sort of query-appropriate result. We still recommend against duplicate content, solely because we can't be sure that Google will get it right.
As an aside, it would be so great if they worked on a tool for localisation in the same way that they have given us the href lang tag for internationalisation. rel="city" or similar would be awesome, especially for big countries.
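For comparison, the internationalisation markup I mean is the hreflang annotation; the localisation version below is pure wishful thinking, not a real standard (all URLs invented):

```html
<!-- Real, existing hreflang annotations for international variants -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/fridges" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/fridges" />

<!-- The hypothetical local equivalent I wish existed (NOT a real standard): -->
<!-- <link rel="alternate" data-city="san-antonio-tx" href="https://www.example.com/local/san-antonio-tx/fridges" /> -->
```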
Your idea about serving the content from a shared source will certainly work (an iframe, text hosted on a separate URL, JS, etc.). The pages serving this text clearly won't be credited with it, which of course removes its SEO value.
-
Hi Jane, thanks for the response!
I can't understand why Google or any other search engine would penalize a brand for having the same product detail in more than one location on the same root domain. It's just not feasible to rewrite all of the product descriptions for 55 million products. The only difference is going to be the price, plus some localized content on the page in terms of store locations and addresses (perhaps multiple in one area).
What if - kind of like your M&S example - the local product pages pulled product descriptions from another location on the site, but displayed them in a modal window? A JS event would display the proper descriptions and details for the user experience, but the HTML would be devoid of any "duplicate" product description content.
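To make that concrete (the endpoint and markup below are invented): the initial HTML would contain no description text at all, and a click would fetch it from the shared .com source into the modal:

```html
<!-- Hypothetical: the description is NOT in the initial HTML; it is fetched
     from a shared endpoint only when the user opens the modal -->
<button id="show-details" data-product-id="12345">View product details</button>
<div id="details-modal" hidden></div>
<script>
  document.getElementById('show-details').addEventListener('click', async (e) => {
    // Read the id before any await, while the event is still being dispatched
    const id = e.currentTarget.dataset.productId;
    const resp = await fetch(`/api/product-descriptions/${id}`); // shared .com source
    const modal = document.getElementById('details-modal');
    modal.textContent = await resp.text();
    modal.hidden = false;
  });
</script>
```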
-
Hi Ryan,
It's going to be hard to do this without creating duplicates - if they aren't commissioning re-writes of descriptions but just pulling from the database, identical content like this is far from ideal.
One school of thought is that there really isn't any such thing as a "duplicate content penalty" unless you have some huge, gratuitous problem that results in a Panda issue. Google simply chooses the version of the content it favours and drops the other. The local site would still be much more relevant for a query like "fridge sale san antonio".
An example of a big retailer that has a similar(ish) site at the moment is Marks & Spencer Outlet here in the UK (outlet.marksandspencer.com). M&S is probably the most recognisable high street brand in the UK, to give you a perspective on size.
Looking at what they're doing, they're listing pages like this: http://outlet.marksandspencer.com/Limited-Edition-Jacquard-Textured-T69-1604J-S/dp/B00IIP7GY2?field_availability=-1&field_browse=1698309031&id=Limited+Edition+Jacquard+Textured+T69-1604J-S&ie=UTF8&refinementHistory=subjectbin%2Csize_name%2Ccolor_map%2Cbrandtextbin%2Cprice&searchNodeID=1698309031&searchPage=1&searchRank=-product_site_launch_date&searchSize=12
This is the same product as this: http://www.marksandspencer.com/jacquard-textured-coat-with-wool/p/p60056127. I love it that the "outlet" version is more expensive... anyway...
The product details, which are all included in the HTML of the main site, are not included in the Outlet page. The Outlet URL is indexed (what queries it ranks for, or could potentially rank for, is unknown), but I would be keen to hypothesise and experiment with this idea: if that product were on a page about it being available only at M&S Moorgate, and looking for coats at M&S Moorgate were as popular a query as [fridge sale location], the Outlet page would rank.
You will never get an SEO to say that you should "copy and paste" descriptions across domains or within them, but essentially the pages have to provide a service / information that makes them worth ranking for relevant queries.