What to do about similar product pages on major retail site
-
Hi all,
I have a dilemma and I'm hoping the community can guide me in the right direction. We're working with a major retailer on launching a local deals section of their website (what I'll call the "local site"). The company has 55 million products for one brand, and 37 million for another.
The main site (I'll call it the ".com version") is fairly well SEO'd, with flat architecture, clean URLs, microdata, canonical tags, good product descriptions, etc.
If you were looking for a refrigerator, you would use the faceted navigation and go from department > category > sub-category > product detail page.
The local site's purpose is to "localize" all of the store inventory and offer weekly deals and pricing specials. We will use a similar architecture to .com, except everything will sit under a /local/city-state/... sub-folder.
Ideally, if you're looking for a refrigerator in San Antonio, Texas, then the local page should prove more relevant than the generic .com refrigerator pages. (The local pages have the addresses of all local stores in the footer and use location microdata as well; the main difference will be the prices.)
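For illustration, footer markup along those lines might look like this (the store name and address here are invented, not from the actual site):

```html
<!-- Hypothetical footer block for a San Antonio local page -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Store - San Antonio</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example Rd</span>,
    <span itemprop="addressLocality">San Antonio</span>,
    <span itemprop="addressRegion">TX</span>
  </div>
</div>
```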
MY QUESTION IS THIS:
If we pull the exact same product pages/descriptions from the .com database for use in the local site, are we creating a duplicate content problem that will hurt the rest of the site?
I don't think I can canonicalize to the generic .com product page - I actually want those local pages to show up at the top. Obviously, we don't want to copy product descriptions across root domains, but how is it handled within the SAME root domain?
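To make the two options concrete (URLs invented for illustration): the first line is what's being ruled out here, since it would effectively take the local page out of the running; the second is the self-referencing canonical each local page would carry if it's meant to rank in its own right.

```html
<!-- Option being ruled out: the local page defers to the generic .com page -->
<link rel="canonical" href="https://www.example.com/appliances/refrigerators/model-123/" />

<!-- Desired option: the local page points to itself and stays eligible to rank -->
<link rel="canonical" href="https://www.example.com/local/san-antonio-tx/appliances/refrigerators/model-123/" />
```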
Ideally, it would be great if we had a listing from both the .com and the /local pages in the SERPs.
What do you all think?
Ryan
-
Hi Ryan,
I guess the first point here is that Google doesn't treat this sort of filtering as "penalisation"; it's just filtering out two or more versions of the same content because it believes (sometimes mistakenly) that users don't need to see two versions of the same thing. This gets REALLY tricky in fields like real estate, where all the aggregators in the same town have access to pretty much the same feeds of properties.
If Google were perfect, you'd put up the two pieces of identical content for all 55 million products, and Google would serve the right one for the appropriate query, as in the example above ("fridge sale san antonio" brings up the local page; "refrigerator" ranks your main site). And this might happen, because Google is getting better at this sort of query-appropriate result. We still recommend against duplicate content solely because we can't be sure that Google will get it right.
As an aside, it would be great if they worked on a tool for localisation in the same way they have given us the hreflang attribute for internationalisation. rel="city" or similar would be awesome, especially for big countries.
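For comparison, the existing internationalisation markup being referred to looks like this (example.com URLs invented); there is currently no equivalent for pointing at city-level variants within one country:

```html
<!-- hreflang: tells Google which language/region variant to serve -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/fridges/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/fridges/" />
```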
Your idea of serving the content from a shared source will certainly work (iframe, text hosted on a separate URL, JS, etc.). The pages serving this text clearly won't be credited with that text's content, which of course removes its SEO value.
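The iframe variant, for instance, could be as simple as this (the src path is invented): the description is rendered for users but lives on its own URL, so it isn't part of the embedding page's own HTML.

```html
<!-- Hypothetical: shared description hosted at its own URL and embedded.
     The embedding page's HTML contains only the iframe, not the text. -->
<iframe src="/shared/descriptions/refrigerator-model-123.html"
        title="Product description"></iframe>
```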
-
Hi Jane, thanks for the response!
I can't understand why Google or any other search engine would penalize a brand for having the same product detail in more than one location on the same root domain. It's just not feasible to rewrite the product descriptions for 55 million products. The only difference is going to be the price, plus some localized content on the page in terms of store locations and addresses (perhaps multiple in one area).
What if - kind of like your M&S example - the local product pages pulled product descriptions from another location on the site, but displayed them in a modal window? A JS event would display the proper descriptions and details for the user experience, but the HTML would be devoid of any "duplicate" product description content.
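That modal idea can be sketched in a few lines of JS (the endpoint path and function names below are invented for illustration, not from the actual site): the initial HTML ships without the description, and the text is only fetched into the DOM when a user opens the modal.

```javascript
// Hypothetical sketch of the modal approach: the shared description lives at
// its own endpoint, so the crawled HTML of the local page never contains it.

// Build the URL of the shared description endpoint (invented path).
function descriptionUrl(productId) {
  return "/api/product-descriptions/" + encodeURIComponent(productId);
}

// On a click event, fetch the description and inject it into the modal.
// Browser-only: relies on fetch() and the DOM.
function openDescriptionModal(productId, modalEl) {
  return fetch(descriptionUrl(productId))
    .then(function (res) { return res.text(); })
    .then(function (text) {
      modalEl.textContent = text; // shown to the user, absent from the source HTML
      modalEl.style.display = "block";
    });
}
```

One caveat: if Googlebot renders the JS when a crawler-visible event fires, it may still associate the text with the page, so this isn't a guaranteed way to avoid the duplicate.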
-
Hi Ryan,
It's going to be hard to do this without creating duplicates - if they aren't commissioning rewrites of the descriptions but just pulling from the database, identical content like this is far from ideal.
One school of thought is that there really isn't any such thing as a "duplicate content penalty" unless you have some huge, gratuitous problem that results in a Panda issue. Google simply chooses the version of the content it favours and drops the other. The local site would still be much more relevant for a query like "fridge sale san antonio".
An example of a big retailer that has a similar(ish) site at the moment is Marks & Spencer Outlet here in the UK (outlet.marksandspencer.com). M&S is probably the most recognisable high street brand in the UK, to give you a perspective on size.
Looking at what they're doing, they're listing pages like this: http://outlet.marksandspencer.com/Limited-Edition-Jacquard-Textured-T69-1604J-S/dp/B00IIP7GY2?field_availability=-1&field_browse=1698309031&id=Limited+Edition+Jacquard+Textured+T69-1604J-S&ie=UTF8&refinementHistory=subjectbin%2Csize_name%2Ccolor_map%2Cbrandtextbin%2Cprice&searchNodeID=1698309031&searchPage=1&searchRank=-product_site_launch_date&searchSize=12
This is the same product as this: http://www.marksandspencer.com/jacquard-textured-coat-with-wool/p/p60056127. I love that the "outlet" version is more expensive... anyway...
The product details, which are all included in the HTML of the main site, are not included in the Outlet page. The Outlet URL is indexed (what queries it ranks for, or could potentially rank for, is unknown) - but I would be keen to hypothesise / experiment with the idea that if that product were on a page about it only being available at M&S Moorgate, and looking for coats at M&S Moorgate were as popular a query as [fridge sale location], the Outlet page would rank.
You will never get an SEO to say that you should "copy and paste" descriptions across domains or within them, but essentially the pages have to provide a service / information that makes them worth ranking for relevant queries.