eCommerce site - duplicate pages problem.
-
We have an eCommerce site with multiple products being displayed on a number of pages.
We use rel="next" and rel="prev" and have a "View All" page, which I understand Google should be able to find automatically.
-
Should we also be using a canonical tag to tell Google to give authority to the first page or the View All page? Or is our current use of the rel="next" and rel="prev" tags adequate?
-
We currently display 20 products per page. We are thinking of increasing this number to create fewer pages, which would make some of the later product pages redundant. If we add 301 redirects on the redundant pages, does anyone know what sort of impact this might have on traffic and SEO?
General thoughts from anyone who has had similar problems are welcome.
-
-
Many thanks, you have been most helpful.
Yes, I see your point. I think we will have a look at implementing this on a couple of categories where we can monitor traffic and rankings. Then, if it looks good, we will roll it out to the rest of the site.
Thank you.
Sarah
-
Essentially yes - pages 2+ of search just look "thin" to Google. They tend to have similar title tags, META descriptions, etc., and Google honestly isn't all that fond of indexing search pages in the first place (they don't want their search to land on your search). Those pages also don't tend to attract links or make a lot of sense for someone landing on them. By using META NOINDEX,FOLLOW, Google can still crawl through those search pages to deeper pages, but the search pages themselves don't dilute your overall site and search index.
Google's preferred method (or so they say) in 2012 is rel=prev/next, but I find that implementation can be much trickier than META NOINDEX. It's a difficult topic, and I honestly find that the ideal approach varies wildly from site to site. It's important to plan well, implement carefully, and measure the results.
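To make the mechanics concrete, here is a minimal sketch of what the NOINDEX,FOLLOW approach looks like in the head of page 2 onwards (the URL and title below are hypothetical, not from the site in question):

```html
<!-- <head> of a deep search/pagination page, e.g. page 2+ (illustrative) -->
<head>
  <title>Widgets - Page 2 | Example Store</title>
  <!-- Keep this page out of the index, but let Googlebot follow its links
       through to the deeper product pages -->
  <meta name="robots" content="noindex, follow">
</head>
```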
-
Hi Peter,
Many thanks for your answer. Very comprehensive and much appreciated. There are certainly some good suggestions here.
Just quickly: you mention putting NOINDEX, FOLLOW on every page from 2 or 3 onwards. I take it that's because later pages don't rank too well?
Is the idea behind it that the link juice is being diluted too much? By keeping only the first two pages indexed, say, would I stand a better chance of ranking higher?
I will pass your suggestions on to my developer and see what we can come up with from it. I will monitor and report back, hopefully with a sorted solution.
Once again, many thanks for the sound advice.
Sarah.
-
Unfortunately, pagination + sorts gets ugly fast. Technically, the rel=prev/next tags should contain the sort parameter, AND then you should canonical to the main pagination page. So, for example, if you had a page like:
www.example.com/search.php?page=2&sort=asc
You should have tags like:
- Rel=Prev: http://www.example.com/search.php?page=1&sort=asc
- Rel=Next: http://www.example.com/search.php?page=3&sort=asc
- Canonical: http://www.example.com/search.php?page=2
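Rendered as actual markup, those three tags would sit in the head of the sorted page 2 like this (same example URLs as above; note the canonical keeps the page number but drops the sort parameter):

```html
<!-- <head> of http://www.example.com/search.php?page=2&sort=asc -->
<link rel="prev" href="http://www.example.com/search.php?page=1&amp;sort=asc">
<link rel="next" href="http://www.example.com/search.php?page=3&amp;sort=asc">
<!-- Canonical points at the same page number, minus the sort parameter -->
<link rel="canonical" href="http://www.example.com/search.php?page=2">
```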
In practice, it's incredibly hard to implement. So, you could do a couple of things:
(1) Block the sort_by parameter with Google Webmaster Tools parameter handling
(2) Use META NOINDEX, FOLLOW on all pages 2+ of search and sort URLs
I don't find Robots.txt works that well, in practice, and 800K blocked URLs can make Google jump. I'm actually confused by how Google is crawling the sorts at all (since they're form-driven). It looks like you put the sorts in your pagination links. Would it be possible to store any sorts in a cookie or session variable and not add those to links?
Given your current situation, and that Google has indexed thousands of sort URLs (from what I can see), I think the Google Webmaster Tools approach might be the safest. This is a complex problem, though, and you may need to consult someone.
-
Hi ,
In answer to your point on Question 2: currently the maximum number of pages we have is four pages plus a View All for a few of our products, but most products are split across two pages plus a View All.
For the largest product example we have 83 products, broken down as follows: pages 1 to 4 have 20 products each, page 5 has 3, and View All has all 83 products. rel="prev" and rel="next" are on the pages, and View All has nothing on it (is that okay?). The title tags are duplicated across the numerous pages, so I was going to add "page 2", "page 3", "page 4" etc. to sort that out.
I was going to increase the number of products per page to 30, which would in effect take me down to three pages plus View All. More importantly, I thought I would also get stronger link value and less dilution, hence better SEO.
The pages don't rank particularly well at all, but on the Google speed test I think we score 85/100 anyway, so from a speed point of view it shouldn't be a problem. I was just worried that big changes like this could have a dramatic effect.
The URL, in case you're interested, is http://www.bestathire.co.uk/rent/Scaffold_towers/266
Many thanks
Very much appreciated.
Sarah.
-
Hi ,
Many Thanks for your reply,
We do have pagination and sorts, like listing products A-Z, Z-A, price low to high and high to low, etc., which all generate different URLs, but we have added rules to the robots.txt file telling Google not to spider them. See below.
Also, from looking at WMT, it says it has blocked 886,996 URLs in the past 90 days. Our site has approx. 54,000 indexed pages.
Disallow: */sort_by:Product.price%20ASC
Disallow: */sort_by:Product.price%20DESC
Disallow: */sort_by:Product.title%20ASC
Disallow: */sort_by:Product.title%20DESC
Disallow: */sort_by:Product.distance%20ASC
Disallow: */sort_by:Product.distance%20DESC
Disallow: */stealth:on
Are you suggesting we canonical the sorts as well, for safety, in case we have missed anything?
Sarah
-
(1) DON'T canonical to the first page of results - Google definitely has issues with that. If you've got rel=prev/next in place, then I wouldn't canonical to "View All", either. They're kind of competing signals. You can use rel=prev/next with rel=canonical, but it's a bit complicated. Basically, it's for situations where you have pagination AND some other parameter, like a sort.
(2) If you increase it, just make sure it doesn't negatively impact users or load times (might be worth A/B testing, honestly). Are you saying that you might end up with a URL like "?page=7" which basically doesn't exist, because now you'll have fewer pages? I think you might be safer just letting that 404 and having Google recrawl the new structure. The odds of having any links to page 7 of search results (inbound links, that is) are very low, and just letting those pages die off may be safer.
-
I think the best solution is to get something properly done on your website's URLs first: if you're displaying a page with 20 products (by default) and the URL has a complicated extension for reaching the next page (e.g. domain.com/?abc=123etc#321), that is a more significant problem to be concerned about than whether it's domain.com/category/page/1/ versus page/2/.
In theory, page/1/ and page/2/ (blog style) contain the same content as the home page (/1/ or /). One common practice is noindex,follow for any page from 2 onwards. You should definitely consider rel=canonical across the site, though; it's essential, as are rel="next" and rel="prev".
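As a sketch (the URLs here are illustrative, not from a real site), a blog-style page/2/ that uses both signals together would carry a self-referencing canonical alongside the prev/next links, rather than a canonical pointing back at page 1:

```html
<!-- <head> of domain.com/category/page/2/ (hypothetical structure) -->
<!-- Self-referencing canonical: do NOT point it at page 1 -->
<link rel="canonical" href="http://domain.com/category/page/2/">
<link rel="prev" href="http://domain.com/category/">
<link rel="next" href="http://domain.com/category/page/3/">
```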