ECommerce site - Duplicate pages problem.
-
We have an eCommerce site with multiple products being displayed on a number of pages.
We use rel="next" and rel="prev" and have a "View All" page, which I understand Google should be able to find automatically.
-
Should we also be using a canonical tag to tell Google to give authority to the first page or to the View All page? Or is our current use of the rel=next and rel=prev tags adequate?
-
We currently display 20 products per page. We were thinking of increasing this so there are fewer pages, which would make some of the later product pages redundant. If we add 301 redirects on the redundant pages, does anyone know what sort of impact this might have on traffic and SEO?
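For context, the kind of redirect we'd be adding for a redundant page would be something like this (a hypothetical Apache .htaccess sketch - our real URLs differ):

```apache
# Hypothetical example: after increasing products per page,
# page 5 of a category no longer exists, so permanently
# redirect it to the first page of the same category.
Redirect 301 /category/page/5/ /category/
```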
General thoughts from anyone who has had similar problems are welcome.
-
-
Many thanks, you have been most helpful.
Yes, I see your point. I think we will have a look at implementing this on a couple of categories where we can monitor traffic and rankings. Then, if it looks good, we will roll it out to the rest of the site.
Thank you.
Sarah
-
Essentially yes - pages 2+ of search just look "thin" to Google. They tend to have similar title tags, META descriptions, etc., and Google honestly isn't all that fond of indexing search pages in the first place (they don't want their search to land on your search). Those 2+ pages also don't tend to attract links or make a lot of sense for someone landing on them. By using META NOINDEX,FOLLOW, Google can still crawl through those searches to deeper pages, but the actual search pages don't dilute your overall site in the search index.
Google's preferred method (or so they say) in 2012 is rel=prev/next, but I find that implementation can be much trickier than META NOINDEX. It's a difficult topic, and I honestly find that the ideal approach varies wildly from site to site. It's important to plan well, implement carefully, and measure the results.
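For reference, the NOINDEX,FOLLOW approach is a single tag in the head of each search page from page 2 onward (a sketch; exact placement depends on your templates):

```html
<!-- On pages 2+ of search results: keep the page out of the index,
     but let Google follow its links through to deeper pages. -->
<meta name="robots" content="noindex, follow">
```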
-
Hi Peter,
Many thanks for your answer - very comprehensive and much appreciated. There are certainly some good suggestions here.
Just quickly: you mention putting a NOINDEX,FOLLOW on every page from 2 or 3 onwards. I take it that's because later pages don't rank too well?
Is the idea behind it that the link juice is being diluted too much, and that by keeping only, say, the first 2 pages indexed I would stand a better chance of ranking higher?
I will pass your suggestions on to my developer and see what we can come up with from them. Will monitor and report back, hopefully with a sorted solution.
Once again, many thanks for the sound advice.
Sarah.
-
Unfortunately, pagination + sorts gets ugly fast. Technically, the rel=prev/next tags should contain the sort parameter AND then you should canonical to the main pagination page. So, for example, if you had a page like:
www.example.com/search.php?page=2&sort=asc
You should have tags like:
- Rel=Prev: http://www.example.com/search.php?page=1&sort=asc
- Rel=Next: http://www.example.com/search.php?page=3&sort=asc
- Canonical: http://www.example.com/search.php?page=2
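In the head of that sorted page-2 URL, those three tags would look something like this (a sketch using the example URLs above):

```html
<link rel="prev" href="http://www.example.com/search.php?page=1&amp;sort=asc">
<link rel="next" href="http://www.example.com/search.php?page=3&amp;sort=asc">
<link rel="canonical" href="http://www.example.com/search.php?page=2">
```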
In practice, it's incredibly hard to implement. So, you could do a couple of things:
(1) Block the sort_by parameter with Google Webmaster Tools parameter handling
(2) Use META NOINDEX, FOLLOW on all pages 2+ of search and sort URLs
I don't find robots.txt works that well in practice, and 800K blocked URLs can make Google jump. I'm actually confused by how Google is crawling the sorts at all (since they're form-driven). It looks like you put the sorts in your pagination links. Would it be possible to store any sorts in a cookie or session variable and not add them to links?
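To illustrate the idea, here's a minimal sketch (Python for illustration; the function and parameter names are hypothetical) of building pagination links that deliberately drop the sort parameter, so the sort state can live in a cookie or session instead of the URL:

```python
from urllib.parse import urlencode

def pagination_url(base, page, params=None):
    """Build a pagination link that carries only the page number.

    Any sort state (e.g. sort_by) is deliberately stripped so it can
    be kept in a cookie or session variable instead of the URL,
    leaving Google no sort URLs to crawl from the pagination links."""
    query = dict(params or {})
    query.pop("sort_by", None)  # never leak the sort into crawlable links
    query["page"] = page
    return base + "?" + urlencode(query)
```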
Given your current situation, and that Google has indexed thousands of sort URLs (from what I can see), I think the Google Webmaster Tools approach might be the safest. This is a complex problem, though, and you may need to consult someone.
-
Hi,
In answer to your point on Question 2: currently the maximum number of pages we have is 4 plus a View All for a few of our products, but most products are split across 2 pages plus a View All.
For the largest product example, we have 83 products broken down as follows: pages 1 to 4 have 20 products each, page 5 has 3, and View All shows all 83 products. rel=prev and rel=next are on the pages, and View All has nothing on it (is that okay?). The title tags are duplicated across the numerous pages, so I was going to add "page 2", "page 3", "page 4" etc. to sort that out.
I was going to increase the number of products per page to 30, which would in effect take me down to 3 pages plus View All. More importantly, I thought I would also get stronger link value and less dilution, hence better SEO.
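For what it's worth, the page counts above work out as simple ceiling division (a quick Python sketch):

```python
import math

def page_count(total_products, per_page):
    """How many listing pages a category needs at a given page size."""
    return math.ceil(total_products / per_page)

# 83 products: 5 pages at 20 per page, but only 3 pages at 30 per page
```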
The pages don't rank particularly well at all, but on the Google speed test I think we score 85/100 anyway, so from a speed point of view it shouldn't be a problem. I was just worried that big changes like this could have a dramatic effect.
The URL, in case you're interested, is http://www.bestathire.co.uk/rent/Scaffold_towers/266
Many thanks
Very much appreciated.
Sarah.
-
Hi,
Many thanks for your reply.
We do have pagination and sorts, like listing products A-Z, Z-A, price low to high and high to low etc., which all generate different URLs, but we have put rules in the robots.txt file telling Google not to spider them. See below.
Also, from looking at WMT, it says it has blocked 886,996 URLs in the past 90 days. Our site has approx. 54,000 indexed pages.
Disallow: */sort_by:Product.price%20ASC
Disallow: */sort_by:Product.price%20DESC
Disallow: */sort_by:Product.title%20ASC
Disallow: */sort_by:Product.title%20DESC
Disallow: */sort_by:Product.distance%20ASC
Disallow: */sort_by:Product.distance%20DESC
Disallow: */stealth:on

Are you suggesting we canonical the sorts as well, for safety, in case we have missed anything?
Sarah
-
(1) DON'T canonical to the first page of results - Google definitely has issues with that. If you've got rel=prev/next in place, then I wouldn't canonical to "View All", either. They're kind of competing signals. You can use rel=prev/next with rel=canonical, but it's a bit complicated. Basically, it's for situations where you have pagination AND some other parameter, like a sort.
(2) If you increase it, just make sure it doesn't negatively impact users or load times (might be worth A/B testing, honestly). Are you saying that you might end up with a URL like "?page=7" which basically doesn't exist because you'll now have fewer pages? I think you might be safer just letting that 404 and having Google recrawl the new structure. The odds of having any inbound links to page 7 of search results are very low, and just letting those pages die off may be safer.
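The "let it 404" suggestion boils down to a simple range check when handling a paginated URL (a sketch, Python for illustration):

```python
def page_status(requested_page, total_pages):
    """HTTP status for a paginated listing URL: pages that no longer
    exist after the restructure (e.g. ?page=7 when only 3 pages remain)
    return 404 rather than being 301-redirected to page 1."""
    if 1 <= requested_page <= total_pages:
        return 200
    return 404
```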
-
To get something properly done on your website: if you're displaying a page with 20 products (by default) and it uses a complicated URL extension to reach the next page ( domain.com/?abc=123etc#321 ), that's a more significant problem you should be concerned about - far more than whether it's domain.com/category/page/1/ or page/2/.
In theory, page/1/ and page/2/ (blog-style) contain the same content as the home page (/1/ or /). A common practice is noindex,follow for any page [2-∞). You should definitely consider rel=canonical across the site, though - it's essential - as are rel="next" and rel="prev".