eCommerce site - duplicate pages problem
-
We have an eCommerce site with multiple products being displayed on a number of pages.
We use rel="next" and rel="prev" and have a display ALL which I understand Google should automatically be able to find.
-
Should we also be using a canonical tag to tell Google to give authority to the first page or to the Display All page, or is our current use of rel="next" and rel="prev" adequate?
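Roughly, the markup in question looks like this in the <head> of, say, page 2 of a category (URLs simplified to a hypothetical example); the canonical line is the part we are unsure about:

```html
<!-- Hypothetical <head> of page 2 of a paginated category (example URLs only) -->
<link rel="prev" href="https://www.example.com/category/widgets?page=1">
<link rel="next" href="https://www.example.com/category/widgets?page=3">

<!-- Only if we were to consolidate the series on the Display All page -->
<link rel="canonical" href="https://www.example.com/category/widgets/all">
```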
-
We currently display 20 products per page and are thinking of increasing this so that there are fewer, larger pages, which would make some of the later product pages redundant. If we add 301 redirects on the redundant pages, does anyone know what sort of impact this might have on traffic and SEO?
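To illustrate, the redirects we have in mind would be something like the sketch below — this assumes an Apache server with .htaccess and uses made-up paths, so it is only an illustration of the idea, not our actual configuration:

```apache
# Sketch only (hypothetical paths): after increasing products per page,
# send the now-redundant later pages back to the first page of the category.
Redirect 301 /category/widgets/page/4 /category/widgets/
Redirect 301 /category/widgets/page/5 /category/widgets/
```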
General thoughts are welcome if anyone has dealt with similar problems.
-
With pagination, if you have correctly implemented rel="next" and rel="prev", you should be fine. But one issue I have seen with my clients is that eCommerce sites usually have multiple ways to sort listing pages:
- highest price
- lowest price
- A-Z
- Z-A
- etc.
All of these sorted variants are duplicate content if you have not used rel="canonical" on them. Hope this helps you out.
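As a rough example (hypothetical URLs), each sorted variant can carry a canonical pointing back at the default, unsorted listing:

```html
<!-- Hypothetical sorted variant: https://www.example.com/category/widgets?sort=price-desc -->
<!-- Its <head> canonicalizes to the default, unsorted listing -->
<link rel="canonical" href="https://www.example.com/category/widgets">
```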
-
In answer to question 1: Google doesn't officially recommend it, so I wouldn't. (And I doubt that anyone with evidence of an added benefit would share that evidence.)
Related Questions
-
Why are my 301 redirects and duplicate pages (with canonicals) still showing up as duplicates in Webmaster Tools?
My guess is that in time Google will realize that my duplicate content is not actually duplicate content, but in the meantime I'd like to get your feedback. The reporting in Webmaster Tools looks something like this:
Duplicates: /url1.html, /url2.html, /url3.html, /category/product/url.html, /category2/product/url.html
url3.html is the true canonical page in the list above. url1.html and url2.html are old URLs that 301 to url3.html, so it seems my bases are covered there. /category/product/url.html and /category2/product/url.html do not redirect; they are the same page as url3.html, and each of the category URLs has a canonical URL of url3.html in the header. So it seems my bases are covered there as well. Can I expect Google to pick up on this? Why wouldn't it understand this already?
Technical SEO | bearpaw -
Advice on Duplicate Page Content
We have many pages on our website and they all have the same template (we use a CMS), so at the code level they are 90% the same. But the page content, title, meta description, and image used are different for all of them. For example:
http://www.jumpstart.com/common/find-easter-eggs
http://www.jumpstart.com/common/recognize-the-rs
We have many such pages. Does Google look at them all as duplicate page content? If yes, how do we deal with this?
Technical SEO | jsmoz -
Duplicate pages
Hi, can anyone tell me why SEOmoz thinks these pages are duplicates when they're clearly not? Thanks very much, Kate
http://www.katetooncopywriter.com.au/how-to-be-a-freelance-copywriter/picture-1-58/
http://www.katetooncopywriter.com.au/portfolio/clients/other/
http://www.katetooncopywriter.com.au/portfolio/clients/travel/
http://www.katetooncopywriter.com.au/webservices/what-i-do/blog-copywriter/
Technical SEO | ToonyWoony -
Site Map Problems or Are They?
According to Webmaster Tools, my sitemap contains URLs which are blocked by robots.txt. Our sitemap is generically generated and encompasses all web pages, whether or not I have excluded them using the robots.txt file. As far as I am aware this has never been an issue until recently. Is this hurting my rankings, and how do I fix it? Secondly, Webmaster Tools says there are over 5,000 errors/warnings on my sitemap, but the sitemap only has around 1,400 pages submitted. How do I see what is going on?
Technical SEO | Professor -
I am trying to correct an error report of duplicate page content, but across over 100 blogs I cannot find the page that contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content, but SEOmoz only reports and shows one of the duplicate pages. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | wianno168 -
How many pages should my site have?
Right now I think I only have 36. What is a good number of pages to have? Any ideas on ways to add relevant pages to my site? I was thinking about starting a message board. Also, I have a free tech support chat room and was thinking about posting the logs somewhere on the site. Does that sound like a good idea? Thanks.
Technical SEO | eugenecomputergeeks -
Home Page Indexing Question/Problem
Hello everyone. Background: I recently decided to change the preferred domain setting in Webmaster Tools from the non-www version of my site to the www version. I did this because there is a redirect from the non-www to the www, and I've built all of my internal links with the www. Everything I read on SEOmoz seemed to indicate that this was a good move. Traffic has been down/volatile, but I think that's attributable mostly to a recent site change/redesign; having said that, the preferred domain change did seem to drop traffic an additional notch. I made the move two weeks ago.
Here is the question: when I Google my site, the home page shows up as the site title without the custom title tags I've written, and the page that displays in the SERP is still the non-www version of the site. A site:www.mysite.com search shows an internal page first but doesn't return the home page as a result; all other pages show up indexed with the www version. A site:mysite.com search (note the lack of www) DOES show my home page and my custom title tags, but with the non-www version of the page; again, all other pages show up indexed with the www version. Anyone have thoughts on this? Is this a classic example of waiting on Google to catch up with the changes to my tiny little site?
Technical SEO | JSOC
Google has not indexed my site in over 4 weeks, what's the problem?
We recently put in permanent redirects to our new URL, but Google seems not to want to index it. There were no problems with the old URL, and the new URL is brand new, so it should have no 'black marks' against it. We have done everything we can think of: submitting sitemaps, telling Google our URL has changed in Webmaster Tools, mentioning the new URL on social sites, etc., but still nothing. It has been over 4 weeks now since we set up the redirects. Any ideas why Google seems to be choosing not to index it? Thanks
Technical SEO | cewe