NOINDEX,FOLLOW on product pages
-
Hi
Can I have people's thoughts on something, please? We sell wedding stationery, and whilst we can generate lots of good content describing a particular range of stationery, we can't realistically differentiate at a product level. So imagine we have three ranges:
Range 1 - A Bird
Range 2 - A Heart
Range 3 - A Flower
Within each of these ranges we would have invitations, menus, place cards, magnets, etc. The ranges vary quite a lot, so we can write good, keyword-rich descriptions that attract traffic (i.e. one about the bird, one about the heart and one about the flower). However, the individual products within a range just reflect the design of the range as a whole (as all items in a range match). Therefore we can't simply copy the content down to the product level, and if we just describe the generic attributes of the products they will all be very similar. We easily have over 1,000 "products", so I am conscious of creating too much duplication across the site in case Mr Panda comes to call.
So I was thinking that I "might" NOINDEX, FOLLOW the product pages to avoid this duplication and put lots of effort into making my category pages much better and more content-rich. The site would be smaller in the index, BUT I do not really expect to generate traffic from the product pages anyway, because they are not branded items, and any searches looking for particular features of our stationery would be picked up, much more effectively, by the category pages.
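To be concrete, all I mean is adding the standard robots meta tag to the product page templates. A rough sketch only (exactly where it goes would depend on our platform):

<!-- Sketch: placed in the <head> of every product page template. "noindex" keeps the page out of the index; "follow" still lets crawlers follow the links on the page. -->
<meta name="robots" content="noindex, follow">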
Any thoughts on this one?
Gary
-
Thanks for helping me bounce the ideas around. Always valuable comments from SeoMozzers! Have a good day!
-
Yes, that's a very good idea. It's a much stronger signal in its execution than the noindex/follow method, given my concerns.
-
OK, thanks for your detailed response.
I am wondering whether I might just have a dynamically generated URL for the "product level" pages, i.e. page.php?id=1, page.php?id=2, etc., and then have a canonical tag that is the same across them all. I could then limit the indexable product pages to those that are genuinely different in some way. That way, I can avoid the noindex,follow issue, have very few duplicate product pages and avoid Panda-related issues. Sound sensible?
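To illustrate the idea: page.php?id=1, page.php?id=2 and so on would all carry the same canonical, pointing back at the range page. A rough sketch (the range URL below is just an illustrative placeholder, not a real one):

<!-- Sketch only: the same tag in the <head> of every dynamic product URL within a range, pointing at that range's page. -->
<link rel="canonical" href="http://www.example.com/ranges/a-bird/">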
Gary
-
The issue with a high volume of noindex,follow pages is understanding intent, and potentially having the message be confused. Having x pages with noindex,follow says: "we don't think these pages are important enough to index, but the links on them are." Except if those links exist elsewhere, on enough indexed pages, what's the point being made?
Is it an attempt to artificially boost the signals for the pages being linked to, by saying "look at all these extra links we have pointing to these other pages"? That's the concern, especially since the implementation of over-optimization factors in the algorithms. While it may not be Google's intent to devalue a site due to innocent behavior, their ability to understand intent algorithmically is limited.
Over the past year and a half I've seen more and more situations where Google's many layers of algorithmic decisions have resulted in client sites suffering because of a lack of human review that can determine "this was not an intentional attempt to over-optimize". I've seen it with internal linking, I've seen it when use of noindex/follow conflicts with canonical signals, and I've seen it where either of those conflicts with robots.txt instructions.
While no single case is guaranteed to be problematic (given the hundreds of factors being evaluated across multiple algorithms), at the same time, as a professional audit consultant I am not comfortable ignoring the fact that no single case is guaranteed to be safe either. Thus, my opinion of "best practices" is to "avoid potentially significant problems".
-
Alan, thanks for this. Can I check that I understand your comments? Are you suggesting that a large number of noindex,follow pages causes Google to lose interest in following the links from those pages? Do you know this to be the case through an empirical study? I like your suggestion of integrating the product purchase onto the category pages. I agree that would be ideal, but the products themselves have a lot of options and some are designed online, so it could end up quite complex. Food for thought though, as it would be a good solution SEO-wise. I'm just a little concerned on the usability front. Gary
-
One alternative method would be to integrate the products onto their individual "range" pages, with purchase capability and options right there. You'd need to ensure the overwhelming majority of each of those pages is still unique; however, it would avoid the potential confusion that comes from "noindex,follow" being used on a massive scale, which can itself be problematic. (Google needs to understand WHY there are so many noindex pages, and what unique links exist on those pages that you want the crawler to follow them for.)
-
Sounds like it makes sense and that you have thought it out. If the category pages are conversion-friendly, it sounds like it can be done. But if there is a way you can get the product pages to have unique content, I would personally prefer the product pages to rank. By doing what you are suggesting you're putting the point of purchase a click further away, which isn't the end of the world.
Related Questions
-
Thousands of 404 pages, duplicate content pages, and temporary redirects
Hi, I took over the SEO of a quite large e-commerce site. After checking crawl issues, there seem to be 3,000+ 4xx client errors, 3,000+ duplicate content issues and 35,000+ temporary redirects. I'm quite desperate about these results. What would be the most effective way to handle them? It's a Magento shop. I'm grateful for any kind of help! Thx,
Technical SEO | posthumusboris
-
Duplicate Page Content
Hello, after crawling our site Moz is detecting high-priority duplicate page content for our product and article listing pages. For example, http://store.bmiresearch.com/bangladesh/power and http://store.bmiresearch.com/newzealand/power are being listed as duplicate pages although they have separate URLs, page titles and H1 tags. They have the same products listed, but I would have thought the differentiation in other areas would be sufficient for these not to be deemed duplicate pages. Is it likely this issue will be impacting our search rankings? If so, are there any recommendations as to how this issue can be overcome? Thanks
Technical SEO | carlsutherland
-
Amazon Product Descriptions and our website's product descriptions
I am updating our product descriptions site-wide. I wanted to also update our Amazon listings for those same products. Is that considered duplicate content if it appears on both Amazon and our site? Is there any reason why I wouldn't want to do that? Are Google product ads also a problem?
Technical SEO | EcomLkwd
-
Container Page/Content Page Duplicate Content
My client has a container page on their website (they are using SiteFinity, so it is called a "group page") in which individual pages appear and can be scrolled through. When links are followed, they first lead to the group page URL, where the first content page is shown. However, when navigating through the content pages, the URL changes. When navigating BACK to the first content page, the URL is the one for that content page, but it appears to indexers as a duplicate of the group page, that is, the URL that appeared when first linking to the group page. The client updates this regularly, so I need to find a solution that will allow them to add more pages, with the new one always becoming the top page, without requiring extra coding. For instance, I had considered integrating REL=NEXT and REL=PREV, but they aren't going to keep that up to date.
Technical SEO | SpokeHQ
-
Too many on-page links
Hi all, as we all know, having too many links on a page is an obstacle for search engine crawlers in terms of the crawl allowance. My category pages are labeled as pages with too many on-page links by the SEOmoz crawler. This probably comes from the fact that each product on the category page has multiple links (on the image and the model number). Now my question is: would it help to set up a single text link with a clickable area as big as the product area? This means every product gets just one link. Would this help get the crawlers deeper into these pages and distribute the link juice better? Or is Google already smart enough to figure out that two links to the same product page shouldn't be counted as two? Thanks for your replies guys. Rich
Technical SEO | Horlogeboetiek
-
Page not being indexed
Hi all, on our site we have a lot of bookmaker reviews, and we are ranking pretty well for most bookmaker names as keywords; however, a single bookmaker seems to have been shunned by Google. For a search for "betsafe" in Denmark, this page does not appear among the top 50: http://www.betxpert.com/bookmakere/betsafe All of our other review pages rank in the top 10-20 for the bookmaker name as a keyword. What to do if Google has "banned" a page? Best regards, Rasmus
Technical SEO | rasmusbang
-
Duplicate Page Issue
Dear all, I am facing a stupid duplicate page issue. My whole site is built on dynamic scripts and all the URLs were dynamic, so I asked my programmer to make the URLs user-friendly using URL rewriting, but he converted the .aspx pages to .htm, and the whole mess began. Now we have 3 different URLs for a single page, such as: http://www.site.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=Multi-Day+City+Tours http://www.tsite.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=multi-day-city-tours http://www.site.com/city-tour/multi-day-city-tours/page4-0.htm I think my programmer messed up the URL rewriting in ASP.NET (Nginx), or didn't even use it. So how do I overcome this problem? Should I add a canonical tag to both dynamic URLs pointing to page4-0.htm? Will it help? Thanks!
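(Roughly, what I have in mind is the same tag in the <head> of both dynamic URLs, pointing at the rewritten .htm version; a sketch below.)

<!-- Sketch of the canonical tag described above, to go on both dynamic URLs. -->
<link rel="canonical" href="http://www.site.com/city-tour/multi-day-city-tours/page4-0.htm">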
Technical SEO | DigitalJungle
-
Link juice distributed to too many pages. Will noindex,follow fix this?
We have an e-commerce store with around 4,000 product pages. Although our domain authority is not very high (we launched our site in February and now have around 30 RDs), we did rank on lots of long-tail terms and generated around 8,000 organic visits per month. Two weeks ago we added another 2,000 products to our existing catalogue of 2,000 products, and since then our organic traffic has dropped significantly (more than 50%). My guess is that link juice has been distributed to too many pages, causing rankings to drop overall. I'm thinking about noindexing 50% of the product pages (the ones not receiving any organic traffic). However, I am not sure whether this will lead to more link juice for the remaining 50% of the product pages or not. So my question is: if I noindex,follow page A, will 100% of the link juice go to page B INSTEAD of page A, or will just a part of the link juice flow to page B (after flowing through page A first)? Hope my question is clear 🙂 P.S. We have a Dutch store, so the traffic drop is not a Panda issue 🙂
Technical SEO | DeptAgency