Risk Using "Nofollow" tag
-
I have a lot of categories (like an e-commerce site), and many of them run from page 1 to page 50 per category (a view-all page is not possible). Much of the content on these pages is also present on other websites across the web (duplicate material). I have added quality, unique content to page 1, added "noindex, follow" to pages 2-50, and added rel=next/prev tags to the series.
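For reference, a minimal sketch of what that setup would look like in the head of page 2 of a series (the domain and URL pattern here are placeholders, not the actual site structure):

  <!-- Page 2 of the paginated series: kept out of the index, but its links remain crawlable -->
  <meta name="robots" content="noindex, follow">
  <link rel="prev" href="https://www.example.com/widgets/?page=1">
  <link rel="next" href="https://www.example.com/widgets/?page=3">

Page 1 itself would simply omit the robots meta tag (or use "index, follow") and carry only the rel="next" pointer.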
Questions:
-
By including the "follow" part, Google will read content and links on pages 2-50 and they may think "we have seen this stuff across the web….low quality content and though we see a noindex tag, we will consider even page 1 thin content, because we are able to read pages 2-50 and see the thin content." So even though I have "noindex, follow" the 'follow' part causes the issue (in that Google feels it is a lot of low quality content) - is this possible and if I had added "nofollow" instead that may solve the issue and page 1 would increase chance of looking more unique?
-
Why don't I add "noindex, nofollow" to pages 2-50? That way I ensure Google does not read the content on pages 2-50, and my site may come across as more unique than it would with the "follow" directive. I do understand that in that case (with nofollow on pages 2-50) no link juice flows from pages 2-50 back to the main pages (assuming there are breadcrumbs or other links to the indexed pages), but I consider that of minimal value from an SEO perspective.
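For comparison, the alternative described here would look like this on pages 2-50 (again only a sketch):

  <!-- Pages 2-50 under this alternative: not indexed, and Google is asked not to follow any links on the page -->
  <meta name="robots" content="noindex, nofollow">

The only difference from the earlier snippet is the nofollow directive, which asks Google not to follow, or pass value through, the links it finds on the page.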
-
I have heard that using "follow" is generally lower risk than "nofollow". Does this mean a website with a lot of "noindex, nofollow" tags may hurt its indexed pages because it comes across as a site Google can't trust, given that 95% of its pages carry such a "noindex, nofollow" tag? I would like to understand what the risk factors may be.
thank you very much
-
-
Thanks, Alan. Within real estate MLS: if I index all "MLS result pages" (e.g. http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/), I will have about 5,000 such result pages (that is, 5,000 category pages, with each category often running to more than one page). I have added unique, quality content to page 1 of about 300 of these MLS result pages and added rel=next/prev. On the other 4,700 pages I currently have "noindex, follow".
Question: is it OK to have such a large number of pages with "noindex, follow", or do I run the risk that Google thinks, "Hmm, even though we do not index them, there seems to be a lot of low-value content on this website; let us lower the ranking even for the quality pages"? Would I simply be better off letting everything be indexed? I am concerned that if I let those pages be indexed, they will dilute the value of my high-quality pages. I suspect that completely deleting those low-relevancy pages from my website would be ideal (so Google sees the site's value), but users looking to buy real estate would then not see as many listings as on other websites, and that could be a concern.
Any insight appreciated. thx
-
If you use nofollow, then every link pointing to those pages will throw away its link juice; you don't want that.
Follow means that link juice will flow through the links back to your indexed pages. Telling Google not to index those pages is doing it a favour, as it doesn't want duplicates, so I don't think there is any concern.
-
It is possible it could be seen that way, yes, but it's generally unlikely. Before you got too far into nofollowing links and the like, I wanted to make you aware of it.
With the tag you are, in effect, saying: "these pages are all very similar; this is the first one and this is the last one." Google is pretty clever, and most people don't give it credit. If your site is about real estate, it will know your listings may be seen elsewhere. For example, in the UK we have Rightmove and Zoopla; they both list properties from elsewhere, but they also have value in other aspects of their sites, which is why they work. So as long as your site is not just about the pages that are duplicates, and you offer worthy content in other areas, you should generally be fine. Make the site really helpful for the user and the rest largely falls into place. You can also take the time to look at how those sites have solved the same problem.
Regarding the 3,000 pages: if you can get some unique content on there, fantastic, but I know it's not always easy. Your original question was about the risk of nofollow; there is no risk with it, so the noindex tag is really your choice. You can probably leave it on, but you may risk not being all you can be. I would suggest taking a look at your competitors and other similar sites to get an idea of what they do in the same situation.
You might find this answer helpful, which is on the same subject: http://moz.com/community/q/real-estate-mls-listings-does-google-consider-duplicate-content
-
http://www.honoluluhi5.com/moana-pacific-i-2901-kakaako-condo-for-sale-201417440/ - I have 3,000+ such property pages, which are shared among real estate firms across the web. Currently they have "noindex, follow". Would you remove that tag and just let the pages be indexed?
-
I am using rel=next/prev. So maybe I should just drop the "noindex, follow" part, though many experts recommend using that tag. However, the issue with these approaches (rel=next/prev or "noindex, follow") is that Google will still read the pages and may think, "Hmm, we've seen these real estate listings on many other sites, so we consider this low-quality content."
But are you saying not to use noindex-type tags because they could be interpreted as sculpting?
-
You want to use the pagination tags, like the canonical tag: they let you index the pages (sort of) while avoiding duplicate content. Noindexing a site is a bit of a waste of SEO effort when there are other solutions, so I'd leave that as a last-ditch effort. If you have unique content on the pages, that's better than none (even if it sits low on the page).
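As a rough illustration of the pagination-plus-canonical approach mentioned here (a sketch only; the domain is a placeholder and your CMS will dictate the exact markup):

  <!-- Page 2 of a paginated MLS result series, left indexable -->
  <link rel="canonical" href="https://www.example.com/waikiki-condos/?page=2">
  <link rel="prev" href="https://www.example.com/waikiki-condos/?page=1">
  <link rel="next" href="https://www.example.com/waikiki-condos/?page=3">

The canonical here points to the page's own clean URL (not back to page 1), so parameter variants consolidate without hiding pages 2-n, and rel=prev/next signals that the pages form one series.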
What you don't want to do is make it look like you're trying to manipulate your link juice / PageRank internally too much.
-
ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you scroll down you will see a lot of high-quality, unique content, including aerial photos that belong to my company. I have 300+ pages like that - unique and very high quality. I am in the process of reducing the size of the map by 75% and moving the unique content much higher up the page, since I fear the unique content sits too low on the page and that could impact ranking.
Also, I currently have "noindex, follow" on pages 2 to n, since all those real estate listings are duplicate content shared across 100+ real estate companies on the web. I am thinking maybe I should switch those pages 2 to n to "noindex, nofollow" so Google does not waste time reading them.
Any thoughts highly appreciated... thanks very much
-
I think you've got a bit lost there. Once you add noindex, it makes no difference whether you add nofollow or not. Even if you have bad content, by noindexing most of your site it's almost as if you have a one-page site. I really recommend taking the time to write some content; it pays off down the line and doesn't take as long as you think.
Matt Cutts has said most of the internet is duplicate content, so don't over-analyse it too much. Links etc. can make a fairly large impact; as long as the bulk of your website is unique and authoritative, you will be on a good road.
-
Noindex and nofollow are nearly the same thing (okay, take that comment with a heap of salt).
-
Link juice would matter, as Google is ignoring that part of your site because you've told it not to index it, so any link juice going that way is just going into a black hole.
-
I think you heard wrong. Nofollow is safer than follow because it's like saying "I don't endorse this link", so it doesn't transfer link juice, which reduces any risk. But remember that trying to manipulate link juice on your site is a risky game, and most of the time you will come off worse than if you had just written some content for the products.
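For clarity, "nofollow" can be applied at two levels, and the thread mixes them a little; both forms are sketched here with placeholder URLs:

  <!-- Page-level: applies to every link on the page -->
  <meta name="robots" content="noindex, nofollow">

  <!-- Link-level: applies only to this one link ("I don't endorse this link") -->
  <a href="https://www.example.com/some-listing/" rel="nofollow">Example listing</a>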
I would take a look over here if you need more reasons not to: https://www.mattcutts.com/blog/pagerank-sculpting/
"Q: Does this mean “PageRank sculpting” (trying to change how PageRank flows within your site using e.g. nofollow) is a bad idea?
A: I wouldn’t recommend it"
-