Risk of Using the "Nofollow" Tag
-
I have a lot of categories (like an e-commerce site), and many have pages 1–50 per category (a view-all page is not possible). Much of the content on these pages is present on other websites across the web (duplicate content). I have added quality unique content to page 1, added "noindex, follow" to pages 2–50, and added rel=next/prev tags to the pages.
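For reference, a minimal sketch of the setup being described (the example.com URLs are placeholders, not the real site structure): page 1 carries the unique content and only the pagination tag, while pages 2–50 add the robots meta tag alongside the pagination tags:

```html
<!-- Page 1: indexable, carries the unique content -->
<link rel="next" href="https://example.com/category/?page=2">

<!-- Pages 2-50: crawlable but kept out of the index -->
<meta name="robots" content="noindex, follow">
<link rel="prev" href="https://example.com/category/?page=1">
<link rel="next" href="https://example.com/category/?page=3">
```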
Questions:
-
By including the "follow" part, Google will read the content and links on pages 2–50 and may think: "We have seen this stuff across the web. It is low-quality content, and though we see a noindex tag, we will consider even page 1 thin content, because we can read pages 2–50 and see the thin content there." So even though I have "noindex, follow", the "follow" part causes the issue (Google concludes the site has a lot of low-quality content). Is this possible, and would adding "nofollow" instead solve the issue and give page 1 a better chance of looking unique?
-
Why not add "noindex, nofollow" to pages 2–50? That way I ensure Google does not read the content on pages 2–50, and my site may come across as more unique than with the "follow" tag. I understand that in that case (with nofollow on pages 2–50) no link juice flows from pages 2–50 back to the main pages (assuming there are breadcrumbs or other links to the indexed pages), but I consider that of minimal value from an SEO perspective.
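The alternative being proposed here changes only the robots directive on the paginated pages (again, just a sketch):

```html
<!-- Proposed alternative for pages 2-50:
     block indexing AND tell Google not to follow any links on the page -->
<meta name="robots" content="noindex, nofollow">
```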
-
I have heard that using "follow" is generally lower risk than "nofollow". Does this mean a website with a lot of "noindex, nofollow" tags may hurt its indexed pages because it comes across as a site Google can't trust, given that 95% of its pages carry such a "noindex, nofollow" tag? I would like to understand what the risk factors are.
thank you very much
-
-
Thanks, Alan. Within real estate MLS: if I index all "MLS result pages" (e.g. http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/), I will have about 5,000 such MLS result pages (I mean 5,000 category pages, with each category often having more than one page). I have added unique, quality content on page 1 of about 300 of these MLS result pages and added rel=next/prev. The other 4,700 pages currently have "noindex, follow".
Question: is it OK to have such a large number of pages with "noindex, follow", or do I run the risk Google thinks, "Hmm, though we do not index them, there seems to be a lot of junk on this website; let us lower the ranking even for the quality pages"? Would I simply be better off letting everything index? I am concerned that if I let those pages index, they will dilute the value of my high-quality pages. I am thinking it might be ideal to completely delete those low-relevancy pages from my website (so Google sees my site's value), but then users looking to buy real estate would not see as many listings as on other websites, and that could be a concern.
Any insight appreciated. thx
-
If you use nofollow, every link pointing to those pages will throw away its link juice; you don't want that.
Follow means that link juice will flow through the links back to your indexed pages. Telling Google not to index is doing them a favour, as they don't want duplicates, so I don't think there's any concern.
-
It is possible it could be seen that way, yes, but it's generally unlikely. Before you get too far into nofollowing links etc., I wanted to make you aware of it.
With the tag, what you're essentially saying is: "these pages are all very similar; this is the first one and this is the last one." Google is pretty clever, and most people don't give it credit. If your site is about real estate etc., it will know your listings may be seen elsewhere. For example, in the UK we have Rightmove and Zoopla: they both list properties from elsewhere, but they also offer value in other aspects of their sites, which is why they work. So as long as your site is not just the duplicate pages and you provide worthy content in other areas, you should generally be fine. Make the site really helpful for the user and the rest mostly falls into place. You can also take the time to look at how those sites have solved the same problem.
Regarding the 3,000 pages: if you can get some unique content on them, fantastic, but I know it's not always easy. Your original question was about the risk of nofollow; there is no risk with it, so the noindex tag is really your choice. You can leave it on, but you may risk not being all you can be. I would suggest taking a look at your competitors and other similar sites to get an idea of what they do in the same situation.
you might find this answer helpful which is on the same subject - http://moz.com/community/q/real-estate-mls-listings-does-google-consider-duplicate-content
-
http://www.honoluluhi5.com/moana-pacific-i-2901-kakaako-condo-for-sale-201417440/ - I have 3,000+ such property pages, which are shared among real estate firms across the web. Currently they have "noindex, follow". Would you remove that tag and just let the pages index?
-
I am using rel=next/prev. So maybe I should just drop the "noindex, follow" part, though many experts recommend using that tag. However, the issue with either approach (rel=next/prev or "noindex, follow") is that Google will read the pages and may think, "Hmm, we've seen these real estate listings on many other sites, so we consider this low-quality content..."
But you are saying don't use noindex type tags as it could be interpreted as sculpting?
-
You want to use the pagination tag; like the canonical tag, it lets you index the pages (sort of) while avoiding duplicate content. Noindexing a site is a bit of a waste of SEO effort when there are other solutions, so I'd leave that as a last-ditch effort. If you have unique content on the pages, that's better than none (even if it's low on the page).
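As a sketch of the two tags mentioned above (placeholder URLs), here is what the `<head>` of page 2 of a paginated series might contain. Note that with pagination, a page's canonical usually points to itself (or to a view-all page), not back to page 1:

```html
<!-- Pagination tags: mark this page as part of a series -->
<link rel="prev" href="https://example.com/listings/?page=1">
<link rel="next" href="https://example.com/listings/?page=3">

<!-- Canonical tag: consolidate duplicate URL variants (e.g. tracking
     parameters) onto this page's preferred URL -->
<link rel="canonical" href="https://example.com/listings/?page=2">
```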
What you don't want to do is make it look like you're trying to manipulate your link juice / PageRank internally too much.
-
ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you scroll down you will see a lot of high-quality, unique content, including aerial photos that belong to my company. I have 300+ pages like that: unique and very high quality. I am in the process of reducing the size of the map by 75% and moving the unique content much higher on the page, since I fear the unique content is placed too low on the page and that could impact ranking.
Also, I currently have "noindex, follow" on pages 2 to n, since all those real estate listings are duplicate content shared across 100+ real estate companies on the web. I am thinking maybe I should change pages 2 to n to "noindex, nofollow" so Google does not waste time reading those pages.
Any thoughts highly appreciated... thanks very much
-
I think you've got a bit lost there. Once you add the noindex tag, it makes no difference whether you add nofollow or not. Even if you have bad content, by noindexing most of your site it's almost as if you have a one-page site. I really recommend taking the time to write some content; it pays off down the line and doesn't take as long as you think.
Matt Cutts has said most of the internet is duplicate content, so don't over-analyze it. Links etc. can make a fairly large impact, and as long as the bulk of your website is unique and authoritative you will be on a good road.
-
Noindex and nofollow are nearly the same thing (okay, take that comment with a heap of salt).
-
Link juice would matter here: Google is ignoring that part of your site because you've told it not to index it, so any link juice going that way is just going into a black hole.
-
I think you heard wrong: nofollow is "safer" than follow because it is like saying "I don't endorse this link", so it doesn't transfer link juice, which reduces certain risks. But remember that trying to manipulate link juice on your own site is a risky game, and most of the time you will come off worse than if you had just written some content for the products.
I would take a look over here if you needed more reasons not to - https://www.mattcutts.com/blog/pagerank-sculpting/
"Q: Does this mean “PageRank sculpting” (trying to change how PageRank flows within your site using e.g. nofollow) is a bad idea?
A: I wouldn’t recommend it" -