Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Does having a lot of pages with noindex and nofollow tags affect rankings?
-
We are an e-commerce marketplace for alternative fashion and home decor, with over 1,000 stores on the platform. Earlier this year, in March 2018, we switched the website from HTTP to HTTPS and also added noindex and nofollow tags to the store about pages and store policy pages (mostly boilerplate content).
Our traffic dropped by 45%, and we have not recovered since. We have done
I am wondering: could these tags be affecting our rankings?
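For reference, a spot check like the following shows which robots directives each page type currently carries. This is only a rough sketch: the store paths are hypothetical, and it assumes the tags were added as a robots meta tag rather than an X-Robots-Tag header.

```python
# Spot-check the robots meta directive on a few representative page types.
# The store paths below are hypothetical placeholders.
import re
import requests

pages = {
    "store about page": "https://www.rebelsmarket.com/example-store/about",
    "store policies page": "https://www.rebelsmarket.com/example-store/policies",
    "category page": "https://www.rebelsmarket.com/clothing",
}

# Rough pattern; a proper HTML parser is more reliable for a full audit.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', re.I
)

for label, url in pages.items():
    resp = requests.get(url, timeout=10)
    match = meta_robots.search(resp.text)
    directives = match.group(1).lower() if match else "(no robots meta tag found)"
    print(f"{label}: HTTP {resp.status_code}, robots: {directives}")
```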
-
Hi Gaston
Thank you for the detailed response and suggestions. I will follow up with my findings. Points 3 and 4: I think there is something there.
James
-
Hi James,
Great that you've checked those items and found no errors.
I'll break my response into numbered points so it's easier to respond to each one:

1. It bugs me that the traffic loss occurred in the same month as the HTTPS redirection. That strongly suggests you've killed, redirected, or noindexed some pages that drove a lot of traffic.
2. It's also possible that you didn't deserve that much traffic in the first place, either because you ranked for searches you weren't relevant to or because Google didn't fully understand your site. That often happens when a migration takes place, as Google needs to recalculate and fully understand the new site.
3. If you still have the old HTTP Search Console property, I'd check as many keywords as possible (in some scalable way), trying to find which ones have fallen in the rankings.
4. When checking those keywords, compare the URLs that were ranking; there could be some changes there.
5. Lastly, have you made sure there aren't any indexation and/or crawlability issues? Check the raw number of indexable URLs and compare it with the number that Search Console shows in the index coverage report; a rough way to get the crawl-side count is sketched below.
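On point 5, something along these lines would give the crawl-side number. A minimal sketch, assuming you can export a complete URL list from your sitemaps or platform; "all_urls.txt" is only a placeholder file name.

```python
# Count how many known URLs are actually indexable (HTTP 200 and no noindex),
# then compare the figure with the valid-URL count in Search Console's
# index coverage report. "all_urls.txt" is a placeholder for a URL export.
import requests
from bs4 import BeautifulSoup

with open("all_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

indexable = 0
for url in urls:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    if resp.status_code != 200:
        continue  # redirects and errors are not indexable as-is
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    content = (meta.get("content", "") if meta else "").lower()
    header = resp.headers.get("X-Robots-Tag", "").lower()
    if "noindex" not in content and "noindex" not in header:
        indexable += 1

print(f"Indexable URLs found by crawling: {indexable} of {len(urls)}")
```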
Best wishes.
GR
-
Hi Gaston
Thank you for sharing your insights.
1. I have looked through all the pages and made sure we have not noindexed important pages
2. The migration went well; no double redirects or duplicate content.
3. I looked through Google Search Console and fixed all the errors (mostly complaints about 404s caused by products that are out of stock or from vendors who have left the marketplace).
4. A friend said he thinks our pages are over-optimized, and that could be the reason. We went ahead and tweaked all the pages that were driving traffic, but saw no change.
If you have a moment, here is our website: www.rebelsmarket.com. If anything stands out, please let me know. I appreciate your help.
James
-
Hi Joe
We have applied all the redirects carefully and tested them to make sure. We have no duplicate content.
The URL: www.rebelsmarket.com
Redirect to SSL: March 2018 (we started with the blog and then moved to the product pages)
We added the noindex and nofollow tags at the same time.
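For example, a check along these lines would confirm that each old HTTP URL returns a single 301 hop straight to its HTTPS equivalent, with no chains or 302s. A rough sketch; the paths are only placeholders.

```python
# Verify that old HTTP URLs 301-redirect in a single hop to their HTTPS
# counterparts. The URL list is a placeholder; feed it real paths.
import requests

old_urls = [
    "http://www.rebelsmarket.com/",
    "http://www.rebelsmarket.com/some-category",  # hypothetical path
]

for url in old_urls:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = [(r.status_code, r.headers.get("Location", "")) for r in resp.history]
    single_301 = (
        len(hops) == 1
        and hops[0][0] == 301
        and resp.url == url.replace("http://", "https://", 1)
    )
    print(f"{url} -> {resp.url}  hops={hops}  single 301: {single_301}")
```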
Thank you
James
-
Hi John
Sorry, I have been tied up with my travel schedule. Here is the website: www.rebelsmarket.com
Thank you for your help, John.
-
Hi James,
Your issues lie elsewhere. Did anything else happen during the update? My first thought is that the redirects were incorrectly applied.
- What's the URL?
- When was the HTTP > HTTPS redirect installed, and how?
- When were the noindex and nofollow tags added?
You're only a month in, so you should be able to recover. Sharing the URL would be useful if you need any further assistance.
-
Hey James - would you be comfortable sharing the URL? I can run some diagnostics on it to see what other issues could be the cause of the drop.
Thanks!
John
-
Hi James,
I'm sorry to hear that you've lost over 45% of your traffic.
Absolutely not: having a lot of noindexed and nofollowed pages won't affect your rankings or your SEO strength.
On the other hand, a traffic drop could be related to many issues, among them:
- Algorithm changes; there has been a lot of movement this year
- You've noindexed some of your high-traffic pages
- Some part of the migration went wrong
- And the list could be endless.
I'd start by checking Search Console; there you can spot which keywords and/or URLs are no longer ranking as high.
This sort of tutorial on analyzing a traffic drop might come in handy: How to Diagnose SEO Traffic Drops: 11 Questions to Answer - Moz Blog
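If you export the Search Analytics performance data for a window before the drop and one after, a quick comparison shows which queries lost the most clicks. A rough sketch, assuming two CSV exports with "Query" and "Clicks" columns (the exact file and column names depend on the export):

```python
# Compare two Search Console performance exports (before vs. after the drop)
# to find the queries that lost the most clicks. File and column names are
# assumptions based on a typical CSV export.
import pandas as pd

before = pd.read_csv("queries_feb_2018.csv")  # pre-drop window
after = pd.read_csv("queries_apr_2018.csv")   # post-drop window

merged = before.merge(after, on="Query", how="outer",
                      suffixes=("_before", "_after")).fillna(0)
merged["clicks_lost"] = merged["Clicks_before"] - merged["Clicks_after"]

# The biggest losers point at the pages and sections to investigate first.
print(merged.sort_values("clicks_lost", ascending=False).head(20))
```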
Hope it helps.
Best of luck.
GR
Related Questions
-
Will Reduced Bounce Rate, Increased Pages/Session, and Increased Session Duration Result in Better Ranking?
Our relaunched website has a much lower bounce rate (66% before, now 58%), increased pages per session (1.89 before, now 3.47), and increased session duration (1:33 before, now 3:47). The relaunch was December 20th. Should these improvements result in an improvement in Google rank? How about in Moz authority? We have not significantly changed the content of the site, but the UX has been greatly improved. Thanks, Alan
Intermediate & Advanced SEO | | Kingalan11 -
Ranking 1st for a keyword - but when 's' is added to the end we are ranking on the second page
Hi everyone, hope you are well. I can't get my head around why we are ranking 1st for a specific keyword, but when an 's' is added to the end of the keyword we are ranking on the second page. What could be the cause of this? I thought Google would class both keywords the same. In this case, let's say the keyword was 'button': we are ranking 1st for 'button', but for 'buttons' we are on the second page. Any ideas? I appreciate every comment.
Intermediate & Advanced SEO | | Brett-S0 -
Does sharing the same business name affect Google ranking?
Hey guys, we have been working for a client offering graphic design work for almost 2 months. It is a new business; let's say the business name is ABC Graphic Design. So far all the pages are indexed, and we built natural links through local directories, blog posts on relevant niche blogs, and social media. We optimised the content and meta tags like we always do. However, none of the target keywords appear in the first 10 pages. This is quite odd, considering we had a client in the same business and managed to show some progress in the first 2 months. We did some research and noticed that there are 2 ABC design websites with similar domain names offering the same services. They have nothing to do with my client and they are located overseas. When I search ABC Graphic Design, the results show other companies instead of my client. My question is whether having a similar business name would affect the ranking. Obviously the other 2 websites have a longer history and better rankings. Any suggestions?
Intermediate & Advanced SEO | | owengna0 -
Adding hreflang tags - better on each page, or the site map?
Hello, I am wondering if there is a preference for where to add hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to the page, but I am also going to work with the developer to add hreflang tags. My question is: is it better to add them to the head of each page, or the sitemap, or both, or something else? Any other thoughts are appreciated. Our Australia site, which was at least findable on Google Australia before this relaunch, is not showing up, even when you search the company name directly. Thanks! Lauryn
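For context on the two options, hreflang can live either in each page's head or in the XML sitemap, and both express the same reciprocal mapping. A hypothetical sketch that generates the head-tag version from a locale-to-URL mapping (the URLs and locales are made up):

```python
# Build <link rel="alternate" hreflang="..."> tags for one page from a
# locale -> URL mapping. Every language version should list all versions,
# itself included, plus an x-default. URLs and locales are hypothetical.
alternates = {
    "en-au": "https://www.example.com.au/widgets/",
    "en-us": "https://www.example.com/widgets/",
    "x-default": "https://www.example.com/widgets/",
}

def hreflang_tags(mapping):
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in mapping.items()
    )

# The same block belongs in the <head> of every URL listed above.
print(hreflang_tags(alternates))
```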
Intermediate & Advanced SEO | | john_marketade0 -
Noindex: Do Follow or No Follow Tags?
Hello, I have a website with tags (which have the noindex tag) on each article post. I've been told that I should noindex/nofollow these tag pages, because they are getting link juice passed to them, and since they aren't getting indexed, it's wasting link juice on those pages when it could be passed to a page that is actually getting indexed. What are your thoughts on this? Also, what would be the point of noindex/follow on a page, if you are noindexing that page? Isn't it just wasting link juice? What is the proper SEO way to optimize tags?
Intermediate & Advanced SEO | | WebServiceConsulting.com0 -
Meta NoIndex tag and Robots Disallow
Hi all, I hope you can spend some time to answer my first of a few questions 🙂 We are running a Magento site, and the layered/faceted navigation nightmare has created thousands of duplicate URLs! During my process of tackling the issue, I disallowed in robots.txt anything in the query string that was not a "p" (allowed for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html search and a few duplicates came up along with the original, showing "There is no information about this page because it is blocked by robots.txt". So I had also added meta noindex, follow on all these duplicates, but I guess it wasn't being read because of robots.txt. So, coming to my question: did robots.txt block access to these pages? If so, were these already in the index, and after disallowing them in robots.txt, could Googlebot no longer read the meta noindex? Does meta noindex, follow on pages actually help Googlebot decide to remove those pages from the index? I thought robots.txt would stop and prevent indexation, but I've read this: "Noindex is a funny thing, it actually doesn't mean 'You can't index this', it means 'You can't show this in search results'. Robots.txt disallow means 'You can't index this' but it doesn't mean 'You can't show it in the search results'." I'm a bit confused about how to use these, both to prevent duplicate content in the first place and to address duplicate content once it's already in the index. Thanks! B0
Intermediate & Advanced SEO | | bjs2010
Duplicate internal links on page, any benefit to nofollow
Link spam is naturally a hot topic amongst SEOs, particularly post-Penguin. While digging around forums etc., I watched a video blog from Matt Cutts posted a while ago that suggests Google only pays attention to the first instance of a link on the page. As most websites will have multiple instances of a link (header, footer, and body text), is it beneficial to nofollow the additional instances of the link? Also, as the first instance of a link will in most cases be within the header nav, does that then make the content link text critical, or can good on-page optimisation be pulled from the title attribute? I would appreciate Mozzers' experiences and thoughts on this. Thanks in advance!
Intermediate & Advanced SEO | | JustinTaylor880 -
Should I Allow Blog Tag Pages to be Indexed?
I have a WordPress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content, or am I hurting the site by taking eligible pages out of the index?
Intermediate & Advanced SEO | | JSOC0