Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Does having a lot of pages with noindex and nofollow tags affect rankings?
-
We are an e-commerce marketplace for alternative fashion and home decor. We have over 1,000 stores on the marketplace. We switched the website from HTTP to HTTPS in March 2018 and also added noindex and nofollow tags to the store about pages and store policies (mostly boilerplate content).
Our traffic dropped by 45% and we have since not recovered. We have done
I am wondering could these tags be affecting our rankings?
-
Hi Gaston
Thank you for the detailed response and suggestions. I will follow up with my findings. Points 3 and 4 - I think there is something there.
James
-
Hi James,
Great that you've checked out those items and there aren't errors.
I'll break my response into bullet points so it's easier to respond:
1- I'm bugged that the traffic loss occurred in the same month as the HTTPS redirection. That strongly suggests you've either killed, redirected, or noindexed some pages that drove a lot of traffic.
2- It's also possible that you didn't deserve that much traffic, either because you ranked for searches where you weren't relevant or because Google didn't fully understand your site. That often happens when a migration takes place, as Google needs to recalculate and fully understand the new site.
3- If you still have the old HTTP Search Console property, I'd check as many keywords as possible (in some scalable way), trying to find which have fallen in rankings.
4- When checking those keywords, compare the URLs that ranked; there could be some changes.
5- And lastly, have you made sure there aren't any indexation and/or crawlability issues? Check the raw number of indexable URLs and compare it with the number Search Console shows in the Index Coverage report.
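Point 5 can be made concrete with a small script: given a crawl export with each URL's status, meta robots value, and canonical, count what's actually indexable and compare that number with the Index Coverage report. This is only a sketch - the record fields and sample data are illustrative, not from any real crawl:

```python
def is_indexable(record):
    """A URL only counts as indexable if it returns 200, isn't
    noindexed, and doesn't canonicalize away to another URL."""
    return (
        record["status"] == 200
        and "noindex" not in record.get("meta_robots", "")
        and record.get("canonical") in (None, record["url"])
    )

def count_indexable(crawl):
    return sum(1 for record in crawl if is_indexable(record))

# Example crawl export (made-up data):
crawl = [
    {"url": "https://example.com/", "status": 200, "meta_robots": ""},
    {"url": "https://example.com/store/about", "status": 200,
     "meta_robots": "noindex, nofollow"},
    {"url": "https://example.com/old-product", "status": 404},
    {"url": "https://example.com/dupe", "status": 200,
     "canonical": "https://example.com/"},
]

print(count_indexable(crawl))  # compare against Index Coverage: prints 1
```

A big gap between this number and what Search Console reports is usually the first clue about where traffic went.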
Best wishes.
GR -
Hi Gaston
Thank you for sharing your insights.
1. I have looked through all the pages and made sure we have not noindexed important pages.
2. The migration went well; no double redirects or duplicate content.
3. I looked through Google Search Console and fixed all the errors (mostly complaints about 404 errors caused by products that are out of stock or from vendors who have left the website).
4. A friend said he thinks our pages are over-optimized, and hence that could be the reason. We went ahead and tweaked all the pages that were driving traffic, but saw no change.
If you have a moment, here is our website: www.rebelsmarket.com - if there is anything that stands out, please let me know. I appreciate your help.
James
-
Hi Joe
We have applied all the redirects carefully and tested them to make sure we have no duplicate content.
The URL: www.rebelsmarket.com
Redirect to SSL: March 2018 (we started with the blog and then moved to the product pages)
We added the noindex and nofollow tags at the same time.
Thank you
James
-
Hi John
Sorry, I have been tied up with my travel schedule. Here is the website: www.rebelsmarket.com
Thank you for your help John
-
Hi James,
Your issues lie elsewhere - did anything else happen during the update? My first thought is that the redirects were incorrectly applied.
- What's the URL?
- When was the redirect HTTP > HTTPS installed, and how?
- When were the noindex and nofollow tags added?
You're a month in, so you should be able to recover. Sharing the URL would be useful if you need any further assistance.
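For what it's worth, the redirect question can be sanity-checked offline: export the crawl's redirects and flag anything that isn't a clean single-hop 301. A minimal sketch - the URLs, statuses, and map structure here are all hypothetical:

```python
def audit_redirects(redirect_map):
    """Flag redirect problems in a {url: (status, target)} map:
    non-301 hops, and chains where the target itself redirects."""
    problems = []
    for url, (status, target) in redirect_map.items():
        if status != 301:
            problems.append(f"{url}: {status} instead of 301")
        if target in redirect_map:
            problems.append(f"{url}: chains through {target}")
    return problems

# Made-up crawl export:
redirect_map = {
    "http://example.com/shop": (301, "https://example.com/shop"),
    "http://example.com/blog": (302, "https://example.com/blog"),
    "http://example.com/old": (301, "http://example.com/shop"),
}
for problem in audit_redirects(redirect_map):
    print(problem)
```

In a real audit you'd feed this from a crawler's export; 302s and multi-hop chains are exactly the kind of migration mistakes being asked about above.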
-
Hey James - would you be comfortable sharing the URL? I can run some diagnostics on it to see what other issues could be the cause of the drop.
Thanks!
John
-
Hi James,
I'm sorry to hear that you've lost over 45% of your traffic.
Absolutely not - having a lot of noindex and nofollow pages won't affect your rankings or your SEO strength. On the other hand, a traffic drop could be related to many issues, among them:
- Algorithm changes, there has been a lot of movement this year
- You've noindexed some of your high traffic pages
- Some part of the migration went wrong
- And the list could be endless.
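On the second bullet (accidentally noindexed high-traffic pages), a page's HTML can be checked with a few lines of stdlib Python before assuming it's indexable. A rough sketch - the sample markup is made up:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.extend(
                d.strip() for d in (attrs.get("content") or "").split(","))

def is_noindexed(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(is_noindexed(page))  # True
```

Run over a list of your top-traffic URLs, this kind of check surfaces accidental noindexing much faster than eyeballing templates.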
I'd start by checking Search Console; there you can spot which keywords and/or URLs are no longer ranking as high.
This sort of tutorial on analyzing a traffic drop might come in handy: How to Diagnose SEO Traffic Drops: 11 Questions to Answer - Moz Blog
Hope it helps.
Best luck.
GR
Related Questions
-
Fresh page versus old page climbing up the rankings.
Hello, I have noticed that if I publish a webpage that Google has never seen, it ranks right away, and usually in a decent position to start with (not great, but decent), usually top 30 to 50, and then over the months it slowly climbs up the rankings. However, if my page has existed for, let's say, 3 years and I make changes to it, it takes much longer to climb up the rankings. Has anyone noticed that too? And why is that?
Intermediate & Advanced SEO | | seoanalytics0 -
Do Page Anchors Affect SEO?
Hi everyone, I've been researching for the past hour and I cannot find a definitive answer anywhere! Can someone tell me if page anchors affect SEO at all? I have a client that has 9 page anchors on one landing page on their website - which means if you were to scroll through their website, the page is really, really long! I always thought that by using page anchors instead of sending users through to a dedicated landing page, ranking for those keywords becomes harder, because a search spider will read all the content on that landing page and not know how to rank it for individual keywords. Am I wrong? The client in particular sells furniture, so on their landing page they have page anchors that jump the user down to "tables" or "chairs" or "lighting", for example. You can then click on one of the product images listed in that section of the page anchor and go through to an individual product page. Can anyone shed any light on this? Thanks!
Intermediate & Advanced SEO | | Virginia-Girtz1 -
Adding hreflang tags - better on each page, or the site map?
Hello, I am wondering if there seems to be a preference for adding hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to the page, but I am also going to work with the developer to add hreflang tags. My question is - is it better to add them to the header of each page, or the sitemap, or both, or something else? Any other thoughts are appreciated. Our Australia site, which was at least findable using Australian Google before this relaunch, is not showing up, even when you search the company name directly. Thanks! Lauryn
Intermediate & Advanced SEO | | john_marketade0 -
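For reference on the sitemap option asked about above: each <url> entry has to carry the full set of xhtml:link alternates, including a self-referencing one. A sketch of how a single entry could be generated with the stdlib - the domains are invented:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("xhtml", XHTML_NS)

def sitemap_entry(alternates):
    """Build a <url> element whose xhtml:link tags list every
    language/region alternate; the first alternate is used as <loc>."""
    first_url = next(iter(alternates.values()))
    url = ET.Element(f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = first_url
    for lang, href in alternates.items():
        link = ET.SubElement(url, f"{{{XHTML_NS}}}link")
        link.set("rel", "alternate")
        link.set("hreflang", lang)
        link.set("href", href)
    return url

entry = sitemap_entry({
    "en-au": "https://example.com.au/widgets",
    "en-us": "https://example.com/widgets",
})
print(ET.tostring(entry, encoding="unicode"))
```

Generating the sitemap this way scales better than editing page heads when the site has many language pairs, which is often why the sitemap route gets recommended for ccTLD setups.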
Do low quality subdomains affect the ranking performance/quality of a root domain?
Hi, Late last year the company I work for launched two new websites that, at the time, we believed were completely separate from our main website. The two new websites were set up externally and were not well-planned from an SEO perspective (LOTS of duplicate content) - hence, they have struggled to rank on Google. Since the launch of the new websites we have also noticed that our main website (that previously ranked very well) has suffered a decline in visitation and search engine rank. We initially attributed this to a number of factors, including the state of the market, and ramped up our SEO efforts (seeing minor improvement). We have since realised that these two new websites have been set up as subdomains of our main website, with MOZ displaying the same domain authority and root domain backlink profile. My question is, do poor quality subdomains affect the ranking performance of a root domain? I have not yet managed to find a definitive answer. Please let me know if more information is required - I am quite new to the whole SEO concept. Thanks! Amy
Intermediate & Advanced SEO | | paulissai0 -
Domain Authority: 23, Page Authority: 33, Can My Site Still Rank?
Greetings: Our New York City commercial real estate site is www.nyc-officespace-leader.com. Key MOZ metrics are as follows: Domain Authority: 23
Page Authority: 33
28 Root Domains linking to the site
179 Total Links. In the last six months, domain authority, page authority, and the number of domains linking to the site have declined. We have focused on removing duplicate content and low-quality links, which may have had a negative impact on the above metrics. Our ranking has dropped greatly in the last two months. Could it be due to the above metrics? These numbers seem pretty bad. How can I reverse this without engaging in any black-hat behavior that could work against me in the future? Ideas? Thanks, Alan Rosinsky
Intermediate & Advanced SEO | | Kingalan10 -
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping in a sort of recovery (not yet seen though!). So, currently we have about 4,000 "index" page compared to about 80,000 "noindex" pages. Now, we plan to add additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural SEs profile? or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | | fablau0 -
Meta NoIndex tag and Robots Disallow
Hi all, I hope you can spend some time to answer my first of a few questions 🙂 We are running a Magento site - the layered/faceted navigation nightmare has created thousands of duplicate URLs! Anyway, during my process of tackling the issue, I disallowed in robots.txt anything in the querystring that was not a p (allowed this for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html and a few duplicates came up along with the original, with "There is no information about this page because it is blocked by robots.txt". So I had also added Meta Noindex, Follow on all these duplicates, but I guess it wasn't being read because of robots.txt. So coming to my question: did robots.txt block access to these pages? If so, were they already in the index, and after disallowing them with robots, Googlebot could not read the Meta Noindex? Does Meta Noindex, Follow on pages actually help Googlebot decide to remove these pages from the index? I thought robots.txt would stop and prevent indexation? But I've read this: "Noindex is a funny thing, it actually doesn't mean 'You can't index this', it means 'You can't show this in search results'. Robots.txt disallow means 'You can't index this' but it doesn't mean 'You can't show it in the search results'." I'm a bit confused about how to use these, both in preventing duplicate content in the first place and then in helping to address dupe content once it's already in the index. Thanks! B
Intermediate & Advanced SEO | | bjs20100
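The interaction asked about above can be demonstrated offline with Python's stdlib robots parser: once a path is disallowed, a compliant crawler never fetches the page, so any meta noindex in its HTML goes unseen - which is why such pages can linger in the index. Note the stdlib parser doesn't support Google's * wildcard extensions, so this sketch uses a plain path prefix; the URLs are made up:

```python
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /catalog/filtered/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

duplicate = "https://www.mydomain.com/catalog/filtered/specificpage.html"
if not rp.can_fetch("Googlebot", duplicate):
    # The crawler stops here: the page's HTML - and therefore its
    # <meta name="robots" content="noindex, follow"> tag - is never read.
    # The URL can still appear in results based on links alone.
    print("blocked by robots.txt; meta noindex is invisible")
```

The practical upshot matches the quoted advice: to get a URL deindexed via meta noindex, it must first be crawlable (not disallowed); only once it has dropped out should the robots.txt block go back in, if at all.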
Duplicate internal links on page, any benefit to nofollow
Link spam is naturally a hot topic amongst SEOs, particularly post-Penguin. While digging around forums etc., I watched a video blog from Matt Cutts, posted a while ago, that suggests Google only pays attention to the first instance of a link on the page. As most websites will have multiple instances of a link (header, footer, and body text), is it beneficial to nofollow the additional instances of the link? Also, as the first instance of a link will in most cases be within the header nav, does that then make the content link text critical, or can good on-page optimisation be pulled from the title attribute? I would appreciate Mozzers' experiences and thoughts - thanks in advance!
Intermediate & Advanced SEO | | JustinTaylor880
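To reason about the "first instance" idea in the question above, here's a sketch that records, for each href, only its first occurrence in document order - with a typical template, the header nav link naturally comes before any body-text duplicate. Stdlib only; the markup is invented:

```python
from html.parser import HTMLParser

class FirstLinkParser(HTMLParser):
    """Records each href only the first time it appears on the page."""
    def __init__(self):
        super().__init__()
        self.first_links = {}   # href -> index of its first occurrence
        self.count = 0          # running index over all <a> tags

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if href is not None and href not in self.first_links:
            self.first_links[href] = self.count
        self.count += 1

html = """
<nav><a href="/shop">Shop</a></nav>
<p>Visit our <a href="/shop">shop</a> or <a href="/blog">blog</a>.</p>
"""
parser = FirstLinkParser()
parser.feed(html)
print(parser.first_links)  # {'/shop': 0, '/blog': 2}
```

If the first-link-counts theory holds, it's the anchor text of those first occurrences (here, the nav's "Shop") that matters, which is exactly why the question about header-nav anchor text arises.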