Can you have too many NOINDEX meta tags?
-
Hi,
Our Magento store has a lot of duplicate content issues. After trying various configurations with canonicals and robots directives, we decided the easiest setup to manage was to add meta NOINDEX tags to the pages we want the search engines to ignore.
There are about 10,000 crawlable URLs on our site - roughly 6,000 carry meta NOINDEX and 3,000-odd are index, follow.
That is a high proportion of meta NOINDEX tags - can it harm our SEO efforts?
thanks,
Ben
-
No, not if you're deliberately not indexing those pages - you're doing what they want: cleaning up the index while still letting the crawler find links to other pages.
-
Hi Alan,
I have NOINDEX, FOLLOW - it's there to address the duplicate content and thin content pages so that Google does not include them; some of those pages we are now building a lot of content for.
My question really is: if the bots see a site with a lot of NOINDEX, FOLLOW pages, will they conclude that the site overall is low on content?
-
I'm using a plain NOINDEX meta tag at the moment.
I'll add FOLLOW as well - better safe than sorry!
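For anyone skimming, a minimal sketch of the two variants discussed here, assuming a standard robots meta tag placed in the page's head section (not copied from the actual Magento template):
<meta name="robots" content="noindex">
<meta name="robots" content="noindex, follow">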
-
Yes, you would think so - but I would add the FOLLOW just in case.
I assume you are using the meta noindex and not robots.txt, as a robots.txt disallow will stop both indexing and following, I believe.
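To illustrate the difference with a rough sketch (/catalogsearch/ here is just a placeholder path): a robots.txt disallow such as
User-agent: *
Disallow: /catalogsearch/
stops those pages from being crawled at all, so any meta noindex on them would never even be seen, whereas the meta tag lets the pages be crawled and their links followed while keeping them out of the index.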
-
We have a high noindex count because our search pages are set to noindex. Would you recommend these search pages be noindex, follow rather than just noindex?
I thought leaving noindex was enough, as search engines would default to noindex, follow?
-
Yes - noindex on its own means that any links pointing to those pages are wasting link juice. You should use noindex, follow to make sure the link juice is returned through the page's outbound links.
I know that many people with CMS sites have this problem, but the numbers you are reporting do seem very large.
Related Questions
-
Can I leave off HTTP/HTTPS in a canonical tag?
We are working on moving our site to HTTPS and I was asked by my dev team if it is required to declare HTTP or HTTPS in the canonical tag. I know that relative URLs are acceptable but cannot find anything about HTTP/HTTPS. What they would like to do is leave the protocol off entirely. Has anyone done this? Any reason not to leave off the protocol?
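Presumably the kind of tag they have in mind is a protocol-relative canonical; a sketch with example.com standing in for the real domain:
<link rel="canonical" href="//www.example.com/page/">
as opposed to the absolute form:
<link rel="canonical" href="https://www.example.com/page/">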
White Hat / Black Hat SEO | Shawn_Huber
-
I'm seeing thousands of no-follow links on spam sites. Can you help figure it out?
I noticed that we are receiving thousands of links from many different sites that are obviously disguised as something else. The strange part is that some of them are legitimate sites when you go to the root. I would say 99% of the page titles read something like: 1 Hour Loan Approval No Credit Check Vermont, go cash advance - africanamericanadaa.com. Can someone please help me? Here are some of the URLs we are looking at:
http://africanamericanadaa.com/genialt/100-dollar-loans-for-people-with-no-credit-colorado.html
http://muratmakara.com/sickn/index.php?recipe-for-cone-06-crackle-glaze
http://semtechblog.com/tacoa/index.php?chilis-blue-raspberry-margarita
http://wesleygcook.com/rearc/guaranteed-personal-loans-oregon.html
White Hat / Black Hat SEO | TicketCity
-
Unique meta descriptions for 2/3 of it, but then identical ending?
I'm working on an eCommerce site and had a question about my meta descriptions. I'm creating unique meta descriptions for each category and subcategory, but I'm thinking of adding the same ending to it. For example: "Unique descriptions, blah blah blah. Free Overnight Shipping..". So the "Free Overnight Shipping..." ending would be on all the categories. It's an ongoing promo so I feel it's important to add and attract buyers, but don't want to screw up with duplicate content. Any suggestions? Thanks for your feedback!
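A rough sketch of what that could look like on one category (the product copy here is made up; only the shared promo ending comes from the question):
<meta name="description" content="Browse our full range of ergonomic office chairs in leather and mesh. Free Overnight Shipping on all orders.">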
White Hat / Black Hat SEO | jeffbstratton
-
How many keywords?
Hi, I have a client asking if they can target 50-100 keywords. Has anyone ever heard of this before? In my eyes, 1-7 keywords at any one time is more than enough. So unless you had a team of 50 people doing the work, is this a reasonable request? Any advice welcome. Thanks
White Hat / Black Hat SEO | YNWA
-
Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
For example, let's say I have these 3 domains: product1.com, product2.com, and product.com. The first 2 domains will have very similar text content, with different products. The product.com domain will have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com). The purpose of this would be to capitalize on exact-match domain opportunities. I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml The article states that you can avoid duplicate content issues and have all links attributed to the primary domain. What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMD?
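One possible approach, if the secondary domains end up carrying near-duplicate pages rather than redirecting, is a cross-domain canonical from each secondary page to its counterpart on the primary domain (the path here is hypothetical):
<link rel="canonical" href="https://product.com/widget-a/">
placed in the head of the matching page on product1.com, so that ranking signals consolidate on product.com.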
White Hat / Black Hat SEO | ClearVisionDesign
-
Creating a duplicate site for testing purposes. Can it hurt the original site?
Hello, we are soon going to upgrade the CMS to the latest version along with new functionality - the process may take anywhere from 4 to 6 weeks. Since we need to work on the live server, the plan is to take an exact replica of the site and move it to a test domain (still on the live server), then block Google, Bing and Yahoo in robots.txt (User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /). We will upgrade the CMS, add the new functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configure the upgraded site on the original domain. The upgrade and new tools may take 1-1.5 months. The concern is: despite blocking Google, Bing & Yahoo through the user-agent disallow, can the URLs still be crawled by the search engines? If yes, it may hurt the original site, as the test domain will read as an entire duplicate. Or is there an alternate way around this? Many thanks
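As a side note, the usual crawler tokens are Googlebot, Bingbot and Slurp rather than Google, Bing and Yahoo, and a blanket block for the whole test domain would be a simpler robots.txt (a sketch only - a robots.txt disallow is a polite request to crawlers, not a guarantee the URLs stay out of the index):
User-agent: *
Disallow: /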
White Hat / Black Hat SEO | Modi
-
Can I send a disavow if I detect a spam link?
I have detected that one web domain is generating 2,400 links to my site. Should I use the disavow tool, as it is impossible to contact the webmaster and there is no response to my emails? My site has not been warned or penalized, but I don't like these links and I want to inform Google of that. If Google accepts the disavow file, will I still see those links in my Webmaster Tools, or will they disappear? Thanks
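For reference, the disavow file itself is just a plain-text list uploaded through Google's disavow links tool; a minimal sketch (spammydomain.com is a placeholder for the domain in question):
# entire domain sending ~2,400 unwanted links, webmaster unresponsive
domain:spammydomain.com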
White Hat / Black Hat SEO | maestrosonrisas
-
Can I get harmed by an inlink?
Hi! I'll jump right in to my question. There's a webpage with the following stats: PA 80, mR 4.70, mT 5.00, PageRank ZERO. Now, these are beautiful stats for any webpage, except for the PageRank. The reason the PageRank is so low is that the inlinks to this site are partially spammy (hidden links and other naughty black-hat stuff that I hate). (It's not my webpage - I don't even know whose webpage this is.) I happen to have a backlink from this page: a clean dofollow, in-content link to my site. The total number of external links on this page is five, and there's no spam on the page or hidden anywhere else.
My question #1: Is this particular inlink to my site harmful? Will I get penalized for getting a backlink from this site? I mean, Google has figured out the spam factor of the links to the page that is linking to me. But I'm innocent - the link to me is just lying there... (Why or why not?)
My question #2: IF (and only IF) the link to my webpage is harmful, are links from my page harmful? (Why or why not?)
Thank you very much for using your awesome knowledge to answer this 🙂
White Hat / Black Hat SEO | mozalbin