Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content — many posts will still be viewable — we have locked both new posts and new replies.
Can noindexed pages accrue page authority?
-
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking).
I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources allow us to fill in content.
My question is whether an individual page will be able to accrue any page authority for that target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
-
Yes, Google will give you credit for adding value to pages. Once the noindex tag is removed, have Googlebot recrawl those pages as soon as possible.
A noindexed page can still pass PageRank, and noindexing thin content could potentially save you from a penalty. Alternatively, if you have a better page covering the same topic, 301-redirect the thin page to it.
Be aware that if you noindex a page, you will lose the traffic it currently receives for that keyword, or at least most of it, until the page is fixed and reindexed.
You will have more trouble ranking for a keyword after removing its page from Google's index. That said, if the content really is that thin, I would recommend noindexing the pages, but only if you are committed to fixing them very soon. You cannot rank organically for a term while its page is noindexed, so every day the tag stays in place costs you visibility.
A noindex tag is an instruction to the search engines that you don’t want a page to be kept within their search results. You should use this when you believe you have a page that search engines might consider to be of poor quality.
What does a noindex tag do?
- It is a directive, not a suggestion; Google will obey it and not index the page.
- The page can still be crawled by Google.
- The page can still accumulate PageRank.
- The page can still pass PageRank via any links on the page.
(In reality, many other signals besides PageRank are potentially passed through any link, so it’s more accurate to say “signals passed” than “PageRank passed.”)
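The directive itself is just a robots meta tag in the page's `<head>`. As a minimal illustration, the following sketch uses Python's standard `html.parser` to detect whether a page carries a noindex directive (the sample HTML is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                # Directives are comma-separated, e.g. "noindex, follow"
                self.directives += [d.strip().lower()
                                    for d in (a.get("content") or "").split(",")]

def is_noindexed(html: str) -> bool:
    """True if the page carries a noindex directive in a robots meta tag."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # -> True
```

Note that the same directive can also be sent as an `X-Robots-Tag` HTTP response header, which this sketch does not check.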
Crawl frequency of a noindex page will decline over time.
Crawl frequency refers to how often Google returns to a page to check whether the page still exists, has any changes, and has accumulated or lost signals.
Typically crawl frequency will decline for any page that Google cannot index, for whatever reason. Google will try to recrawl a few times to check if the noindex, error, or whatever was blocking the crawl, is gone or fixed.
If the noindex instruction remains, Google will slowly lengthen the time until the next attempt to crawl the page, eventually reducing to a check about every two to three months to see if the noindex tag is still there.
A noindexed page is excluded from Google's search index, so it will not help you rank for that term. If other pages are cannibalizing it by targeting the same term, 301-redirect the poor-content page to the correct one.
As for your question on PageRank and noindex: yes, PageRank can still accrue, because Google will still crawl the page and derive some information from its links and anchor text.
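When consolidating cannibalizing pages with a 301, the rule set is just a mapping from retired URL to replacement URL. A minimal sketch (the paths are hypothetical) of how a server-side resolver might apply it:

```python
# Hypothetical mapping of retired thin-content URLs to their stronger replacements.
REDIRECTS = {
    "/widgets/thin-page": "/widgets/complete-guide",
    "/gadgets/stub": "/gadgets/buyers-guide",
}

def resolve(path: str):
    """Return (status, location) for a request path: 301 if retired, else 200."""
    target = REDIRECTS.get(path)
    if target is not None:
        return 301, target  # permanent redirect passes most link signals
    return 200, path        # serve the page as normal

print(resolve("/widgets/thin-page"))  # -> (301, '/widgets/complete-guide')
```

In practice the same mapping would usually live in the web server's own config (e.g. rewrite rules) rather than application code; the point is simply that each retired URL should map to exactly one permanent destination.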
Before you remove content
The following are some guidelines you can use:
- Make an educated (non-biased) judgement: Is your content’s quality “worse” than this content?
- Do you cover the topic in enough length and sufficiently in-depth?
- Which aspects of this content is your page not covering completely?
- Which “user intent” queries is your content not answering?
- How can you make your content better?
- Can you use any great imagery or diagrams to supplement your content?
- Are there any YouTube or other videos which can add value to your content?
Iterate and do the above for all of the pages which are outranking yours. The first few are going to be the hardest — it’s likely that the rest will follow a similar pattern.
There are no shortcuts. You’ll have to review all the pages which are outranking you to ensure you leave no gaps.
Update Your Content To Fully Answer The User Search Query
Once you’ve seen what you are up against, you need to update your content.
To put it simply, your content needs to be better than the competition. It also needs to fully answer the user search intent which we have identified previously.
Make it the BEST content out there.
Given that you’ve already analyzed your competitors’ content, you should have a pretty good idea of what your content is missing.
Supplement your existing content with that additional content, but
- Don’t rewrite it completely. You’ll likely lose the precious content that Google was ranking you for.
- Don’t write a new post with the hope that this will rank better. It’s a much longer and harder journey than pushing up your already existing content.
- Of course, don’t change the URL.
As shown in this 468% traffic increase case study, Google will reward you for your efforts.
Use the judgment calls from your competitive research to plan what needs to be added or updated.
Enhance it with any missing content
While looking at the organic keywords which you are ranking for you might come across user search intent keywords for which you have no content.
Let’s say, for example, your content discusses enabling Joomla SEF URLs.
If in your research you find that you are ranking for “disabling Joomla SEF URLs,” make sure that your refreshed content answers that query also.
These queries are pure gold — make sure you are answering them.
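Finding these gaps amounts to comparing the queries the page already ranks for against the queries the content actually answers. A minimal sketch (the query lists are hypothetical, reusing the Joomla example above):

```python
def content_gaps(ranking_queries, covered_queries):
    """Queries the page ranks for but does not yet answer, preserving rank order."""
    covered = {q.lower() for q in covered_queries}
    return [q for q in ranking_queries if q.lower() not in covered]

# Queries from a rank tracker vs. topics the page currently covers.
ranking = ["enabling joomla sef urls", "disabling joomla sef urls"]
covered = ["enabling joomla sef urls"]

print(content_gaps(ranking, covered))  # -> ['disabling joomla sef urls']
```

Each query in the result is a candidate section to add when refreshing the content.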
Reference
- http://www.hobo-web.co.uk/duplicate-content-problems/#thin-content-classifier
- https://www.stonetemple.com/gary-illyes-what-is-noindex-and-what-does-it-do/
- https://www.mattcutts.com/blog/pagerank-sculpting/
**When rebuilding:**
- https://moz.com/learn/seo
- https://ahrefs.com/blog/link-building/
- https://moz.com/beginners-guide-to-link-building
- http://www.bruceclay.com/blog/what-is-pagerank/
This is similar because it addresses turning pages off and turning them back on.
I hope this helps,
Tom
-
From Google's perspective, if you noindex a page it will sooner or later be removed from the index, and you will lose your rankings for that search term.
If you have no particular need to remove the pages, you could instead create new pages with the new content (Google will like that anyway); in time you will almost certainly find that some of those new pages outrank the thin-content pages.
In due course you could then 301-redirect the old URL, which in theory will pass most of its authority to the new page.