Pages to be indexed in Google
-
Hi,
We have 70K posts on our site, but Google has scanned 500K pages; the extra pages are category pages and user profile pages.
Each category has its own page and each user has a profile page, so with 90K users Google has indexed 90K user pages alone.
My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on these pages and a huge bounce rate.
If we do need to remove them, what should be done? A robots.txt block, or noindex/nofollow?
Regards
-
Thank you Gagan!
-
It's a much better and clearer explanation... +1 to it. Cheers!!
-
One key point on using robots.txt vs. the noindex meta tag: it is not that the noindex meta tag is "superior"; they just work differently.
If you use robots.txt, it will stop the spider from visiting that page, but it will not remove the page from the index. Also, if a page is blocked in robots.txt and that page carries a 301 redirect, a canonical tag, or a meta noindex, Google will not see the page (because of the robots.txt directive) and therefore cannot act on the 301, the canonical, or the meta noindex.
A meta noindex works precisely because the spider can still crawl the page: Google sees the directive and removes the page from the index. This is key if you want the pages removed from the Google index.
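For reference, the noindex directive is just a meta tag in the page's head. A minimal sketch of what such a tag would look like (the exact set of directives you need may vary):

```html
<!-- Placed in the <head> of each page that should be dropped from the index. -->
<!-- The page must stay crawlable (not blocked in robots.txt) so Google can actually see this tag. -->
<meta name="robots" content="noindex">

<!-- Variant sometimes used so the page is de-indexed but its links are still followed: -->
<meta name="robots" content="noindex, follow">
```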
The rule of thumb I use is:
-
If you have a page that is not in the Google index and you want to keep it out of the index, put that page in robots.txt.
-
If you have a page that is in the Google index and you want it removed, use the noindex meta tag; do not put it into robots.txt, for the reasons mentioned above. Over time, once the pages are removed (this may take a while, depending on how often the page is crawled), you can then add them to robots.txt for good measure, as in the sketch below.
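Applied to the original question, once the category and user-profile pages have actually dropped out of the index, a robots.txt along these lines would keep them from being recrawled. This is only a sketch: the /category/ and /users/ paths are assumptions, not the site's real URL structure.

```
# robots.txt - stops crawling only; it does NOT remove URLs that are already indexed
User-agent: *
Disallow: /category/
Disallow: /users/
```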
-
-
In order to exclude individual pages from search engine indices, **the noindex meta tag** is actually superior to robots.txt.
-
Is noindex better, or a robots.txt deny?
What's the difference, and can we do both?
-
If those pages have thin content, do not add any value, and are not being searched for by users,
it is better to add noindex so that search engines can crawl your site more efficiently.
-
If those pages are generating a high bounce rate, I would block them from search engines. The easiest way is probably robots.txt.
Related Questions
-
Not all images indexed in Google
Hi all, we recently ran into an unusual issue with images in the Google index. We have more than 1,500 images in our sitemap, but according to Search Console only 273 of those are indexed. If I check Google image search directly, I find more images in the index, but still not all of them. For example, this post has 28 images and only 17 are indexed in Google Images. This is happening to other posts as well. We checked all the likely causes (missing alt text, images loaded as backgrounds, file size, fetch and render in Search Console), but none of them apply in our case. So everything looks fine, yet not all images are in the index. Any ideas on this issue? Your feedback is much appreciated, thanks
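For context, image entries in a sitemap use Google's image extension. A minimal sketch with placeholder URLs (not the actual site's) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/post-with-images/</loc>
    <!-- One image:image block per image on the page -->
    <image:image>
      <image:loc>https://example.com/images/photo-01.jpg</image:loc>
    </image:image>
    <image:image>
      <image:loc>https://example.com/images/photo-02.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```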
Technical SEO | | flo_seo1 -
Pages are Indexed but not Cached by Google. Why?
Hello, we have had a Magento 2 extensions website, mageants.com, for about a year. Google used to re-cache all of our pages every 15 days, but for the last 15 days our pages are no longer being cached: the cached copy shows a 404 error. I checked Search Console but didn't find any errors, and I manually requested fetch and render, yet most pages still return the same 404 from the cache. Example page: https://www.mageants.com/free-gift-for-magento-2.html Cache error URL: http://webcache.googleusercontent.com/search?q=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&rlz=1C1CHBD_enIN803IN804&oq=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&aqs=chrome..69i57j69i58.1569j0j4&sourceid=chrome&ie=UTF-8 Does anyone have a solution for this issue?
Technical SEO | | vikrantrathore0 -
Sitemap.gz is being indexed and is showing up in SERP instead of actual pages.
Sitemap.gz is being indexed and is showing up in the SERP instead of actual pages. I recently submitted my sitemap file - https://psglearning.com/sitemapcustom/sitemap-index.xml - via Search Console. The only record within that XML file is sitemap.gz. When I searched for some content on my site (here is the search: https://goo.gl/mqxBeq), the results included our .gz file, indicating that it is getting indexed instead of our pages. The results were roughly: http://www.psglearning.com/catalog, http://www.psglearning.com, ...www.psglearning.com/sitemapcustom/sitemap.gz..., https://www.psglearning.com/catalog/productdetails/9781284059656/, https://www.psglearning.com/catalog/productdetails/9781284060454/, ... My sitemap is listed at https://psglearning.com/sitemapcustom/sitemap-index.xml, and inside that sitemap the only reference is to sitemap.gz: <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"><sitemap><loc>https://www.psglearning.com/sitemapcustom/sitemap.gz</loc><lastmod>2017-06-12T09:41-04:00</lastmod></sitemap></sitemapindex> Should we remove the reference to sitemap.gz within the XML file and just serve the actual page paths?
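If the intent is for the sitemap to point Google at the actual page paths rather than at a compressed file, a plain urlset sitemap would look roughly like this (a sketch built only from URLs mentioned in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.psglearning.com/catalog/productdetails/9781284059656/</loc>
  </url>
  <url>
    <loc>https://www.psglearning.com/catalog/productdetails/9781284060454/</loc>
  </url>
</urlset>
```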
Technical SEO | | pdowling0 -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning, so I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and robots.txt is blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages, or does it mean that it's blocking 3,331 of the 3,511 indexed pages? As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once it's submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
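One likely explanation for the gap between 24 disallow rules and 3,331 blocked pages is that each Disallow line is a URL-prefix pattern, so a single rule can block every URL that falls under it. A hedged illustration (the paths below are invented, not taken from the actual site):

```
# robots.txt - each Disallow matches a URL prefix, not a single page
User-agent: *
Disallow: /search/   # also blocks /search/?q=..., /search/page/2, etc.
Disallow: /tag/      # blocks every URL that starts with /tag/
```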
Technical SEO | | PeaSoupDigital0 -
Is changing the order of items on a page against Google Terms & Conditions?
Good day, I am wondering if anybody here has done something like this before. I have a page on one of my sites that contains a number of different - but related - free resources. The resources can be sorted in different ways once the user is on the page. Now I am starting an outreach campaign and want to be able to send out custom URLs (which pretty much means they have different query strings after them, like '?id=123'), so that when a person clicks on the link to the page it brings up the material they are most likely to be interested in at the top. I expect - hope - that some of these people will put links back to this page as a result. Now all the links may be slightly different, but they will lead to the same page and the content will look slightly different. I will make sure to have the rel=canonical tag in place. Does anybody know if this would be in violation of Google's Terms and Conditions? I can't see how, but I wanted to see what the experts here on Moz think before moving forward. Thanks in advance.
Technical SEO | | rayvensoft0 -
Getting Pages Indexed That Are Not In The Main Navigation
Hi All, Hoping you can help me out with a couple of questions I have. I am looking to create SEO friendly landing pages optimized for long tail keywords to increase site traffic and conversions. These pages will not live on the main navigation. I am wondering what the best way to get these pages indexed is? Internal text linking, adding to the sitemap? What have you done in this situation? I know that these pages cannot be orphaned pages and they need to be linked to somewhere. Looking for some tips to do this properly and to ensure that they can become indexed. Thanks! Pat
Technical SEO | | PatBausemer0 -
Google Page speed
I get the following advice from Google PageSpeed: "Suggestions for this page: The following resources have identical contents, but are served from different URLs. Serve these resources from a consistent URL to save 1 request(s) and 77.1KiB. http://www.irishnews.com/ http://www.irishnews.com/index.aspx" I'm not sure how to fix this; the default page is http://www.irishnews.com/index.aspx. Does anybody know what needs to be done? Please advise, thanks.
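One common fix is a permanent (301) redirect from /index.aspx to the root so both addresses resolve to a single URL. The sketch below assumes the site runs on IIS with the URL Rewrite module, which the question does not actually confirm:

```xml
<!-- web.config fragment: 301-redirect /index.aspx to / -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect index.aspx to root" stopProcessing="true">
          <match url="^index\.aspx$" />
          <action type="Redirect" url="/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

An alternative, if a redirect is not possible, is a rel=canonical tag on index.aspx pointing at the root URL.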
Technical SEO | | Liammcmullen0 -
Canonical - how can you tell if a page is appearing as a duplicate in Google?
Our home page file is www.ides.com/default.asp and appears in Google as www.ides.com. Would it be a good thing for us to include the following tag in the head section of our website homepage?
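The tag referred to in the question is presumably a canonical link element. A sketch of how one would look in the head of default.asp, assuming the bare domain (and the http protocol) is the preferred version:

```html
<!-- In the <head> of /default.asp: declares the preferred URL for this page -->
<link rel="canonical" href="http://www.ides.com/" />
```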
Technical SEO | | Prospector-Plastics0