Pages to be indexed in Google
-
Hi,
We have 70K posts on our site, but Google has crawled 500K pages, and these extra pages are category pages or user profile pages.
Each category has a page and each user has a page, so with 90K users, Google has indexed 90K user pages alone.
My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on these pages and a huge bounce rate.
If we need to remove them, what needs to be done? A robots.txt block or noindex/nofollow?
Regards
-
Thank you Gagan!
-
It's a much better and clearer explanation... +1 to it. Cheers!
-
One key point on using robots.txt vs the noindex meta tag: it is not that the noindex meta tag is "superior"; they just work differently.
If you use robots.txt, it will stop the spider from visiting that page, but it will not remove the page from the index. Also, if you have a page in robots.txt and on that page have a 301 redirect, a canonical, or a meta noindex, Google will not see the page (due to the robots.txt directive) and so will not be able to act on the 301, the canonical, or the meta noindex.
A meta noindex, because the spider crawls the page, not only tells Google not to visit the page anymore, but also tells Google to remove the page from the index. This is key if you want the pages removed from the Google index.
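To illustrate the difference, here is a minimal sketch of both approaches, assuming (hypothetically) that the user profile and category pages live under /user/ and /category/:

```
# robots.txt: stops the spider from crawling these paths,
# but does NOT remove pages that are already in the index
User-agent: *
Disallow: /user/
Disallow: /category/
```

```html
<!-- meta noindex, placed in each page's <head>: the page must stay
     crawlable so Google can see the tag and drop the page from the index -->
<meta name="robots" content="noindex">
```

Note that combining the two defeats the purpose: if a path is disallowed in robots.txt, the spider never fetches the page, so it never sees the meta noindex.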
The rule of thumb I use is:
-
If you have a page that is not in the Google index and you want to keep it out of the index, disallow it in robots.txt.
-
If you have a page that is in the Google index and you want it removed, then use the noindex meta tag; do not put it into robots.txt, for the reasons mentioned above. Over time, once the pages are removed (and this may take a while depending on how often each page is crawled), you can add them to robots.txt for good measure.
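With 90K user pages in play, you may want to verify at scale that a page actually carries the noindex tag before you later move it into robots.txt. A minimal sketch of the HTML check (fetching the pages and assembling the URL list is left out, and the function name is my own):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with a noindex directive."""
    # Look at each <meta ...> tag; attribute order may vary between pages.
    for tag in re.findall(r"<meta\s[^>]*>", html, flags=re.IGNORECASE):
        is_robots = re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE)
        has_directive = re.search(r'content\s*=\s*["\'][^"\']*noindex', tag, re.IGNORECASE)
        if is_robots and has_directive:
            return True
    return False
```

Running this over a sample of profile URLs before and after the change gives a quick sanity check that the template edit actually shipped.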
-
-
In order to exclude individual pages from search engine indices, **the noindex meta tag** is actually superior to robots.txt.
-
Which is better, noindex or a robots.txt deny?
What's the difference, and can you do both?
-
If the pages have thin content, do not add any value, and are not searched for by users,
it will be better to add noindex, so that search engines crawl your site more efficiently.
-
If those pages are generating a high bounce rate, I would block them from search engines. The easiest way is probably via robots.txt.
Related Questions
-
Removing a site from the Google index with noindex meta tags
Hi there! I wanted to remove a duplicated site from the Google index. I've read that you can do this by removing the URL from Google Search Console, but although I can't find it in Google Search Console, Google keeps showing the site in the SERPs. So I wanted to add a "noindex" meta tag to the site's code; however, I've only found out how to do this for individual pages. Can you do the same for an entire site? How can I do it? Thank you for your help in advance! L
Technical SEO | | Chris_Wright1 -
Google Indexing Pages with Made Up URL
Hi all, Google is indexing a URL on my site that doesn't exist and never existed in the past. The URL is completely made up. Anyone know why this is happening and, more importantly, how to get rid of it? Thanks 🙂
Technical SEO | | brian-madden0 -
My video sitemap is not being index by Google
Dear friends, I have a video portal. I created a video sitemap.xml and submitted it to GWT, but after 20 days it has not been indexed. I have verified it in Bing Webmaster as well. All videos are fetched dynamically from the server. All my static pages have been indexed, but not the videos. There are no separate pages for individual videos; all the content comes dynamically from the server. Please help me figure out where I am making a mistake. Your answers will be much appreciated. Thanks
Technical SEO | | docbeans0 -
Google is indexing blocked content in robots.txt
Hi, Google is indexing some URLs that I don't want to be indexed, and it is also indexing the same URLs over https. These URLs are blocked in the robots.txt file. I've tried to block these URLs through Google Webmaster Tools, but Google doesn't let me do it because these URLs are https. The robots.txt file is correct, so what can I do to keep this content from being indexed?
Technical SEO | | elisainteractive0 -
/index.php/ page
I was wondering: my system creates this page, www my domain com/index.php/. Is it better to block it with robots.txt or just canonicalize it?
Technical SEO | | ciznerguy0 -
Sending signals to Google to rank the correct page for a set of Keywords.
Hi All, Out of all our keywords, there are 3 for which Google.co.za shows our home page in the SERPs rather than the specific product page URL (Google.com ranks the correct URL). I'm not sure why this is happening, as most links built using the anchor text point to the correct page. Why would Google rank our home page in local search but the correct page on Google.com? (Only 3 keywords have this problem.) I have tried to correct this by creating links from strong internal pages with anchor text pointing to the correct URL. I have also concentrated on building links from .co.za domains using the anchor text and correct URL, but to no avail. It has been 2 weeks since I tried to sort it out, and I'm not sure what else I can do to tell Google to rank the correct page. Any ideas? Regards Greg
Technical SEO | | AndreVanKets0 -
Does page speed affect what pages are in the index?
We have around 1.3m total pages. Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages Google will crawl them more and thus index more of them. I personally don't believe this. At 87k pages a day, Google has crawled our entire site in about 2 weeks, so they should have all of our pages in their DB by now; I think the pages are not indexed because they are poorly generated, and it has nothing to do with their speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
Technical SEO | | upper2bits0 -
Over 1000 pages de-indexed over night
Hello, On my site (www.bridgman.co.uk) we had a lot of duplicate page issues, as reported by the SEOmoz site report tool; this was due to database-driven URL strings. As a result, I sent an Excel file with all the duplicate pages to my web developer, who put rel=canonical tags on what I assumed would be all the correct pages. I am not sure if this is a coincidence or a direct result of the canonical tags, but a few days later (yesterday) the number of pages indexed by Google dropped from 1,200 to under 200. The number is still declining, and other than the canonical tags I can't work out why Google would just start de-indexing most of our pages. If you could offer any solutions, that would be greatly appreciated. Thanks, Robert.
Technical SEO | | 87ROB0