How can I get unimportant pages out of Google?
-
Hi Guys,
I have a (newbie) question. Until recently I didn't have my robots.txt written properly, so Google indexed around 1,900 pages of my site, but only 380 pages are real pages; the rest are all /tag/ or /comment/ pages from my blog. I have now set up the sitemap and the robots.txt properly, but how can I get the other pages out of Google? Is there a trick, or will it just take a little time for Google to drop the pages?
Thanks!
Ramon
-
If you want to remove an entire directory, you can exclude that directory in robots.txt, then go to Google Webmaster Tools and request a URL removal. You'll have an option to remove an entire directory there.
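For example, a minimal robots.txt sketch, assuming your tag and comment pages live under /tag/ and /comment/ as in your question:
User-agent: *
Disallow: /tag/
Disallow: /comment/
With those directories disallowed, the removal request in Webmaster Tools can then cover each whole directory instead of individual URLs.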
-
No, sorry. What I said is: if you disallow the folder in robots.txt, that alone will not remove pages that are already indexed.
The meta tag is what does it: when the spiders crawl the pages again and see the noindex tag, they will drop them from the index.
So you should not add the directory to robots.txt before the pages have been removed from the search engines.
First, put the noindex tag on all the pages you want removed. It takes anywhere from a week to a month for them to drop out. After they are removed, add the folders you don't want indexed to your robots.txt.
After that, you don't need to worry about the tags.
I say this because if you add the robots.txt block first, the search engines stop reading those pages, so they would never see the meta noindex tag. Therefore you must first remove the pages with the noindex tag and only then add the block to robots.txt.
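To make the order concrete, here is a sketch of the two steps, using the /tag/ folder from your question as the example:
Step 1, on each page you want removed, in the <head>:
<meta name="robots" content="noindex">
Step 2, only after those pages have dropped out of the index, in robots.txt:
User-agent: *
Disallow: /tag/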
Hope this has helped.
João Vargas
-
Thanks Vargas. If I go with noindex, I should remove the disallow from robots.txt first, right?
I understood that if you have a noindex tag on the page and also a disallow in robots.txt, the search engines will still keep it indexed. Is that true?
-
To remove the pages you want, you need to put this tag on them:
<meta name="robots" content="noindex">
If you want internal links and external relevance to keep passing through these pages, use instead:
<meta name="robots" content="noindex, follow">
If you then add the block in robots.txt, you only need the tag on the current URLs; the search engines will not index new ones.
Personally, I don't like using the Google URL remover, because if someday you want those folders indexed again, you can't. At least that has happened to me.
The noindex tag works very well for removing unwanted content; within a month or so the pages will be gone.
-
Yes. It's only a secondary-level aid, and not guaranteed, yet it could help speed up the process of devaluing those pages in Google's internal system. If the system sees those and cross-references them against the robots.txt file, it could help.
-
Thanks guys for your answers....
Alan, do you mean that I should place the tag below on all the pages that I want out of Google?
-
I agree with Alan's reply. Try the canonical tag first. If you don't see any change, remove the URLs in GWT.
-
There's no bulk removal request form, so you'd need to submit every URL one at a time, and even then it's not a guaranteed method. You could consider getting a canonical tag on those specific pages that points to a different URL on your blog, such as an appropriate category page or the blog home page. That could help speed things up, but canonical tags themselves are only "hints" to Google.
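For example, a /tag/ page could carry a tag like this in its <head> (the URL is a placeholder; substitute your own blog's address):
<link rel="canonical" href="http://www.example.com/blog/">
That tells Google your preferred URL for the duplicated content, though again only as a hint.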
Ultimately it's a time and patience thing.
-
It will take time, but you can help it along by using the URL removal tool in Google Webmaster Tools: https://www.google.com/webmasters/tools/removals
Related Questions
-
Is site: a reliable method for getting full list of indexed pages?
The site:domain.com search seems to show fewer pages than it used to (on both Google and Bing). It doesn't relate to a specific site; it happens on all sites. For example, I will get "page 1 of about 3,000 results", but by the time I've paged through the results it will end and change to "page 24 of 201 results". In that example, GSC shows 1,932 pages indexed. Should I now accept that the page count listed by site: is an unreliable metric?
Technical SEO | bjalc20112 -
How do I get Google to forget my old but still-working page and list my new, fully optimized page for a keyword?
Hi there! (I am a beginner in SEO.) I have dynamic and static pages on our site. I created a static page for a specific keyword and fully optimized it (h1, alt, metas, etc.... maybe too optimized). My problem is that this page has been live for weeks; I checked it in GWT, it is in robots.txt, Google sees it, and it is indexed. BUT whenever I do a search for that keyword, we still appear with the dynamically created link in the Google listings. How could I "redirect" Google so that a search for that keyword shows our optimized page? Is there a tool for that? I can't delete the dynamic page... Any ideas? Thx Andrew
Technical SEO | Neckermann0 -
Is there a Google 6th-page penalty?
My site has a keyword domain, but my page won't move up or down from the 6th page of the search results, and it's my alternate pages rather than my main page that show there. So what can I do about this penalty? Thanks for your help.
Technical SEO | iddaasonuclari0 -
How can I see the SEO of a URL? I need to know the progress of a specific landing page of my site. Not a keyword, a URL please. Thanks.
I need to know the SEO evolution of a specific landing page (a URL) on my site. Not a keyword, a URL. Thanks. (I need to know whether it's possible to track the progress of a specific URL in Google's rankings. That is, what SEOmoz does with keywords, but the other way around: I have a specific URL that I want to get into the top positions on Google, and I want to see how it progresses as I apply the changes. Thank you very much.)
Technical SEO | online_admiral0 -
Can you have multiple rich snippets show up for the same page?
Is it possible to have multiple rich snippets show up in the SERPs for the same page? For example, could a product page have both the aggregate review rich snippet and the author thumbnail?
Technical SEO | ProjectLabs0 -
Can someone help me get this site ranked? www.2sponsors.com
Hi, I have been trying for months to get a site ranked for one of my customers and I am not doing very well. I have been doing SEO for years and have gotten lots of sites ranked, but this one has been the most difficult. Does anyone have time to look at it for me? Thanks. The site's PR is 4. I am trying to get it ranked on www.google.com.ar. Thanks, Carla. Skype: carla.dawson78
Technical SEO | Carla_Dawson0 -
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site whose home page contains two links to each category page: one text link and one image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link it spiders, with the anchor text/alt text of the second being ignored. That is not my question, however. My question is about the PageRank passed to each category page... Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the second identical link on a page, only one share of this divided-up PR will be passed to each category page rather than two, hence horribly watering down the "link juice" passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | QubaSEO0 -
Will a drop in indexed pages significantly affect Google rankings?
I am doing some research into why we were bumped from Google's first page onto the 3rd, 4th, and 5th pages in June 2010. I always suspected Caffeine, but I just came across data indicating a drop in indexed pages from 510 in January of that year to 133 by June. I'm not sure what happened, but I believe our blog pages were somehow de-indexed. What I want to know is: could that significant drop in indexed pages have affected our rankings at that time? We are back up to over 500 indexed pages but have not fully recovered our first-page positions.
Technical SEO | rdreich490