How can I get unimportant pages out of Google?
-
Hi Guys,
I have a (newbie) question. Until recently I didn't have my robots.txt written properly, so Google indexed around 1,900 pages of my site, but only 380 of them are real pages; the rest are all /tag/ or /comment/ pages from my blog. I have now set up the sitemap and robots.txt properly, but how can I get the other pages out of Google? Is there a trick, or will it just take a little time for Google to drop those pages?
Thanks!
Ramon
-
If you want to remove an entire directory, you can exclude that directory in robots.txt, then go to Google Webmaster Tools and request a URL removal. You'll have an option to remove an entire directory there.
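For example, a robots.txt along these lines (assuming your tag and comment archives really live under /tag/ and /comment/ — adjust to your actual URL structure):

User-agent: *
Disallow: /tag/
Disallow: /comment/

With that in place you can request removal of each whole directory in Webmaster Tools instead of submitting individual URLs.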
-
No, sorry. What I meant is: if you mark the folder as disallowed in robots.txt, that alone will not remove pages that are already indexed.
The meta tag is what does that: when the spiders crawl a page again and see the noindex tag, they will drop it from the index.
So don't add the directory to robots.txt yet, not before the search engine has removed the pages.
First, put the noindex tag on all the pages you want removed. Once they have dropped out (it takes anywhere from a week to a month), then add the folders you don't want indexed to your robots.txt.
After that, you don't need to worry about the tags anymore.
I say this because if you add the folder to robots.txt first, the search engine stops reading those pages at all, so it would never see the meta noindex tag. That's why you must first get the pages removed with the noindex tag and only then block them in robots.txt.
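As a rough sketch of the order (using /tag/ from your question as the example folder):

Step 1 - in the head of every page you want removed:
<meta name="robots" content="noindex">

Step 2 - only after those pages have dropped out of the index, in robots.txt:
User-agent: *
Disallow: /tag/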
Hope this has helped.
João Vargas
-
Thanks Vargas. If I go with noindex, I should remove the disallow from robots.txt first, right?
I understood that if a page has a noindex tag but is also disallowed in robots.txt, the search engine will keep it indexed (because it can't see the tag) — is that true?
-
To remove the pages you want, you need to add this tag:
<meta name="robots" content="noindex">
If you want internal and external link relevance to keep passing through those pages, use this instead:
<meta name="robots" content="noindex, follow">
If you then block the folder in robots.txt, you only need the tag on the current URLs; new URLs in that folder won't get indexed by the search engines.
Personally, I don't like using the Google URL remover, because if someday you want those folders indexed again, they won't come back — at least that has happened to me.
The noindex tag works very well for removing unwanted content; within a month or so the pages will be gone.
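To be clear about placement, the tag goes inside the head of each page you want removed, something like this (the title here is just an invented example):

<head>
  <title>Example tag archive</title>
  <meta name="robots" content="noindex, follow">
</head>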
-
Yes. It's only a secondary-level aid, and not guaranteed, but it could help speed up the process of devaluing those pages in Google's internal system. If the system sees those tags and cross-references them against the robots.txt file, it may move things along.
-
Thanks guys for your answers....
Alan, do you mean that I should place the tag below on all the pages that I want out of Google?
-
I agree with Alan's reply. Try canonical 1st. If you don't see any change, remove the URLs in GWT.
-
There's no bulk removal request form, so you'd need to submit every URL one at a time, and even then it's not guaranteed. You could consider getting a canonical tag onto those specific pages that points to a different URL on your blog, such as an appropriate category page or the blog home page. That could help speed things up, but canonical tags themselves are only "hints" to Google.
Ultimately it's a time and patience thing.
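For example, in the head of a tag page you could add something like this (the URL here is made up — point it at whichever page you actually want to consolidate on, such as the relevant category page or the blog home page):

<link rel="canonical" href="http://www.example.com/blog/category/example-category/">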
-
It will take time, but you can help it along by using the url removal tool in Google Webmaster Tools. https://www.google.com/webmasters/tools/removals