Best blocking solution for Google
-
Posting this for Dave Sottimano! Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:

www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.

To remove the pages from the index, should you:

- Add the meta noindex,follow tag to each URL you want de-indexed
- Use GWT to help remove the pages
- Wait for Google to crawl again

If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:

Disallow: */beerbottles/

or this line:

Disallow: /beerbottles/

"To add the * or not to add the *, that is the question." Thanks! Dave
-
Following up here -- did this answer Dave's question?
-
I would put noindex,follow on those pages and wait a little until they disappear from Google's index. Of course, if you only have a few pages, I would do it manually in GWT. If you have a rather big site with a good crawl rate, this should be done in a few days.
When you don't see them anymore, you could use Disallow: */beerbottles/, but this could be annoying later. I would recommend using the meta robots tag instead, as you have more control over it. It will also allow PageRank to keep flowing into the beerbottles pages!
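Just to make the difference between the two lines concrete, here is a rough Python sketch - my own approximation of the prefix-plus-wildcard matching Google documents for robots.txt, not an official tool - run against the sample paths from the question:

```python
import re

# Paths taken from the sample URLs in the question.
SAMPLE_PATHS = [
    "/beers/brandofbeer/beerbottles/1",
    "/beers/brandofbeer/beerbottles/2",
    "/beers/brandofbeer/beerbottles/3",
]

def disallow_matches(pattern: str, path: str) -> bool:
    """Approximate Google-style robots.txt matching: the rule is a prefix
    match against the URL path, '*' matches any run of characters, and a
    trailing '$' anchors the match to the end of the path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "^" + ".*".join(re.escape(piece) for piece in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

for rule in ("/beerbottles/", "*/beerbottles/"):
    blocked = [p for p in SAMPLE_PATHS if disallow_matches(rule, p)]
    print(f"Disallow: {rule:<16} blocks {len(blocked)} of {len(SAMPLE_PATHS)} sample paths")
```

If that approximation is right, the plain Disallow: /beerbottles/ never matches, because the folder sits two directories deep and Disallow rules match from the start of the path, while the */beerbottles/ version catches all three. Either way, it's worth double-checking the live rule with the robots.txt test in Webmaster Tools.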
-
I believe you can also confirm the block via Webmaster Tools.
-
Hi Goodnewscowboy,
To block the whole folder you don't need to use the wildcard (*),
and I advise you to also take these steps:
- Verify your ownership of the site in Webmaster Tools.
- On the Webmaster Tools home page, click the site you want.
- On the Dashboard, click Site configuration in the left-hand navigation.
- Click Crawler access, and then click Remove URL.
- Click New removal request.
- Type the URL of the page you want removed, and then click Continue. Note that the URL is case-sensitive—you will need to submit the URL using exactly the same characters and the same capitalization that the site uses.
- Select Remove page from cache only.
- Select the checkbox to confirm that you have completed the requirements listed in this article, and then click Submit Request (a quick way to sanity-check those requirements is sketched below).
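For the requirements step, a rough pre-check can save you a rejected request. Here is a minimal Python sketch (standard library only; the URLs are just the hypothetical beerbottles examples from the question, so swap in your own) that reports whether each URL already returns a 404/410 or carries a noindex signal:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Hypothetical example URLs from the question; replace with the real ones.
URLS_TO_CHECK = [
    "http://www.example.com/beers/brandofbeer/beerbottles/1",
    "http://www.example.com/beers/brandofbeer/beerbottles/2",
]

def removal_ready(url: str) -> str:
    """Rough pre-check: a URL looks ready for a removal request if it
    returns 404/410, sends an X-Robots-Tag noindex header, or (crudely
    detected here by substring) carries a meta robots noindex tag."""
    req = Request(url, headers={"User-Agent": "removal-precheck"})
    try:
        with urlopen(req, timeout=10) as resp:
            header = (resp.headers.get("X-Robots-Tag") or "").lower()
            body = resp.read(200_000).decode("utf-8", errors="replace").lower()
    except HTTPError as err:
        if err.code in (404, 410):
            return "ok (returns HTTP %d)" % err.code
        return "check manually (HTTP %d)" % err.code
    except URLError as err:
        return "could not fetch (%s)" % err.reason
    if "noindex" in header:
        return "ok (X-Robots-Tag noindex)"
    if "noindex" in body and "robots" in body:
        return "probably ok (meta noindex found)"
    return "not ready (no 404/410 and no noindex found)"

for url in URLS_TO_CHECK:
    print(url, "->", removal_ready(url))
```

It deliberately errs on the side of "check manually"; anything it can't positively confirm should get a manual look before you submit the removal request.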
Cheers
Related Questions
-
Google Ecommerce Alerts
I recently started getting email notifications from Google re: new products on our websites. I am subscribed to Google alerts. Can anyone shed some light on this?
Technical SEO | | AMHC0 -
Google Update Frequency
Hi, I recently found a large number of duplicate pages on our site that we didn't know existed (our third-party review provider was creating a separate page for each product whether it was reviewed or not; the ones not reviewed are almost identical, so they have been noindexed). Question - how long do you typically have to wait for Google to pick this up on our site? Is it a normal crawl, or do we need to wait for the next Panda review (if there is such a thing)? Thanks much.
Technical SEO | | trophycentraltrophiesandawards0 -
Block Domain in robots.txt
Hi. We had some URLs that were indexed in Google from a www1-subdomain. We have now disabled the URLs (returning a 404 - for other reasons we cannot do a redirect from www1 to www) and blocked via robots.txt. But the amount of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain infos by personal message if you want to have a look at it.
Technical SEO | | zeepartner0 -
Google Rejects Merchant Feed
Buon giorno from 1 degree C, nearly dark and icy Wetherby, UK... WTF, Google's Merchant Centre has rejected my feed and, in the time-honoured zero customer service I've grown accustomed to getting from Santa Clara County, I've got bugger all idea how to fix it. Here is the feed: http://ramsdensforcash.co.uk/sitefiles/handlers/googlemerchantdatafeed.ashx And here is the violation from the Google Gods themselves: http://ramsdensforcash.co.uk/sitefiles/handlers/googlemerchantdatafeed.ashx Anyone got any ideas why Google has given me the middle finger? Grazie tanto, David
Technical SEO | | Nightwing0 -
Google Places for Local SEO
I am a webmaster at a company with over 50 clients, and I have to list the businesses of our clients in Google Places. Most of our clients are architecture agencies and construction companies, so they are unfamiliar with these things, and that's why I have to list their businesses on Google Places. It would be easier for me to manage all the places for these different businesses if I create the places with one gmail account. Can I use one gmail account to list the businesses for all our clients?
Technical SEO | | Arianittt2 -
Google rankings tanked....Now what?
We just experienced a drop in Google rankings, some pretty harsh, across all of the keywords we have been ranking greater than 50. I'm a noob at SEO, but a technical noob, so I started doing my homework. I've seen references to the "Google dance" and "honeymoon", but this hit seems to have affected competitors too. Everyone seems re-ranked, with several junk directories jumping up more than I think they should. Has anyone else seen this? Is this more Google algorithm adjustment or a natural settling based on our new SEO attempts? In either case, what should we do next? I know there is a holistic approach and everything is important; however, we need bang for the buck at this point before we start bleeding. One or two next steps? Our industry is residential cleaning and the site is www.bitabliss.com. Here is a little history:
The site has been running for about 2 years. We initially put up a very basic "throw something up" site without much thought of SEO except for some basics and a long-tail approach with a blog, Facebook and Twitter. We launched an updated site on Feb 23 with a new theme and, this time, some "on page" work to better hit the basics. The site structure was kept the same and we added some more localized content in hopes of taking advantage of local searches. Also, enter SEOmoz to get us tracking things (Yay Moz). Until yesterday, we had been doing pretty well in some of our target cities even with the more basic site. When we launched the new site focusing on page titles, descriptions, page content, and a few directory attempts, we started to see some incremental growth. It seemed to me that this kind of growth meant that we were doing the right things and doing a better job than some of the other sites. Anyway, yesterday we got smacked down. This seems too harsh for the slow increases we have seen over the last month. Any thoughts you have would be greatly appreciated. Thanks! -Shawn
Technical SEO | | BitABliss1 -
Google Places and Name Change
Hello - I have a client who is a realtor and changed agencies. I edited their Google Places entry and the new name of their agency and address are showing - but so is their old listing. The agency they left is now trying to sue them for showing up in a number one position with Google Places under their agency name. Is this an indexing issue with Google? Their name shows up under both agency names. The corrected one shows most often, but the old one is still popping up on occasion. Thanks,
Technical SEO | | seoessentials1 -
Google dropping pages from SERPS
The website for my London-based plumbing company has thousands of specifically tailored pages for the various services we provide to all the areas in London - approximately 6,000 pages in total. When Google has all these pages indexed, we tend to get a fair bit of traffic, as they cater pretty well for long-tail searches. However, every once in a while Google will drop the vast majority of our indexed pages from the SERPs for a few days or weeks at a time - for example, at the moment Google is only indexing 613, whereas last week it was back at the normal ~6000. Why does this happen? We of course lose a lot of organic traffic when these pages aren't displayed - what are we doing wrong? Website: www.pgs-plumbers.co.uk
Technical SEO | | guy_andrews0