Best blocking solution for Google
-
Posting this for Dave Sottimano! Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
- Add a meta robots noindex,follow tag to each URL you want de-indexed
- Use GWT to help remove the pages
- Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
Disallow: */beerbottles/
or this line:
Disallow: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
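One point of fact behind the question: robots.txt rules match from the beginning of the URL path, so Disallow: /beerbottles/ only blocks URLs that start with /beerbottles/ at the root, and would not match the nested /beers/brandofbeer/beerbottles/ paths above. Googlebot does support the * wildcard (an extension beyond the original robots.txt standard, so other crawlers may ignore it). A minimal sketch of the wildcard version, assuming the rule should apply to all user agents:

    User-agent: *
    Disallow: */beerbottles/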
-
Following up here -- did this answer Dave's question?
-
I would put noindex,follow on those pages and wait a little until they disappear from Google's index. Of course, if you have only a few pages, I would do it manually in GWT. If you have a rather big site with a good crawl rate, this should be done in a few days.
When you don't see them anymore, you may use Disallow: */beerbottles/, but this could be annoying later. I would recommend using the meta robots tag, as you have more control over it. It will allow PageRank to flow into the beerbottles pages too!
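For reference, a minimal sketch of the meta robots tag described above, placed in the <head> of each page to be de-indexed:

    <meta name="robots" content="noindex,follow">

One reason the robots.txt route "could be annoying later": once a path is disallowed in robots.txt, Googlebot can no longer crawl those pages at all, so it can never see a noindex tag on them. That is why the order matters here: de-index first, then add the block if you still want it.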
-
I believe you can also confirm the block via Webmaster Tools.
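If you want to sanity-check which URLs a wildcard rule would catch before relying on Webmaster Tools, here is a simplified Python sketch of Google-style pattern matching. It is an approximation for illustration only, not Google's actual parser, and note that the standard library's urllib.robotparser does plain prefix matching, so it will not evaluate the * correctly:

    import re

    def google_rule_matches(rule, path):
        # Google-style matching: '*' matches any run of characters,
        # and a trailing '$' anchors the end of the URL path.
        pattern = re.escape(rule).replace(r"\*", ".*")
        if pattern.endswith(r"\$"):
            pattern = pattern[:-2] + "$"
        return re.match(pattern, path) is not None

    for path in ["/beers/brandofbeer/beerbottles/1",
                 "/beers/brandofbeer/beerbottles/2",
                 "/beers/brandofbeer/somethingelse"]:
        print(path, google_rule_matches("*/beerbottles/", path))
    # "*/beerbottles/" matches the first two paths; a plain "/beerbottles/"
    # rule would match none of them, because none begin with it at the root.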
-
Hi Goodnewscowboy,
To block the whole folder you don't need to use the wildcard (*),
and I'd advise you to also follow these steps:
- Verify your ownership of the site in Webmaster Tools.
- On the Webmaster Tools home page, click the site you want.
- On the Dashboard, click Site configuration in the left-hand navigation.
- Click Crawler access, and then click Remove URL.
- Click New removal request.
- Type the URL of the page you want removed, and then click Continue. Note that the URL is case-sensitive—you will need to submit the URL using exactly the same characters and the same capitalization that the site uses.
- Select Remove page from cache only.
- Select the checkbox to confirm that you have completed the requirements listed in this article, and then click Submit Request.
Cheers
-
Related Questions
-
Google Penalty and AdWords
Hi guys, I am wondering if a Google manual penalty, or a penalty in general (because of a bad backlink profile), also means that your website is blocked from AdWords? Thanks
Technical SEO | barobijav0
-
Google SERP pagination issue
We are a local real estate company and have landing pages for different communities and cities around our area that display the most recent listings. For example: www.mysite.com/wa/tumwater is our landing page for the city of Tumwater homes for sale. Google has indexed most of our landing pages, but for whatever reason they are displaying either page 2, 3, 4 etc. instead of page 1. Our Roy, WA landing page is another example. www.mysite.com/wa/roy has recently been showing up on page 1 of Google for "Roy WA homes for sale", but now we are much further down, and www.mysite.com/wa/roy?start=80 (page 5) is the only page in the SERPs. (Coincidentally, we no longer have 5 pages' worth of listings for this city, so this link now redirects to www.mysite.com/wa/roy.) We haven't made any major recent changes to the site. Any help would be much appreciated! *You can see what my site is in the attached image... I just don't want this post to show up when someone Googles the actual name of the business 🙂
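For a paginated series like this, one signal worth checking is rel="prev"/"next" markup in the <head> of each page in the series, which tells Google that the ?start= pages belong to one sequence beginning at the landing page. A hedged sketch for the page-5 URL from the question (the ?start= values are placeholders inferred from its pattern; the first page carries only a rel="next" and the last page only a rel="prev"):

    <!-- in the <head> of www.mysite.com/wa/roy?start=80 -->
    <link rel="prev" href="http://www.mysite.com/wa/roy?start=60" />
    <link rel="next" href="http://www.mysite.com/wa/roy?start=100" />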
Technical SEO | summithomes0
-
Blocking subdomains without blocking sites...
So let's say I am working for bloggingplatform.com, and people can create free sites through my tools, and those sites show up as myblog.bloggingplatform.com. However, that site can also be accessed from myblog.com. Is there a way, separate from editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting them index myblog.com, without inserting any code into the page load? This is a simplification of a problem I am running across. Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending sub-domain holders (which we do), I am looking for a way to stop Google from indexing those domains at all (they are used for technical purposes, and not for users to find the sites). Thoughts?
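If the platform terminates requests for all customer subdomains at a shared front end, one approach that touches no page code is an X-Robots-Tag response header set at the server level. A hedged sketch, assuming nginx is that front end (the hostname regex and the omitted proxy settings are illustrative, not the platform's real config):

    # Send a noindex header for every *.bloggingplatform.com host,
    # without touching the HTML that myblog.com serves.
    server {
        listen 80;
        server_name ~^.+\.bloggingplatform\.com$;
        add_header X-Robots-Tag "noindex" always;
        # ... existing root/proxy configuration for the blogs ...
    }

Since myblog.com is requested under its own hostname, it never matches this server block and stays indexable.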
Technical SEO | SL_SEM1
-
WordPress Blog Blocked by Meta Robots
Upon receiving my first crawl report from my new pro SEOmoz account (yaay!), I've found that the WordPress blog plugged into my site hasn't been getting crawled due to being blocked by meta robots. I'm not a developer and have very little tech expertise, but a search dug up that the issue stemmed from the WordPress site settings > Privacy > "Ask search engines not to index this site" option being selected. On checking the blog, "Allow search engines to index this site" was selected, so I'm unsure what else to check. My level of expertise means I'm not confident going into the back end of the site, and I don't have a tech guy on site to speak to. Has anyone else had this problem? Is it common, and will I need to consult a developer to get this fixed? Many thanks in advance for your help!
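One low-risk check before consulting a developer (a sketch of what to look for, assuming a standard WordPress install): open the blog in a browser, view the page source, and search the <head> for a robots meta tag, which the privacy setting typically emits as something like:

    <meta name="robots" content="noindex,nofollow" />

If the privacy setting already allows indexing but that tag is still present, an SEO or caching plugin may be adding it independently, which is worth checking in the plugin settings. Also look at yourblog.com/robots.txt, since the same privacy option makes WordPress serve a blanket Disallow: / there.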
Technical SEO | paj19790
-
Client has 3 websites for various locations & duplicate content is a big issue... Is my solution the best?
Hi guys, I have a client who has 3 websites, all for different locations in the same state in Australia. Obviously this is not best practice, but in the meeting he said that each area is quite particular about where they do business. What he means is that people from one area want to do business with a website from that particular area. He has 3 domains, and we have duplicate content issues. We are solving these at the moment with the canonical tag; however, they are redesigning the site soon. My suggestion is that we have 1 domain and subdomains for the other 2 areas. This way the people from that area will see the company is from their area. Also, this way we have 1 domain to optimise and build domain authority for. Has anyone else come across this, and is my solution the best for this? Thanks! Jon
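For reference, a minimal sketch of the canonical tag mentioned above: a link element in the <head> of each duplicate page pointing at the preferred URL (the domain and path here are placeholders). Google honors these across domains, which is what makes the approach workable while the 3 separate sites still exist:

    <link rel="canonical" href="http://www.preferred-domain.com.au/services/" />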
Technical SEO | Jon_bangonline0
-
Remove Site from Google
How can I get my website out of Google? I want all pages completely gone. Thanks!
Technical SEO | tylerfraser0
-
About Google Spider
Hello, people! I have some questions regarding the Google spider. Many people are saying that "Google spiders only have US IP addresses." Is this really true? But I also saw a video from Google's official blog that said "Google spiders come from all around the world." At this point I am really confused. Q1) I researched, and it seems like Google spiders have only US IP addresses. Then what exactly is meant by "Google spiders come from all around the world"? Q2) If Google spiders have only US IP addresses, what happens to sites which use IP delivery? Does this mean that Google spiders are always redirected to the US site, since they only have US IPs? Can anyone help me to understand? One more question! When Google analyzes for cloaking issues, do you think Google analyzes when the spider crawls the site or after it has crawled the site?
Technical SEO | Artience0
-
Google causing Magento Errors
I have an online shop run using Magento. I have recently upgraded to version 1.4, and I installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant that I had to disable the module, because it was generating errors relating to the session if you entered the site from any page other than the home page. The site is now working as expected. I have Magento's error notification set to email; I've not received emails for errors generated by visitors. However, over a 72-hour period, I received a deluge of error emails which were being caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example:
URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock
IP Address: 66.249.66.186
Time: 2011-06-11 17:02:26 GMT
Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444
So, several things of note:
- I deleted lightspeed.php from the server before any of these error messages began to arrive.
- lightspeed.php was never exposed in the URL at any time. It was referred to in a mod_rewrite rule in .htaccess, which I also commented out.
- If you clicked on the URL in the error message, it loaded in the browser as expected, with no error messages.
It appears that Google has cached a version of the page which briefly existed whilst Lightspeed was enabled. But I thought that Google cached generated HTML. Since when does Google cache a server-side PHP file? I've just used the Fetch as Googlebot facility in Webmaster Tools for the URL in the above error message, and it returns the page as expected. No errors. I've had no errors at all in the last 48 hours, so I'm hoping it's just sorted itself out. However, I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks Ben
Technical SEO | atticus70