How to Block URLs with specific components from Googlebot
-
Hello,
I have around 100,000 error pages showing in Google Webmaster Tools. I want to block specific components such as com_fireboard, com_seyret, com_profiler, etc.
Few examples:
I tried blocking them using robots.txt. I just used this:
Disallow: /com_fireboard/
Disallow: /com_seyret/
But it's not working. Can anyone suggest how to solve this problem?
Many Thanks
Shradda
-
I agree with Sha that your 404 page has a nice appearance. My main concern is that it lacks functionality.
If I click on a link to your site and end up on that page, what is my next action? Most likely I would hit the back button on my browser and leave your site, or type in a new URL.
I recommend you offer users the option to stay on your site. Your site navigation, a search box, some links, anything would be helpful.
-
Hi Shradda,
I agree with Ryan that a meta noindex tag is the preferable way to block these pages, but there may be difficulties applying the tag, depending on how your pages are generated and whether you are able to alter the code.
You can also match on ?option=com_fireboard (and the other components) to create 301 redirects back to a higher-level category page or to your search page.
You should be able to use a single line of code to 301 all pages within each directory.
Using 301 redirects will also send a signal to search engines to de-index those pages.
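A rough sketch of that approach, assuming the site runs on Apache with mod_rewrite available (and using /community/ purely as a placeholder landing page, not a real path on your site), could look something like this in .htaccess:

# Sketch only: assumes Apache with mod_rewrite; /community/ is a placeholder target page
RewriteEngine On
# Match any request whose query string contains one of the unwanted components
RewriteCond %{QUERY_STRING} option=com_(fireboard|seyret|profiler) [NC]
# 301 to the chosen landing page; the trailing "?" drops the old query string
RewriteRule ^ /community/? [R=301,L]

Test it against a handful of the affected URLs first, since a rule this broad will also redirect any legitimate pages that use those components.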
Very clever 404 page too! Had to watch him go all the way across the page and back just so I knew I wasn't missing anything!
Sha
-
You can log into Google Webmaster Tools and adjust your parameter settings; that feature was designed for exactly this purpose. Go to Site Parameters > URL Parameters. If you use this solution, be sure to do the same in Bing Webmaster Tools as well.
A better solution would be to noindex the pages. Using robots.txt should be avoided when possible.
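For reference, the tag being recommended here is the standard robots meta tag, placed in the <head> of each page you want dropped from the index. Exactly where you add it depends on how your CMS builds these component pages, so treat this as a minimal illustration rather than a drop-in fix:

<meta name="robots" content="noindex, follow">

The "follow" value lets crawlers continue to pass through links on those pages even though the pages themselves are kept out of the index.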
If you do need to use robots.txt, be aware that your current Disallow statement tells crawlers not to crawl a folder named "com_fireboard". Your intention is to block URLs containing the parameter ?option=com_fireboard. I know wildcards work for the trailing portion of a path, but I have not tried them at the beginning of a path.
I suggest you try the following:
Disallow: ?option=com_fireboard
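If that exact pattern does not match (Disallow values are normally matched from the start of the URL path, which begins with a slash), a hedged alternative is to lean on the * wildcard, which Googlebot does support. These lines assume the option parameter can appear anywhere after the leading slash; verify the behaviour with the robots.txt testing tool in Webmaster Tools before relying on it:

Disallow: /*option=com_fireboard
Disallow: /*option=com_seyret
Disallow: /*option=com_profiler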
For more on the robots.txt file, please view the following site: http://www.robotstxt.org/