Google Webmaster Tools says access denied for 77 URLs
-
Hi, I am looking in Google Webmaster Tools and I have seen a major problem which I hope people can help me sort out.
The problem is, I am being told that 77 URLs are being denied access. When I look for more information, the message says the following:
Googlebot couldn't crawl your URL because your server either requires login to access the page, or is blocking Googlebot from accessing your site.
The response code is 403.
Here are a couple of examples:
http://www.in2town.co.uk/Entertainment-Magazine
http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone
I think the problem could be that I have sent them to another URL in my .htaccess file using the 403 redirect, but why would that report that Googlebot could not crawl them?
Any help would be great.
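A quick way to test whether the server really is treating Googlebot differently is to request one of the flagged URLs with a Googlebot-style User-Agent and with a plain browser User-Agent, then compare the results. This is only a rough standard-library Python sketch, not an official Google tool; the URL is one of the examples above and the User-Agent strings are illustrative:

```python
# Compare what the server returns to a browser-like request vs a
# Googlebot-like request for one of the URLs flagged in Webmaster Tools.
import urllib.request
import urllib.error

URL = "http://www.in2town.co.uk/Entertainment-Magazine"
AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for name, ua in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        # urlopen follows redirects, so this shows the final status and URL
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(name, resp.status, resp.geturl())
    except urllib.error.HTTPError as err:
        # A 403 here for "googlebot" but not for "browser" would point at a
        # User-Agent based rule in .htaccess
        print(name, err.code, URL)
```

If both requests come back with the same status, the block is probably not User-Agent based.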
-
Yup, deleted.
-
I have now deleted the old version. Can you check on this and make sure you can no longer see it?
-
Hi, thanks for that. I will delete the old one now.
-
In Webmaster Tools, you can use "Fetch as Googlebot", meaning you can enter one of those 77 URLs and see what Googlebot sees when it requests that URL.
You can also use:
http://www.dnsqueries.com/en/googlebot_simulator.php
For the URL: http://www.in2town.co.uk/Entertainment-Magazine
the Google Bot Simulator says:
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Showbiz-Gossip
and for: http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Weight-Loss-Hypnotherapy
Interestingly, both the NEW URLs work fine, although http://www.in2town.co.uk/Weight-Loss-Hypnotherapy doesn't look too good (at least in my web browser), but that's another issue.
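If you want to reproduce what the simulator reports from your own machine, here is a rough sketch (standard-library Python, nothing Google- or Moz-specific; the Googlebot User-Agent string is the commonly published one and the URLs are the two from this thread). It makes one request per URL, does not follow redirects, and prints the status code plus any Location header, which is the pair of values shown above:

```python
# Request each URL once with a Googlebot-style User-Agent, without
# following redirects, and print the status code and Location header.
import urllib.request
import urllib.error

URLS = [
    "http://www.in2town.co.uk/Entertainment-Magazine",
    "http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone",
]

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Return None so urllib raises HTTPError for 3xx instead of following it."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for url in URLS:
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        resp = opener.open(req, timeout=10)
        print(resp.status, url)
    except urllib.error.HTTPError as err:
        # 301, 302, 403 and so on all end up here once redirects are disabled
        print(err.code, url, "->", err.headers.get("Location", ""))
```

A 301 with a sensible Location is normally fine; it is a hard 403 for Googlebot that would explain the "access denied" errors.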
You have a fairly complex .htaccess file (hint: I looked up your OLD .htaccess file - you should delete old .htaccess files, or move them out of the web root, so people can't access them via a web browser), so I'm guessing the problem lies within your .htaccess file.
If possible, put a plain and simple .htaccess file in place, test it with Google Webmaster Tools and see if the error still persists.
Adam
Related Questions
-
Google Webmaster Tools is saying "Sitemap contains URLs which are blocked by robots.txt" after HTTPS move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our HTTPS move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?

robots.txt:

# This file is to prevent the crawling and indexing of certain parts of your site
# by web crawlers and spiders run by sites like Yahoo! and Google. By telling these
# "robots" where not to go on your site, you save bandwidth and server resources.
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
disallow: /?cat=
disallow: /?price=
disallow: /?flavor=
disallow: /?dir=
disallow: /?mode=
disallow: /?list=
disallow: /?limit=5
disallow: /?limit=10
disallow: /?limit=15
disallow: /?limit=20
disallow: /*?limit=250
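Since the errors appeared straight after the HTTPS move, one quick sanity check is to confirm that robots.txt is actually reachable on both schemes and to look at which Sitemap line each version serves. The sketch below is just an illustration using Python's standard library (the domain is taken from the question; this is not a Google or Moz tool):

```python
# Fetch robots.txt over http and https, print the status code, the final URL
# after any redirects, and any Sitemap: lines found in the file.
import urllib.request
import urllib.error

for scheme in ("http", "https"):
    url = scheme + "://www.bestpricenutrition.com/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            sitemaps = [line.strip() for line in body.splitlines()
                        if line.lower().startswith("sitemap:")]
            print(url, resp.status, resp.geturl(), sitemaps)
    except urllib.error.URLError as err:
        print(url, "error:", err)
```

If the HTTPS robots.txt is reachable and only the Sitemap line still points at http, updating that line and resubmitting the HTTPS sitemap is the obvious first step.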
-
Clarification on indexation of XML sitemaps within Webmaster Tools
Hi Mozzers, I have a large service-based website which seems to be losing pages from Google's index. Whilst working on the site, I noticed that there are a number of XML sitemaps for each of the services. So I submitted them to Webmaster Tools last Friday (14th), and when I left they were "pending". On returning to the office today, they all appear to have been successfully processed on either the 15th or 17th, and I can see the following data:

13/08 - Submitted=0 Indexed=0
14/08 - Submitted=606,733 Indexed=122,243
15/08 - Submitted=606,733 Indexed=494,651
16/08 - Submitted=606,733 Indexed=517,527
17/08 - Submitted=606,733 Indexed=517,498

Question 1: The 122,243 pages indexed on the 14th - is this how many pages were previously indexed, before Google processed the sitemaps, given that they were not marked as processed until the 15th and 17th?

Question 2: The indexed pages are already slipping. I'm working on fixing the site by reducing pages and improving internal structure and content, which I'm hoping will fix the crawling issue. But how often will Google crawl these XML sitemaps?

Thanks in advance for any help.
-
Sitelink demotion not working after submitting in Google webmaster tool
Hello Friends, I have a question regarding demotion of sitelinks in Google Webmaster Tools. Scenario: I demoted one of the sitelinks for my website two months back, and the demoted sitelink has still not been removed from the Google search results. May I know why this page is not being removed even after demoting it in GWT? If we resubmit the same link in the demotion tool one more time, will it work? Can anybody help me out with this? Note: Since a demotion is only valid for 3 months (90 days), I am concerned about this.
-
Marketing URL
Hi, I need a bit of advice on marketing URLs. The destination URL is http://www.website.com/by-development.php?area=Isle Of Wight&development=developmentname. If we wanted to use www.website.com/developmentname on literature to send people to the ugly URL above, what would we do? Would we need to rewrite the ugly URL to the neat one and then 301 the ugly to the neat? Currently the team are using a new domain of neatandrelevant.info and 301 redirecting it to the ugly URL, but there are lots of different developments they want to send people to, so a new domain is bought for each development, which seems a bit unnecessary. They point to different pages on the ugly-URL website. I'm assuming a canonical tag would not be needed then, because the ugly URL page would be redirected. Also, as the website has ugly URLs anyway, would it not be best practice to use rewrites so that the URLs read www.mywebsite.com/region/development? Would it confuse things to then have extra short marketing URLs missing out /region? Hope that makes sense....
-
Strange Webmaster Tools Crawl Report
Up until recently I had robots.txt blocking the indexing of my PDF files, which are all manuals for products we sell. I changed this last week to allow indexing of those files, and now my Webmaster Tools crawl report is listing all my PDFs as not found. What is really strange is that Webmaster Tools is listing an incorrect link structure: "domain.com/file.pdf" instead of "domain.com/manuals/file.pdf". Why is Google indexing these particular pages incorrectly? My robots.txt has nothing else in it besides a disallow for an entirely different folder on my server, and my .htaccess is not redirecting anything with regard to my manuals folder either. Even for the outside links present in the crawl report that supposedly link to this 404 file, when I visit those third-party pages they have the correct link structure. Hope someone can help, because right now my not-founds are up in the 500s and that can't be good 🙂 Thanks in advance!
-
Google webmaster errors
If you know what these Google Webmaster Tools errors mean, and you can explain them to me in simple English and tell me how I can locate the problem, I would really appreciate it: Server error, Soft 404, Access denied, Not found, Not followed. I have many of these errors; is it harming SEO? Yoseph
-
Long URL
I am using SEOmoz software as a trial. It has crawled my site and a report is telling me that the URL for my forum is too long:

Title: Healthy Living Community
Meta Description: Healthy life discussion forum chatting about all aspects of healthy living including nutrition, fitness, motivation and much more.
Meta Robots: noodp, noydir
Meta Refresh: Not present/empty

1 Warning: Long URL (> 115 characters), found about 17 hours ago.
Number of characters: 135 (over by 21)
Description: A good URL is descriptive and concise. Although not a high priority, we recommend a URL that is shorter than 75 characters.
URL: http://www.goodhealthword.com/forum/reprogramming-health/welcome-to-the-forum-for-discussing-the-4-steps-for-reprogramming-ones-health/

The problem is that when I check the page via edit or in the admin section of WordPress, the URL is as follows: http://www.goodhealthword.com/forum/
My question is: I cannot see where this long URL is located; it appears to be a valid page but I can't find it. Thanks, Pete
-
Google Penalize?
Hello, I read a statement somewhere which claimed: "2 identical URLs linked to 2 different popular key phrases next to each other (on the same website/domain) will lead to a Google penalty. Google knows that both terms are popular. This means Google will ignore the links to your site (you'll not have any benefit) and the site you have your links on loses authority." What are your thoughts on this statement? Thank you.