Some URLs were not accessible to Googlebot due to an HTTP status error.
-
Hello, I'm an SEO newbie, and some help from the community here would be greatly appreciated.
I have submitted the sitemap of my website in Google Webmaster Tools and I got this warning:
"When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted."
How do I fix this? What should I do?
Many thanks in advance.
-
You need to confirm that 100% of the URLs going into the sitemap are in fact accessible; a quick scripted check like the one below will tell you.
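A minimal sketch of that check, using only the Python standard library; the sitemap URL below is a placeholder to swap for your own:

```python
# Fetch a sitemap and report the HTTP status of every URL in it.
# "https://example.com/sitemap.xml" is a placeholder, not a real sitemap.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_status(url):
    """Return the HTTP status code for url (HEAD request)."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "sitemap-checker"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 404, 403, 500, ...

def main():
    with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
        tree = ET.fromstring(resp.read())
    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        status = fetch_status(url)
        if status != 200:
            print(f"{status}  {url}")

if __name__ == "__main__":
    main()
```

Any line this prints is a URL Googlebot would also be tripping over (keeping in mind some servers answer HEAD and GET differently).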
If it's a big issue on a big site, send me the URL in a private message and I will use DeepCrawl to create an XML sitemap for you. The Screaming Frog tool is excellent as well, though it does not perform as well with extremely large sites.
Check your robots.txt file as well. These are great tools for that, especially in case you have more than one robots.txt file (it happens); a scripted check follows the links below:
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
or
http://tools.seochat.com/tools/robots-txt-validator/
So many great free tools can be found right here: http://tools.seochat.com/tools/
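As a scripted alternative to those validators, here is a minimal sketch using Python's built-in robotparser; the domain and sample paths are placeholders:

```python
# Ask the live robots.txt whether Googlebot may fetch some sample URLs.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder
rp.read()

for url in ["https://example.com/", "https://example.com/some-page/"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```

If a sitemap URL comes back BLOCKED here, that alone can explain the Webmaster Tools warning.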
It could be a number of things, although it could also just be Google being finicky. Run the site through the Moz crawler, use the SEO tools at feedthebot.com, or download the free version of http://www.screamingfrog.co.uk/seo-spider/; these will tell you if there is an issue. If your site is static you can even create an alternate sitemap with Screaming Frog; if your site is large, use DeepCrawl or Moz Analytics.
Be certain there are no sitemaps redirecting to each other, i.e. no redirects going from the old sitemap to the new sitemap. Make certain the sitemap is in XML format, e.g. http://example.com/sitemap.xml; if it resolves at a different URL such as https://example.com/sitemap_index.xml, make sure that exact URL is what goes into Webmaster Tools. Be certain no single sitemap file exceeds the protocol limit of 50,000 URLs; if it does, split it into example.com/sitemap1.xml and so on, numbering the files appropriately (a sketch of this follows below). Sometimes Google is overloaded and does not seem to play well with certain sitemaps, or the sitemap is not generating properly on the server and gets fixed later on; if it is a long-term problem, speak to your host or developer. My recommendation, once you've done everything I have talked about, is to resubmit the sitemap to Webmaster Tools, or simply build a new sitemap and submit that.
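For that splitting step, a minimal sketch; BASE, the output file names, and the generated URL list are all placeholders:

```python
# Split a long URL list into numbered sitemap files plus a sitemap
# index (sitemap_index.xml) that points at each one.
import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
BASE = "https://example.com"   # placeholder
CHUNK = 50_000                 # protocol limit of URLs per sitemap file

def write_chunks(urls):
    index = ET.Element("sitemapindex", xmlns=SM_NS)
    for i in range(0, len(urls), CHUNK):
        n = i // CHUNK + 1
        urlset = ET.Element("urlset", xmlns=SM_NS)
        for u in urls[i:i + CHUNK]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        ET.ElementTree(urlset).write(f"sitemap{n}.xml",
                                     xml_declaration=True, encoding="utf-8")
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = f"{BASE}/sitemap{n}.xml"
    ET.ElementTree(index).write("sitemap_index.xml",
                                xml_declaration=True, encoding="utf-8")

# Example: 120,000 dummy URLs produce sitemap1..3.xml plus the index.
write_chunks([f"{BASE}/page-{i}.html" for i in range(1, 120_001)])
```

You would then submit sitemap_index.xml (or each numbered file) to Webmaster Tools.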
If worst comes to worst, build the sitemap with Screaming Frog and use this URL to submit it to Google:
http://www.google.com/submityourcontent/business-owner/
I hope that helps,
Thomas
-
Hi, it looks like you have URLs placed in your sitemap that return an HTTP status error. You can look up those URLs and remove them from your sitemap, or make sure they return the right status. Does the warning say which status error? And does it say which URLs? Did you check those URLs? When you use the Screaming Frog spider tool (free), you can search for status errors; this is an easy way to find these URLs.
Grtz, Leonie
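A minimal sketch of the clean-up Leonie describes: read a local sitemap.xml, drop every URL that does not return HTTP 200, and write a cleaned copy. The file names are placeholders, and the status check mirrors the script further up.

```python
# Copy a sitemap, removing every <url> entry whose <loc> errors out.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace on output

def status_of(url):
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

tree = ET.parse("sitemap.xml")            # placeholder input file
root = tree.getroot()
for entry in list(root.findall(f"{{{NS}}}url")):
    loc = entry.find(f"{{{NS}}}loc").text.strip()
    if status_of(loc) != 200:
        print("removing", loc)
        root.remove(entry)
tree.write("sitemap.clean.xml", xml_declaration=True, encoding="utf-8")
```

Fixing the pages so they return 200 is usually better than removing them; this only trims entries that genuinely should not be in the sitemap.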