Google has deindexed 40% of my site because it's having problems crawling it
-
Hi
Last week I got my fifth email saying 'Google can't access your site'; the first one arrived in early November. Since then the site has gone from almost 80k pages indexed to fewer than 45k, and the number keeps dropping even though we publish about 100 new articles a day (it's an online newspaper).
The site I'm talking about is http://www.gazetaexpress.com/
We have to deal with DDoS attacks most of the time, so our server admin has implemented a firewall to protect the site from these attacks. We suspect it's the firewall that is blocking Googlebot from crawling and indexing the site. But here is where it gets more interesting: some parts of the site are being crawled regularly and others not at all. If the firewall were stopping Googlebot from crawling the site, why are some sections crawled with no problems while others aren't?
In the screenshot attached to this post you can see how Google Webmaster Tools is reporting these errors.
The linked help page says that if the 'Error' status happens again you should contact Google Webmaster support, because something is preventing Google from fetching the site. I used the feedback form in Google Webmaster Tools to report this error about two months ago but haven't heard back. Did I use the wrong form to contact them? If so, how can I reach them and explain my problem?
If you need more details, feel free to ask. I would appreciate any help.
Thank you in advance
-
Great news - strange that these 608 errors didn't appear while crawling the site with Screaming Frog.
-
We found the problem: it was website compression (GZIP). I discovered this after crawling the site with Moz and seeing lots of pages with a 608 error code. I then searched Google and found a response by Dr. Pete to another question here in the Moz Q&A (http://moz.com/community/q/how-do-i-fix-608-s-please).
After we removed GZIP compression, Google could crawl the site with no problems.
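For anyone else who runs into 608s, here's a rough sketch of the kind of check that exposes the difference - requesting the same page with and without compression and comparing what comes back. It's an illustration only, not our exact script, and the example URL and user agent are just placeholders:

```python
# Does the server respond sanely both with and without gzip compression?
# Sketch only - test a handful of URLs from different sections of the site.
import requests

URL = "http://www.gazetaexpress.com/"  # example page
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

for encoding in ("gzip, deflate", "identity"):
    try:
        resp = requests.get(
            URL,
            headers={"Accept-Encoding": encoding, "User-Agent": UA},
            timeout=30,
        )
        print(
            f"Accept-Encoding: {encoding!r} -> status {resp.status_code}, "
            f"Content-Encoding: {resp.headers.get('Content-Encoding')}, "
            f"{len(resp.content)} bytes after decoding"
        )
    except requests.RequestException as exc:
        # A broken gzip stream usually surfaces here as a decoding error.
        print(f"Accept-Encoding: {encoding!r} -> request failed: {exc}")
```

If the compressed request errors out or returns a truncated body while the uncompressed one is fine, the compression setup is the likely culprit.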
-
Dirk
Thanks a lot for your help. Unfortunately, the problem remains the same. More than 65% of the site has been de-indexed, and it's making our work very difficult.
I'm hoping somebody here has an idea of what is causing this so we can find a solution.
Thank you all for your time.
-
Hi
Not sure if the indexing problem is solved now, but I did a few other checks. Most of the tools I used were able to fetch the problem URL without much trouble, even from California IPs and while simulating Googlebot.
I noticed that some of the pages (for example http://www.gazetaexpress.com/fun/) are quite empty if you browse them without JavaScript enabled. Navigating the site with JavaScript is extremely slow, and a lot of links don't seem to respond. When I tried to go from /fun/ to /sport/ without JavaScript, I got a 504 Gateway Time-out.
Google is now normally capable of indexing content by executing JavaScript, but it's always better to have a non-JavaScript fallback that can always be indexed (http://googlewebmastercentral.blogspot.be/2014/05/understanding-web-pages-better.html). The article explicitly states:
- If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
This could be the reason for the strange errors you see when trying to Fetch as Google.
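If it helps, here's a rough sketch of the kind of plain-HTTP check I mean - requesting a few section pages without any JavaScript execution and looking at the status code and how much raw HTML comes back. The URL list is just an example; adjust it to the sections you care about:

```python
# Fetch a few section pages the way a non-JS crawler would, and report the
# status code and raw HTML size. A 504 or a near-empty body is a red flag.
import requests

PAGES = [
    "http://www.gazetaexpress.com/",
    "http://www.gazetaexpress.com/fun/",
    "http://www.gazetaexpress.com/sport/",
]
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

for url in PAGES:
    try:
        resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=60)
        print(f"{url} -> HTTP {resp.status_code}, {len(resp.text) / 1024:.1f} KB of raw HTML")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```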
Hope this helps,
Dirk
-
Hi Dirk
Thanks a lot for your reply.
Today we turned off the firewall for a couple of hours and tried to fetch the site as Google. It didn't work; the results were the same as before.
This problem is getting pretty ugly, because Google has now also stopped showing our results as 'mobile-friendly' even though we have a mobile version of the site. We use rel=canonical and rel=alternate annotations, plus 302 redirects that send smartphone users from the desktop pages to the mobile ones.
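For context, this is roughly how we sanity-check that the annotations are actually present on a desktop/mobile page pair. It's only a sketch; the URLs below are placeholders, not our real paths:

```python
# Check that a desktop page links to its mobile version via rel="alternate"
# and that the mobile page points back via rel="canonical".
# Sketch only - the URLs are placeholders and the regexes assume the rel
# attribute appears before href inside the <link> tag.
import re
import requests

DESKTOP_URL = "http://www.gazetaexpress.com/some-article/"  # placeholder
MOBILE_URL = "http://m.gazetaexpress.com/some-article/"     # placeholder

# Use a desktop user agent so the 302 redirect to mobile doesn't kick in.
UA = "Mozilla/5.0 (Windows NT 6.1; rv:36.0) Gecko/20100101 Firefox/36.0"

desktop_html = requests.get(DESKTOP_URL, headers={"User-Agent": UA}, timeout=30).text
mobile_html = requests.get(MOBILE_URL, headers={"User-Agent": UA}, timeout=30).text

has_alternate = re.search(
    r'<link[^>]+rel=["\']alternate["\'][^>]+href=["\']' + re.escape(MOBILE_URL),
    desktop_html, re.IGNORECASE)
has_canonical = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']' + re.escape(DESKTOP_URL),
    mobile_html, re.IGNORECASE)

print("rel=alternate on desktop page:", "found" if has_alternate else "MISSING")
print("rel=canonical on mobile page:", "found" if has_canonical else "MISSING")
```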
Do you have any other ideas about what might be causing this?
Thanks in advance
-
Hi,
It seems that your pages are extremely heavy to load. I ran two tests: one on your homepage and one on the /moti-sot page.
Your homepage needed a whopping 73 seconds to load (http://www.webpagetest.org/result/150312_YV_H5K/1/details/). The /moti-sot page is quicker, but 8 seconds is still rather high (http://www.webpagetest.org/result/150312_SK_H9M/).
I sometimes saw the Shockwave Flash plugin crash, but I'm not sure whether that's related to your problem. I crawled your site with Screaming Frog and it didn't really find any indexing problems: while you have a lot of pages very deep in your site structure, the bot didn't seem to have any specific trouble accessing them. Websniffer returns a normal 200 code when checking your site, even with the user agent set to "Google".
So I guess you're right about the firewall; maybe it's blocking the IP addresses used by Googlebot. Do you have reporting from the firewall on which traffic gets blocked? Try searching for the Googlebot user agent in your log files and see whether that traffic is rejected. The fact that some sections are indexed and others are not could be related to the firewall configuration and/or the IP addresses Googlebot uses to check your site (the bot doesn't always use the same IP address).
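As a rough illustration, something along these lines can pull the Googlebot hits out of an access log and verify the IPs via reverse DNS. The log path and the combined log format are assumptions - adjust them to your own server:

```python
# Find requests claiming to be Googlebot in an access log, count status codes
# per IP, and check whether each IP reverse-resolves to googlebot.com/google.com
# (a full verification would also forward-resolve the hostname back to the IP).
# Sketch only - the log path and log format below are assumptions.
import re
import socket
from collections import Counter, defaultdict

LOG_PATH = "/var/log/apache2/access.log"  # placeholder - point at your real log

# Combined log format: IP - - [date] "REQUEST" STATUS SIZE ...
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3})')

statuses_by_ip = defaultdict(Counter)
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = LINE_RE.match(line)
        if match:
            ip, status = match.groups()
            statuses_by_ip[ip][status] += 1

for ip, statuses in sorted(statuses_by_ip.items()):
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        host = "no reverse DNS"
    label = "verified" if host.endswith((".googlebot.com", ".google.com")) else "unverified"
    print(f"{ip} ({host}, {label}): {dict(statuses)}")
```

If verified Googlebot IPs show 403s, 5xx responses, or dropped connections only for certain sections, that points straight at the firewall rules for those paths.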
Hope this helps,
Dirk