Getting an error in Webmaster Tools
-
My site was running perfectly for the last year...
I don't know what happened; now Google shows an error whenever I try to use the Fetch option in Webmaster Tools.
-
Thanks everyone,
The issue was with my site's CloudProxy firewall. I disabled it, and now Google is indexing my articles again.
-
Errors indicated in Webmaster Tools are often inaccurate. Unfortunately, there is little to do other than manually go through the errors and see for yourself whether there actually are problems. You may want to compare the indications in Webmaster Tools with those of other crawlers (such as Open Site Explorer).
-
Without the site it is indeed difficult to assess the situation. You could try a crawl with Screaming Frog - after the crawl, go to the 'Directives' tab, which will show you whether there are issues with your settings (robots.txt / noindex tag / ...).
You could also check this question - http://moz.com/community/q/google-has-deindexed-40-of-my-site-because-it-s-having-problems-crawling-it - in that case the problem was caused by the site's gzip compression.
rgds,
Dirk
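In the Moz Q&A linked above, the deindexing was traced to broken gzip compression: the server declared gzip-encoded responses but the crawler could not decompress them. As a rough sketch (not the exact diagnosis from that thread), you can test whether a response body that claims to be gzip actually decompresses:

```python
import gzip
import zlib

def check_gzip_body(body: bytes) -> bool:
    # Return True if the body is valid gzip. A server that sends
    # "Content-Encoding: gzip" with a corrupt body makes crawlers
    # error out on the page, even though browsers may recover.
    try:
        gzip.decompress(body)
        return True
    except (OSError, zlib.error):
        return False

good = gzip.compress(b"<html><title>ok</title></html>")
print(check_gzip_body(good))         # True
print(check_gzip_body(b"not gzip"))  # False
```

In practice you would run this against the raw bytes fetched from the live site with an `Accept-Encoding: gzip` request header.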
-
Check the robots.txt file and look for any noindex tags in the page's head. Otherwise, it's impossible to investigate without seeing the site.
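A quick way to sanity-check both of those signals is with Python's standard library: parse the robots.txt rules, then scan the page head for a robots meta tag. The robots.txt content and HTML below are made-up samples, not taken from any real site:

```python
from urllib import robotparser
from html.parser import HTMLParser

# Hypothetical robots.txt content for illustration
ROBOTS_TXT = """User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Can Googlebot fetch the home page under these rules?
allowed = rp.can_fetch("Googlebot", "https://example.com/")

class NoindexFinder(HTMLParser):
    # Flag pages whose <head> carries <meta name="robots" content="noindex...">
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

SAMPLE_HTML = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
finder = NoindexFinder()
finder.feed(SAMPLE_HTML)
print(allowed, finder.noindex)  # True True
```

Note that `urllib.robotparser` implements the original robots.txt rules only; it does not understand Googlebot's `*`/`$` wildcard extensions.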
-
Hi Srinu,
What error is showing? Please expand your question.
Thanks
Related Questions
-
Search Console - Mobile Usability Errors
A site I'm looking at for a client had hundreds of pages flagged as having Mobile Usability errors in Search Console. I found that the theme uses parameters in the URLs of some of its resources (.js/.css) to identify version strings. These were being blocked by a rule in the robots.txt: "Disallow: /*?" I've removed this rule, and now when I inspect URLs and test the live versions of the pages, they are reported as mobile friendly. I then submitted validation requests in Search Console for both of the errors ("Text too small" and "Clickable elements too close"). My problem now is that the validation has completed and the pages are still being reported as having the errors. I've double-checked, and they're fine if I inspect them individually. Does anyone else have experience clearing these issues in Search Console? Any ideas what's going on here?
Technical SEO | | DougRoberts1 -
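The "Disallow: /*?" rule in the question above uses Googlebot's wildcard extension, which matches any URL containing a query string - including the theme's versioned .js/.css files. A minimal sketch of that matching logic (a simplification of Google's actual robots.txt matcher; the function name is mine):

```python
import re

def googlebot_rule_matches(rule_path: str, url_path: str) -> bool:
    # Translate a robots.txt path pattern with Googlebot's extensions
    # ('*' matches any run of characters, a trailing '$' anchors the end)
    # into a regex anchored at the start of the URL path.
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The theme's versioned assets carry query strings, so "/*?" blocks them
# (paths below are made up for illustration):
print(googlebot_rule_matches("/*?", "/assets/theme.css?ver=4.9.8"))  # True (blocked)
print(googlebot_rule_matches("/*?", "/assets/theme.css"))            # False (allowed)
```

This is why removing the rule made the resources fetchable again: with the CSS/JS blocked, the mobile-friendliness renderer could not lay the page out correctly.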
Error got removed by request
Hey there, I have a website which has been doing well on Google, but the last 2-3 times I have been getting an error in Google Search Console: "removed by request". Here is a screenshot: http://prntscr.com/iva5y0. My question is, I have not requested that Google remove my homepage URL, so: 1. Why am I getting this warning again and again? 2. Will this harm my site's current traffic and ranking status? 3. What steps do I need to take to stop this warning from appearing again and again? Thanks in advance, and please help out with this query.
Technical SEO | | innovative.rohit0 -
How can I fix this home page crawl error ?
My website shows this crawl error => 612 : Home page banned by error response for robots.txt. I also did not get any page data in my account for this website... I did get keyword rankings and traffic data, I am guessing from the analytics account. url = www.mississaugakids.com Not really sure what to do with this! Any help is greatly appreciated.
Technical SEO | | jlane90 -
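The "612" error above means the crawler could not get a usable response for robots.txt. Google's documented behavior is roughly: a 2xx means parse the rules, a 4xx means there are no restrictions, and a 5xx or network error means treat the whole site as disallowed until the file is reachable again - which is why the home page reads as "banned". A sketch of that decision (simplified; real crawlers also cache and retry):

```python
def robots_fetch_policy(status: int) -> str:
    # Rough sketch of how major crawlers react to the HTTP status
    # returned for /robots.txt.
    if 200 <= status < 300:
        return "parse rules"
    if 400 <= status < 500:
        return "allow all"     # no robots.txt -> no restrictions
    return "disallow all"      # 5xx/error -> site treated as banned

for code in (200, 404, 503):
    print(code, robots_fetch_policy(code))
```

So the first thing to check is what status code www.mississaugakids.com/robots.txt actually returns when fetched directly.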
Importance of correction of technical errors
Hello everyone! I have a question that I know has been asked many times; however, I am looking for ideas for my specific situation. I own a website about commercial steel. My main focus has been getting incoming links from important companies and sites while maintaining a good-quality site. I've been struggling with rankings and Page Authority. I've never paid attention to technical errors such as duplicate content, 4XX errors, and critical warnings such as redirects. I have around 70 errors and around 400 warnings. Someone told me that as long as the website is "user friendly" I shouldn't worry about them. I have scarce resources for my SEO efforts. Which aspect should I put more effort into: link building and quality content, or technical SEO? Is there a recommended balance for better PA, DA, and overall quality? I know it is difficult, but it would be extremely helpful to hear from you! Regards.
Technical SEO | | JesusD0 -
Help! Getting 5XX error
Keep getting a 5XX error, and my site is obviously losing rankings. I asked the host; nobody seems to know what is wrong. The site is www.monteverdetours.com. I know this is probably an obvious problem and easy to fix, but I don't know how to do it! Any comments will be greatly appreciated.
Technical SEO | | Llanero0 -
URL Error "NODE"
Hey guys, so I crawled my site after fixing a few issues, but for some reason I'm getting this strange node error at www.url.com/node/35801, which I haven't seen before. It appears to originate from user-submitted content, and when I go to the page it's a YouTube video with no video playing, just a blank black screen. Has anyone had this issue before? I think it can probably just be taken off the site, but if it's a programming error of some sort, I'd like to know what it is so I can avoid it in the future. Thanks
Technical SEO | | KateGMaker0 -
Google Webmaster Tools: Keywords
Hi SEOmozzers! I'm the Dr./owner/in-house SEO for my eye care practice. The URL is www.ofallonfamilyeyecare.com. Our practice is in O'Fallon, MO. Since I'm an optometrist, my main keywords are "optometrist o'fallon" and "o'fallon optometrist". As I get more familiar with SEO, Google Analytics, and Webmaster Tools, I've looked at the keywords that Google feels best represent my website. About a week ago I noted that Google counted 21 instances of "optometrist" across the 28-30 pages of my website, ranking it #32 among the most common keywords. #1 is "eye" with 506 instances. Even though 21 occurrences seemed low, I went through every page adding "optometrist" a couple of times in the body where it would naturally be appropriate. I also added it to the address shown in the footer of every page. I changed the top navigation option "Meet Dr. Hegyi" to "Our optometrist". I must have added at least 4 occurrences to every page on the site and submitted it for a re-crawl. I even tried to scale back the "eye" occurrences on a few pages. Today I see that Google has re-crawled the site and the keywords have been updated: "optometrist" has DROPPED from #32 to #33. Does anyone have any ideas or suggestions as to why I'm not seeing increased occurrence in Google's eyes? I realize this may not be a big factor in SERPs, but every bit of on-page optimization helps. Or is this too minor an issue to sweat? Thanks!
Technical SEO | | JosephHegyi0 -
Having some weird crawl issues in Google Webmaster Tools
I am seeing a large number of errors in the Not Found section linked to old URLs that haven't been used for 4 years. Some of the URLs being linked to are not even in the structure we used to use for URLs. Nevertheless, Google is saying they are now 404ing, and there are hundreds of them. I know the best way to attack this is to 301 them, but I was wondering why all of these errors would be popping up. I can't find anything in the Google index searching for the link in quotes, and in Webmaster Tools it shows "unavailable" for where these are being linked from. Any help would be awesome!
Technical SEO | | Gordian1
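For bulk 301s like the ones suggested above, it can help to generate the redirect rules from a mapping instead of hand-writing hundreds of lines. A sketch that emits Apache `Redirect 301` directives from a hypothetical old-to-new mapping (the paths are made up for illustration):

```python
# Hypothetical mapping from the retired URL structure to current pages.
redirect_map = {
    "/old-structure/widgets.html": "/products/widgets/",
    "/old-structure/about.htm": "/about/",
}

# One mod_alias directive per retired URL, in a stable order.
htaccess_lines = [
    "Redirect 301 {} {}".format(old, new)
    for old, new in sorted(redirect_map.items())
]
print("\n".join(htaccess_lines))
```

The output can be pasted into an .htaccess file (or translated to your server's equivalent); the mapping itself would come from the list of 404ing URLs exported from Webmaster Tools.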