HTTP Errors in Webmaster Tools
-
We recently added a 301 redirect from our non-www domain to the www version. As a result, we now have tons of HTTP errors (403s to be exact) in Webmaster Tools. They're all from over a month ago, but they still show up. How can we fix this?
-
Hey Sha,
Sorry for the slow reply. Yes, I checked the URLs that were listed, and I didn't actually receive a 403 error. I figured there would be some room for lag time, but these errors are over a month old, so that seemed strange.
-
Hi Kyle,
Have you manually checked the URLs reported in Webmaster Tools to see whether they are actually returning a 403 for users? I have often seen errors that were temporary, or had already been fixed, still being reported in WMT for some time afterwards.
If you have checked and the pages are working fine, then the problem is most likely a lag in WMT's reporting, so you can ignore the errors and wait for them to go away.
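For a quick manual check, something along these lines works (a rough Python sketch using the requests library; the URLs are placeholders for the ones listed in your report). It prints the status code each URL actually returns, both for a normal browser user agent and for a Googlebot-style user agent, since some servers return 403s only to crawlers:

import requests

urls_to_check = [
    "http://www.example.com/page-1",  # placeholders for the URLs reported in WMT
    "http://www.example.com/page-2",
]

user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in urls_to_check:
    for label, ua in user_agents.items():
        # allow_redirects=False shows the first response rather than the final destination
        response = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False)
        print(url, label, response.status_code)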
If you have checked manually and find that there is in fact a 403 problem on the site, then you will need to investigate further. Since you are relating the sudden flood of 403s to the addition of the 301 redirect, the first thing I would do is check that the code for your 301 is correct.
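To illustrate that check (again only a sketch, with example.com standing in for your domain): the non-www host should answer with a single 301 whose Location header points at the www URL, and following the chain should end on a 200 rather than a 403.

import requests

# example.com is a placeholder; substitute the real non-www domain.
first_hop = requests.get("http://example.com/", allow_redirects=False)
print("Status:", first_hop.status_code)                # expect 301, not 302 or 403
print("Location:", first_hop.headers.get("Location"))  # expect the www version of the URL

# Follow the full chain and confirm it ends cleanly on the www URL.
final = requests.get("http://example.com/")
print("Final URL:", final.url, "->", final.status_code)          # expect 200 on the www URL
print("Redirect hops:", [r.status_code for r in final.history])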
Hope that helps,
Sha
Related Questions
-
Site Speed Testing Tools For Production Sites
Hi Guys, are there any free site speed testing tools for sites in production that are password protected? We want to test site speed on top-priority pages before the new site goes live. The site is on Shopify – we tried Google PageSpeed Insights while logged into the production site, but we believe it's just recording the speed of the password page. Cheers.
Intermediate & Advanced SEO | brandonegroup
-
Do non-critical AMP errors prevent you from being featured in the Top Stories carousel?
Consider site A, a news publishing site that has valid AMP pages with non-critical AMP errors (as notified within Search Console). Site A also publishes news articles from site B (its partner site); those articles have AMP pages too, but most of them are not valid and carry critical AMP errors.

For brand terms like Economic Times, Google does show a Top Stories carousel for articles published by Economic Times; however, it doesn't look the same for site A (in spite of it having valid AMP pages). Image link: http://tinypic.com/r/219bh9j/9

Now that there are valid AMP pages from site A and invalid AMP pages from site B on site A, there have been instances where a news article from site A features in the Top Stories carousel on desktop for a certain query but doesn't feature in the mobile SERPs, in spite of the page being a valid AMP page. For example, as shown in the screenshot below, Business Today ranks in the Top Stories carousel for a term like "jio news" on desktop, but on mobile, although the page is a valid AMP page, it doesn't show as an AMP page within the Top Stories carousel. Image link: http://tinypic.com/r/11sc8j6/9

There have been some cases where, although the page is featured in the carousel on desktop, the same article doesn't show up in the mobile Top Stories carousel for the same query. What could be the reason behind this? Also, would it be necessary to fix both critical and non-critical errors on site A (including articles published from site B on site A)?
Intermediate & Advanced SEO | Starcom_Search
-
Robots.txt error
I currently have this in my robots.txt file:

User-agent: *
Disallow: /authenticated/
Disallow: /css/
Disallow: /images/
Disallow: /js/
Disallow: /PayPal/
Disallow: /Reporting/
Disallow: /RegistrationComplete.aspx
WebMatrix 2.0

In Webmaster Tools > Health Check > Blocked URLs, I copy and paste the code above and click Test, and everything looks OK. But when I log out and log back in, I see the code below under Blocked URLs:

User-agent: *
Disallow: /
WebMatrix 2.0

Currently Google doesn't index my domain, and I don't understand why this is happening. Any ideas? Thanks, Seda
Intermediate & Advanced SEO | Rubix
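One way to sanity-check a robots.txt discrepancy like this is to test a few URLs against the file that is actually live on the server. A minimal sketch using Python's built-in robotparser (example.com stands in for the real domain, and the paths are only illustrative):

from urllib.robotparser import RobotFileParser

# example.com is a placeholder; point this at the real domain's robots.txt.
parser = RobotFileParser()
parser.set_url("http://www.example.com/robots.txt")
parser.read()  # fetches and parses the file the server is actually returning

# Test a few representative paths against the live rules.
for path in ["/", "/css/site.css", "/authenticated/login.aspx"]:
    allowed = parser.can_fetch("*", "http://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")

If the file crawlers actually see is "User-agent: * Disallow: /", every path will come back blocked, which would explain the domain not being indexed.
-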
How concerning is a message from Google about an increase in server errors?
In the past few weeks I have started getting messages from Google Webmaster Tools about an increase in server errors. According to our R&D team, these messages come at times when our site has been down, and Google is not an accurate measure of the site's health. 1 - Are they correct, and is there a better tool to be using? 2 - Could the site be harmed by Google occasionally running into this problem, even though it is then fixed within a few hours? Thanks!
Intermediate & Advanced SEO | theLotter
-
Using "Read More" buttons as a tool to cram in Content
Hi Mozzers! Let's say our website is clean, professional, and minimalistic. Can we use a "read more" button that expands the text on the page, increasing the amount of content while (unless clicked) not affecting the appearance? I want to make sure I am not violating Google's Webmaster Guidelines on "hidden text". Thanks!
Intermediate & Advanced SEO | Travis-W
-
Question regarding an error URL when checking in the Open Site Explorer tool
Hello friends, my website's home URL and inner page URLs show an error when checked in the Open Site Explorer tool from SEOmoz. For a website, e.g. www.abc.com, it reports:

"Oh Hey! It looks like that URL redirects to www.abc.com/error.aspx?aspxerrorpath=/default.aspx. Would you like to see data for that URL instead?"

May I know why the URL is showing this result when checking the backlink report in the tool? May I know on what basis the tool is evaluating the website URL? And will this affect the Google SERPs for this website? Thanks.
Intermediate & Advanced SEO | zco_seo
-
Should I report unnatural links via Webmasters?
We have a client who fired their last SEO firm for backlinking; the company has the actual emails and evidence that it found. On July 19, 2012, they received a notice in Webmaster Tools that "unnatural links" had been detected pointing to their site. The notice states that they should request reinclusion, but Matt Cutts is saying something different: https://plus.google.com/u/3/109412257237874861202/posts/gik49G9c5LU

My client wants to ensure that they are NOT impacted, so should they notify Google anyway? The notice reads:

Dear site owner or webmaster of….
We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes. We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results. If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
Intermediate & Advanced SEO | dknewmedia
-
Managing 404 errors
What is the best way to manage 404 errors for pages that are no longer on the server? For example, a client deletes their old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page (http://www.tvsphoto.com/missingurlexample). Is there anything else I can do?
Intermediate & Advanced SEO | SEOProPhoto