Are these Search Console crawl errors a major concern for a new client site?
-
We recently (4/1) went live with a new site for a client of ours. The client's site was originally on Point2 before they switched to a template site with Real Estate Webmasters. Now when I look in Search Console I am seeing the following crawl errors:
- 111 Server Errors (photos)
- 104 Soft 404s (blogs, archives, tags)
- 6,229 Not Found (listings)
I have a few questions. I don't know much about the server errors, so I generally ignore them. My main concerns are the soft 404s and the not-found errors. The soft 404s are mostly tag and blog archive pages, and I'm wondering whether I should leave them alone or 301 each one to /blog.
The not-found errors are all previous listings from the IDX. My assumption is that these will naturally fall away over time, since the new listings have already been indexed. But I wonder what I should be doing here and which of these errors will actually affect me.
When we launched the new site there was a large spike in clicks (a 250% increase), which has since tapered off to an average of ~85 clicks versus ~160 at the time of launch. I'm not sure if the crawl errors have any effect; I'm guessing not much right now.
I'd appreciate your insights, Mozzers!
-
The soft 404s are probably because the archive and/or tag pages being crawled are predominantly empty and look like a 404'd page that is returning a 200. If Google is already indexing the actual articles/blog posts, then you can most likely safely noindex the archive and tag pages. Many of those pages exist for the visitor but wind up creating other problems like duplicate content issues, soft 404s, and so on.
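If you want to sanity-check those pages in bulk rather than one by one, here's a minimal sketch. It assumes you've exported the soft-404 URLs from Search Console into a plain text file, one URL per line; the file name and the 200-word "thin content" threshold are placeholders, so adjust them to your own export and templates.

```python
# Rough soft-404 triage: flag pages that return 200 but have almost no body
# text, and note whether they already carry a meta robots noindex.
import requests
from bs4 import BeautifulSoup

with open("soft_404_urls.txt") as f:  # assumed export: one URL per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    # Count visible words as a crude emptiness check.
    word_count = len(soup.get_text(" ", strip=True).split())
    robots_tag = soup.find("meta", attrs={"name": "robots"})
    noindexed = bool(robots_tag and "noindex" in robots_tag.get("content", "").lower())
    if resp.status_code == 200 and word_count < 200:
        print(f"{url}: {word_count} words, noindex={noindexed} -> likely soft-404 candidate")
```

Pages that show up here with very little content and no noindex are the ones worth noindexing (or beefing up) first.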
For anything that is a legitimate 404 but is still coming up as a soft 404, you should make sure your backend is actually serving the 404 response code, as there may be an issue there. Other legitimate 404s that are serving the proper 404 response (not a soft 404) are fine and can be marked fixed.
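A quick way to verify what the backend is really serving is to request each flagged URL and look at the status code. This is just a sketch; the input file name is an assumption and should contain one URL per line from your crawl-error export.

```python
# Check whether flagged URLs return a hard 404/410 or a 200 pretending to be "not found".
import requests

with open("crawl_error_urls.txt") as f:  # assumed export: one URL per line
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status in (404, 410):
        print(f"OK     {status}  {url}  (real 404/410 - safe to mark as fixed)")
    else:
        print(f"CHECK  {status}  {url}  (may be a soft 404 or a server-side issue)")
```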
For those "Not Found" previous listing, you need to determine what (if anything) should be 301'd to an existing page so as to not lose link equity and then determine what is gone forever to serve a 410 response on (or leave them as 404s and they'll drop off eventually).
-
Hi there.
Check the date when those errors were discovered in Search Console. If they are recent, it means the missing pages are still being crawled and returning 404s, which is not good however you look at it. So, yes, I'd recommend redirecting those to existing pages. Crawl errors usually don't have a direct effect on rankings, but it's always nice to fix them. So, if you are sure that all of those 404s are either "fixed" now or not supposed to exist, spend an hour fixing the top-priority ones, but don't die over it.
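If you'd rather not eyeball the report, a small script can filter a crawl-error CSV export down to recently discovered errors. The file name, the column names ("URL", "Detected"), and the date format here are assumptions; check the headers in your own export before running it.

```python
# Surface only crawl errors detected in the last 30 days from a CSV export.
import csv
from datetime import datetime, timedelta

cutoff = datetime.now() - timedelta(days=30)

with open("crawl_errors.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Adjust the column names and date format to match your export.
        detected = datetime.strptime(row["Detected"], "%Y-%m-%d")
        if detected >= cutoff:
            print(f"{row['Detected']}  {row['URL']}")
```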
Hope this helps.
-
Related Questions
-
Seeing very few pages analysed re: Mobile usability in Google Search Console - why?
Hi Mozzers, under Mobile Usability in Google Search Console, I am seeing very few website pages getting analysed - 10 out of 40 static pages on the website in question. Is this to be expected, or does it indicate an indexing problem on mobile?
Reporting & Analytics | | McTaggart0 -
Time For New Website To Rank?
We've been working on a site for the past couple of months and are going to launch in a couple of weeks. How long does it take for Google to establish a "stabilized" ranking for the site and its various pages? The homepage will have its targeted keywords, and each product page will have its own targeted keywords. We have about 100 total pages on the site. I know rankings may fluctuate initially, but I'm trying to get an idea of when rankings stabilize before we start the ongoing SEO work (earning backlinks, changing on-page optimization, etc.) so we can track changes in rankings over time.
Reporting & Analytics | | vikasnwu0 -
SEO dealing with a CDN on a site.
This one is stumping me and I need some help. I have a client whose site is www.site.com, and we have set up a CDN for them through Max CDN at cdn.site.com, which is basically a CNAME to the www.site.com site. The images in GWT for www.site.com are de-indexing rapidly, and the images on cdn.site.com are not indexing. In the Max CDN account I have the images from cdn.site.com sending a canonical header pointing to www.site.com, but that does not seem to help; they are all still de-indexing.
Reporting & Analytics | | LesleyPaone0 -
Any harm and why the differences - multiple versions of same site in WMT
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au
As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says: "If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."
The above quote suggests that there is no harm in having several versions of a site set up in WMT. However, the article then goes on to say: "Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."
This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only the desired versions (https://www.ourdomain.com plus the .co.nz, .co.uk, and .com.au equivalents). However, even if Google does crawl any URLs on the non-https versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway, so shouldn't that mean Google effectively cannot crawl any non-https versions (if it tries, they redirect)?
If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT, but the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, while the https versions have no data under the Index Status section of WMT, showing this message instead: "Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/." This is a problem because it means we can't delete these profiles from our WMT account. Any thoughts on the above would be welcome.
As an aside, it seems like WMT is picking up on the 301 redirects from all ourdomain.com and www.ourdomain.com domains, at least with links - no ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google sees all links pointing to those domains as 301ing to https://www.ourdomain.com. That is good, but it also means we now can't delete https://www.ourdomain.com either, so we are stuck with 12 profiles in WMT... what a pain. Thanks for taking the time to read the above, quite complicated, sorry! Would love any thoughts.
-
403 errors - how to fix them?
http://muslim-academy.com/ got 36 403 errors. I Googled around and also looked into SEOmoz, but found nothing relevant. The site runs the latest version of WordPress and is hosted on GoDaddy. I recently added these URLs to the robots.txt file and they were removed, but because of an issue with the robots.txt file I had to revert it and leave it blank. Can you suggest a permanent fix?
Reporting & Analytics | | csfarnsworth0 -
Why would a website rank lower than a weaker site?
Hi, today I noticed that my website is ranking one place lower than a competitor in Google UK, despite my site having stronger domain authority and page authority. Is there a plausible reason for this? I'm slightly confused. Thanks,
Reporting & Analytics | | Benjamin3790 -
500 errors and impact on Google rankings
Since the launch of our newly designed website about 6 months ago, we have been experiencing a high number of 500 server errors (>2000). Attempts to resolve these errors have been unsuccessful to date. We have just started to notice a consistent and sustained drop in rankings despite our efforts to correct the problem. Two questions: can very high levels of 500 errors adversely affect our Google rankings? And if so, what type of specialist (what are they called) has the expertise to investigate and fix this issue? I should also mention that the sitemap goes down on a regular basis, which some have said is due to the size of the site (>500 pages). I don't know if that's part of the same problem. Thanks.
Reporting & Analytics | | ahw0 -
Can you get local search numbers/traffic out of Google Analytics?
With Google's new local search I am more curious about market penetration on keywords that are now localized to my different US cities. I understand that you can separate out Google traffic based on regional Google domains, but I am curious whether there is an effective way to separate out searches and keywords based on my local US metros. If Google cannot do this, any recommendations on products that can? Thanks.
Reporting & Analytics | | Thos0031