Are these Search Console crawl errors a major concern to new client site?
-
We recently (4/1) went live with a new site for a client of ours. The client site was originally on Point2 before they made the switch to a template site with Real Estate Webmasters. Now, when I look at Search Console, I'm seeing the following crawl errors:
- 111 Server Errors (photos)
- 104 Soft 404s (blogs, archives, tags)
- 6,229 Not Found (listings)
I have a few questions. I don't know much about the server errors, so I've generally ignored them. My main concerns are the soft 404s and the not-found errors. The soft 404s are mostly tags and blog archives, and I'm wondering whether I should leave them alone or 301 each one to /blog.
For the not-found errors, these are all the previous listings from the IDX. My assumption is that they will naturally fall away after some time, as the new ones have already been indexed. But I wonder what I should be doing here and which of these will actually affect me.
When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at the time of launch. I'm not sure whether the crawl errors have any effect; I'm guessing not much right now.
I'd appreciate your insights Mozzers!
-
The soft 404s are probably occurring because the archive and/or tag pages being crawled are predominantly empty, so they look like 404 pages that are returning a 200. If Google is already indexing the actual articles/blog posts, then you can most likely safely noindex the archive and tag pages. Many of those pages exist for the visitor but wind up creating other problems, like duplicate content, soft 404s, and so on.
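If the platform allows template edits, noindexing those pages is typically a single meta robots tag in the tag/archive templates (a generic example; the exact template hook depends on the CMS):

```html
<!-- In the <head> of tag and archive templates: keep the pages for
     visitors, but tell crawlers not to index them. "follow" still
     lets crawlers follow the links on the page. -->
<meta name="robots" content="noindex, follow">
```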
For anything that is a legitimate 404 but is still showing up as a soft 404, make sure your backend is actually serving the 404 response code, as there may be an issue there. Other legitimate 404s that serve the proper 404 response (not a soft 404) are fine and can be marked as fixed.
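A quick way to triage what Search Console is flagging is to check what each URL actually returns and how thin the page is. Here's a minimal sketch of that triage logic (the function name and the 50-word threshold are illustrative, not official Google values):

```python
def classify_crawl_error(status_code: int, word_count: int) -> str:
    """Rough triage for a URL flagged in Search Console.

    status_code: the HTTP status the URL actually returns
    word_count: approximate visible word count of the response body
    """
    if status_code == 200 and word_count < 50:
        # Returns 200 but the page is nearly empty: classic soft-404 signature.
        return "soft 404 suspect: serve a real 404/410 or add content"
    if status_code == 200:
        return "fine: mark the Search Console error as fixed"
    if status_code in (404, 410):
        return "proper not-found response: safe to mark fixed"
    if 500 <= status_code < 600:
        return "server error: investigate hosting/backend"
    return "unexpected status: check manually"
```

Run something like this against a sample of the flagged URLs and you'll quickly see whether the backend is misreporting statuses.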
For those "Not Found" previous listings, you need to determine what (if anything) should be 301'd to an existing page so you don't lose link equity, then determine what is gone forever and serve a 410 response for it (or leave those as 404s and they'll drop off eventually).
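That decision boils down to a simple lookup: redirect if there is a mapped equivalent, otherwise signal "gone". A hypothetical sketch (the function, paths, and maps are made up for illustration):

```python
def listing_response(old_path, live_listings, redirect_map):
    """Decide how to respond to a request for an old IDX listing URL.

    Returns (status_code, target_path_or_None).
    """
    if old_path in redirect_map:
        # A comparable live page exists: 301 preserves link equity.
        return (301, redirect_map[old_path])
    if old_path in live_listings:
        # Still live; nothing to do.
        return (200, old_path)
    # Gone for good: 410 tells Google the removal is intentional.
    return (410, None)
```

In practice this logic would live in the site's routing layer or be exported as a bulk redirect file, but the 301-vs-410 decision is the same either way.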
-
Hi there.
Check the date when those errors were discovered in Search Console. If they are recent, it means the missing pages are indeed being crawled and returning 404, which is not good however you look at it. So yes, I'd recommend redirecting those to existing pages. Crawl errors usually don't have a direct effect on rankings, but it's always nice to fix them. So, if you're sure that all of those 404s are either fixed now or not supposed to be there, spend an hour fixing the top-priority ones; don't die over it.
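For the tag/archive-to-/blog idea specifically, pattern-based 301s can cover whole groups of URLs at once. On an Apache host that would be a couple of mod_alias rules in .htaccess (the paths here are hypothetical; substitute the site's actual tag/archive URL structure):

```apacheconf
# Hypothetical patterns -- adjust to the site's real URL structure.
# Any tag or archive URL gets a single 301 to the blog index.
RedirectMatch 301 ^/blog/tag/.* /blog/
RedirectMatch 301 ^/blog/archives?/.* /blog/
```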
Hope this helps.