Why the sudden increase in soft 404s?
-
I haven't made any changes to my site, but in the span of a week Webmaster Tools has started showing 30-40 soft 404s. This only began in the last two weeks. When I click through to the pages they load fine, and even Fetch and Render works fine on them.
-
They have stopped, with no changes to the site. I have no idea why. Thank you for the offer though.
-
Hi EcommerceSite!
Would love to help you figure this out - please PM the URL. Thanks!
-
It seems to have stopped with no changes to the site. I have no idea why.
-
Hi EcommerceSite,
I can't see an answer to your question so far. Are you still having issues?
If you want to send me a PM with your URL, I'll have a look at this for you.
Tom
-
I can send it in a message.
-
Hi EcommerceSite!
It really sounds like folks will need to check out your site, or at least have a lot more information, in order to give much more advice. Is that something you can share?
-
Loading times have stayed really stable.
There are no 404 errors in either tool.
Using the Fetch as Googlebot tool, the pages all work fine.
It doesn't make any sense.
-
Hi,
This can occur if Google's crawlers are for some reason unable to reach some of your pages. It could be related to network issues or temporary server problems. A few things to look at:
- Do you see any increased loading times for your pages?
- When looking at the page with Firebug or Chrome's inspector tools, do you see any 404 errors returned?
- What result do you get when using the Fetch as Googlebot tool?
Hope this helps.
Best regards,
Anders
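A rough way to run this checklist in bulk is a sketch like the one below (standard library only; the URL list, the Googlebot-style User-Agent, and the 3-second "slow" threshold are placeholders, not values from this thread). It fetches each page the way a crawler would, times the response, and flags anything that isn't a fast 200:

```python
import time
import urllib.error
import urllib.request

# Hypothetical Googlebot-style User-Agent string for testing purposes.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def flag(status, seconds, slow_threshold=3.0):
    """Label a fetch result against the checklist above."""
    if status != 200:
        return "ERROR %s" % status
    if seconds > slow_threshold:
        return "SLOW"
    return "OK"

def check(url, timeout=10):
    """Fetch url with a Googlebot-style User-Agent; return (status, seconds)."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    start = time.time()
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            status = resp.getcode()
    except urllib.error.HTTPError as err:
        status = err.code
    except urllib.error.URLError:
        status = 0  # connection failed entirely
    return status, time.time() - start

# Usage (hits the network; replace with the URLs flagged in Webmaster Tools):
#   for url in ["https://example.com/"]:
#       status, seconds = check(url)
#       print(url, flag(status, seconds), "%.2fs" % seconds)
```

Intermittent server or network trouble would show up here as occasional errors or slow responses even when the pages look fine in a browser.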
-
It'd be great if you can share what your site is so people can check it out and see if they can figure out what's going on. Thanks!
Related Questions
-
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-SSL (http), the server automatically redirects to the SSL (https) URL using a good old fashioned 301. This is great except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean.
Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200
Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404
Google is correctly re-indexing all the "good" pages and just displaying search results going directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS, so we can't detect a 404 and serve it up first. The CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?
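One way to audit which bucket a given URL falls into before filing removal requests is to walk its redirect chain hop by hop. A minimal sketch (standard library only; the example URLs come from the question above, everything else is an assumption):

```python
import urllib.error
import urllib.parse
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Make urlopen raise HTTPError on 3xx instead of silently following it."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

OPENER = urllib.request.build_opener(NoRedirect)

def status_chain(url, max_hops=5):
    """Follow redirects manually, recording each hop's status code."""
    chain = []
    for _ in range(max_hops):
        try:
            with OPENER.open(url, timeout=10) as resp:
                chain.append(resp.getcode())
                return chain
        except urllib.error.HTTPError as err:
            chain.append(err.code)
            location = err.headers.get("Location")
            if err.code in (301, 302) and location:
                url = urllib.parse.urljoin(url, location)
                continue
            return chain
    return chain

def classify(chain):
    """Label a status chain per the two cases in the question."""
    if len(chain) >= 2 and chain[0] == 301 and chain[-1] == 200:
        return "Case 1: good page"
    if len(chain) >= 2 and chain[0] == 301 and chain[-1] == 404:
        return "Case 2: gone page"
    return "other"

# Usage (hits the network):
#   print(classify(status_chain("http://domain.com/goodpage")))
```

Running the Case 2 URLs through this confirms whether Google is actually reaching the 404 at the end of the chain or getting stuck somewhere earlier.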
Intermediate & Advanced SEO | boxclever
-
Canonical Tags increased after putting the appropriate tag?
Hey, I noticed that the number of duplicate title tags in Google Search Console increased from 14k to 30k. These duplicate title tags derived from having incorrect canonical tags. For instance:
http://www.site.com/product-name/product-code/?d=Mens
http://www.site.com/product-name/product-code/?d=Womens
These two are the same exact page with two parameters (they are not unisex, by the way). Anyway, when I viewed the page source, the canonical tag included the parameter. So whether it was
http://www.site.com/product-name/product-code/
http://www.site.com/product-name/product-code/?d=Mens
http://www.site.com/product-name/product-code/?d=Womens
the canonical tag had the "?d=Womens". I figured that wasn't best practice, so I removed the parameter from the canonical tag; now the canonical tag is http://www.site.com/product-name/product-code/ for that specific page with the parameter (if that makes sense). My question is: why did my number of errors double after what I thought fixed the problem?
Intermediate & Advanced SEO | ggpaul562
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub URLs from pagination are showing as 404s. Should we 410 these sub URLs?
Hi everyone! We recently 410'ed some URLs to decrease the number of URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are now shown as 404s in Google. These sub-URLs were canonicalized to the main URLs and not included in our sitemap. For example, we assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
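It may help to see what each sub-URL actually returns before deciding. A minimal sketch (standard library only; the /pageN pattern follows the example.com/url/page1 example above and is otherwise an assumption about the site's URL scheme):

```python
import urllib.error
import urllib.request

def sub_urls(base, pages):
    """Expand a removed base URL into its hypothetical paginated sub-URLs."""
    base = base.rstrip("/")
    return [base] + ["%s/page%d" % (base, n) for n in range(1, pages + 1)]

def status_of(url, timeout=10):
    """Return the HTTP status the server currently sends for url."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        return err.code  # 404 vs 410 is the distinction that matters here

# Usage (hits the network):
#   for url in sub_urls("http://example.com/url/", pages=3):
#       print(url, status_of(url))
```

For what it's worth, Google has said it treats a 410 much like a 404, just dropping the URL slightly faster, so chasing down every dynamically generated sub-URL may not be worth the effort.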
Intermediate & Advanced SEO | jeffchen
-
Hacked Wordpress Site! So many 404s
So I had a site that I worked on get hacked. We eliminated the URLs, found the vulnerability (Bluehost!) and rolled back the site. BUT they got into Google Search Console and indexed a LOT of pages. These pages are now 404 errors, and I asked the robots.txt file to make them noindex. The problem is that Google is placing a "this site may be hacked" warning on the search listing. I asked Google to reevaluate it and it was approved, but there are still 80,000 404 errors being shown and it still believes the uploaded files that we deleted should be there. Doing a site: search STILL shows the infected pages, and it has been a month. Any insight would definitely be helpful. Thanks!
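One caveat worth checking in a situation like this: robots.txt cannot noindex a page by itself, and Google can only see (and eventually drop) the 404s if robots.txt does not block crawling of those URLs. A quick way to test whether cleaned-up URLs are still blocked, using only the standard library (the /hacked/ path is hypothetical):

```python
import urllib.robotparser

def blocked(robots_txt, path, agent="Googlebot"):
    """Return True if robots_txt forbids `agent` from crawling `path`."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, path)

# A Disallow rule left over from the cleanup would hide the 404s from
# Googlebot, so the hacked URLs could linger in the index.
rules = "User-agent: *\nDisallow: /hacked/\n"
print(blocked(rules, "/hacked/spam-page"))  # True: Googlebot never sees the 404
print(blocked(rules, "/clean-page"))        # False
```

If the deleted URLs turn out to be blocked, letting Googlebot crawl them and see the 404s is usually what gets them dropped.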
Intermediate & Advanced SEO | mattdinbrooklyn
-
2.3 million 404s in GWT - learn to live with 'em?
So I'm working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I've ever worked on; heck, every other site I've ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since grown progressively higher; now we have 2.3 million 404s in GWT. Based on what I've been able to determine, links on this site relative to the data feed are broken for one of two reasons: the page just doesn't exist anymore (i.e. it wasn't found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just under a different link). With other sites I've worked on, 404s aren't that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn't an option due to the potential bloat in the htaccess file. Based on what I've read here and here, 404s in and of themselves don't really hurt the site's indexation or ranking. And the more I consider it, the really big sites (the Amazons and eBays of the world) have to contend with broken links all the time due to product pages coming and going.
Bottom line: if we really want to refresh the data on the site on a regular basis, and I believe that is priority one if we want the bot to come back more frequently, we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:
- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well; hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on a more or less ongoing basis.
- Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
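For the watch-the-trend step, diffing two snapshots of the reported 404s is a one-function job. A minimal sketch (assumes plain text exports with one URL per line; the file names are hypothetical, and GWT's actual export format may differ):

```python
def diff_errors(old_urls, new_urls):
    """Compare two 404-URL snapshots; return (newly_broken, resolved)."""
    old, new = set(old_urls), set(new_urls)
    return sorted(new - old), sorted(old - new)

# Usage (file names hypothetical):
#   with open("404s_january.txt") as f_old, open("404s_february.txt") as f_new:
#       broken, resolved = diff_errors(f_old.read().split(), f_new.read().split())
#       print("%d new 404s, %d resolved" % (len(broken), len(resolved)))
```

Run after each data refresh, this shows whether the overall count is merely churning (old 404s resolving as new ones appear) or genuinely growing.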
Intermediate & Advanced SEO | ufmedia
-
Wondering why PR hasn't increased?
Hi there, I've been working on a website for about six months now and the PageRank still remains at 0. Fresh content has been created across the majority of the site, a blog implemented, titles and metas, schema.org markup, we've built some good links, etc. There are a lot of 404 errors, but a lot of this is to do with stocking issues: products being sold or taken down and new products being put up. Do you think this is the major reason the PageRank is not moving? 404s are a regular occurrence on a lot of e-commerce sites. Also, the server went offline on two occasions (obviously Google frowns upon this), but in general the server is grand. Also, when we started working on the website it wasn't in the best of shape: DA 11; now it's DA 17. I know that's still not great, but it's moving in the right direction. Just wondering your thoughts on the PR?
Intermediate & Advanced SEO | niamhomahony
-
How to find 20 hidden 404s
Hello, We have like twenty 404s left to find. How do you find these when:
1. They don't show up in Google Webmaster Tools.
2. They don't have any other internal or external pages linking to them.
3. They don't show up in site:domain.com (we have 9000 pages and only 600 show up; I fixed those out of the 600).
4. They are probably causing high bounce rates.
5. They're not in the sitemap.
Thanks!
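When 404s don't surface in GWT, inbound links, site: queries, or the sitemap, the server's access log is usually the one place they all still show up, since any visitor or bot hitting them leaves a 404 entry. A minimal sketch (assumes an Apache/nginx combined log format; the log path is hypothetical):

```python
import re

# Matches the `"GET /some/path HTTP/1.1" 404` part of a combined-format line.
LOG_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def four_oh_fours(log_lines):
    """Count requests per path that returned a 404 status."""
    hits = {}
    for line in log_lines:
        match = LOG_RE.search(line)
        if match and match.group("status") == "404":
            path = match.group("path")
            hits[path] = hits.get(path, 0) + 1
    return hits

# Usage (log path hypothetical):
#   with open("/var/log/apache2/access.log") as log:
#       for path, count in sorted(four_oh_fours(log).items()):
#           print(count, path)
```

Sorting by hit count also addresses point 4 above: the 404s requested most often are the ones most likely driving bounces.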
Intermediate & Advanced SEO | BobGW
-
Why the sudden link drop?
At the end of November I am showing that our total links were 118k. Current links are 22k. We changed sites in early November, so that was about three weeks before. What would cause a drop of about 100k links? Or where should I start investigating?
Intermediate & Advanced SEO | EcommerceSite