Why the sudden increase in soft 404s?
-
I haven't made any changes to my site, but Webmaster Tools is now showing 30-40 soft 404s per week. This just started happening in the last two weeks. When I click through to the pages they are fine, and even Fetch and Render works fine on them.
-
They have stopped, with no changes to the site. I have no idea why. Thank you for the offer though.
-
Hi EcommerceSite!
Would love to help you figure this out - please PM the URL. Thanks!
-
It seems to have stopped with no changes to the site. I have no idea why.
-
Hi EcommerceSite,
I can't see an answer to your question so far - are you still having issues?
If you want to send me a pm with your url then I'll have a look at this for you.
Tom
-
I can send it in a message.
-
Hi EcommerceSite!
It really sounds like folks will need to check out your site, or at least have a lot more information, in order to give much more advice. Is that something you can share?
-
Loading times have stayed really stable.
There are no 404 errors in either tool.
Using the fetch as Googlebot tool the pages all work fine.
It doesn't make any sense.
-
Hi,
This can occur if Google's crawlers are, for some reason, unable to reach some of your pages. It could be related to network issues or temporary server issues. A few things to look at:
- Do you see any increased loading times for your pages?
- When looking at the page with Firebug or Chrome's inspector tools, do you see any 404 errors returned?
- What result do you get when using the Fetch as Googlebot tool?
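As a rough sketch, the crawler-reachability check above can also be reproduced outside of Webmaster Tools by fetching a page with a Googlebot-style user agent and looking at the status code the server returns (the URL below is a placeholder, not a real page):

```python
import urllib.request

# Googlebot's documented user-agent string; any URL used with these
# helpers is a placeholder you would replace with your own page.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_googlebot_request(url):
    """Build a request that mimics Googlebot, so you can compare the
    response your server gives crawlers with what a browser receives."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

def fetch_status(url):
    """Return the HTTP status code the server sends for the request.
    A soft 404 typically shows up as a 200 on a page with thin or
    'not found'-style content."""
    with urllib.request.urlopen(build_googlebot_request(url)) as resp:
        return resp.status
```

If this returns 200 for a page that Search Console flags as a soft 404, the server side is likely fine and the question is how Google classifies the content; a 5xx, timeout, or connection error points at the network or server issues mentioned above.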
Hope this helps
Best regards,
Anders
-
It'd be great if you can share what your site is so people can check it out and see if they can figure out what's going on. Thanks!
Related Questions
-
How do I increase my website DA and PA?
I want to launch a new site like www.plaza.ir. How can I improve its DA and PA quickly?
Intermediate & Advanced SEO | arpaymantul
-
How to increase Page authority of old blog posts
Hi, how can I increase the page authority of old blog posts? There are many posts that are ranking well (lower on page 1, or on page 2), but I want them to rank higher by making the posts more usable - better UI, design, content relaunches, etc. All of these would inherently improve page authority eventually. What are some concrete steps I can take to improve the page authority of blog pages?
Intermediate & Advanced SEO | pks333
-
Increase in duplicate page titles due to canonical tag issue
We implemented canonical tags (months back) on product pages to avoid duplicate content issues, but Google still picks up the URL variations and the duplicate page title errors in Search Console keep increasing.
Original URL: www.example.com/first-product-name-123456 Canonical tag:
Variation 1: www.example.com/first-product--name-123456 Canonical tag:
Variation 2: www.example.com/first-product-name-sync-123456 Canonical tag:
Kindly advise the right solution to fix the issue.
Intermediate & Advanced SEO | SDdigital
-
Hacked Wordpress Site! So many 404s
So I had a site that I worked on get hacked. We eliminated the URLs, found the vulnerability (Bluehost!) and rolled back the site. BUT the hackers got pages into Google's index - a LOT of pages. These pages are now 404 errors, and I used the robots.txt file to try to keep them out of the index. The problem is that Google is placing a "this site may be hacked" warning on the search listing. I asked Google to reevaluate it and it was approved, but there are still 80,000 404 errors being shown, and Google still believes the uploaded files that we deleted should be there. Doing a site: search STILL shows the infected pages, and it has been a month. Any insight would definitely be helpful. Thanks!
Intermediate & Advanced SEO | mattdinbrooklyn
-
Should you increase the caching levels in Cloudfare to speed up the load times?
Caching Level: determine how much of your website's static content you want CloudFlare to cache. Increased caching can speed up page load time. The current setting is "Ignore the query string of static content". Site: http://www.southernwhitewater.com
Intermediate & Advanced SEO | VelocityWebsites
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (/* this city page should be removed */) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have very little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
-
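A minimal sketch of the 301 alternative asked about above, assuming a hypothetical mapping from removed city-page URLs to their state-level parents (all paths and names are illustrative, not from the original site):

```python
# Hypothetical mapping from removed city-level paths to state-level pages.
CITY_TO_STATE = {
    "/texas/austin": "/texas",
    "/california/fresno": "/california",
}

def redirect_for(path):
    """Return (status, location) for a removed city page, or None if the
    path should be served normally. A single 301 hands link signals to the
    state page; a 404 status combined with a Location header sends two
    contradictory responses at once."""
    state_page = CITY_TO_STATE.get(path)
    if state_page is None:
        return None
    return (301, state_page)
```

The same lookup-and-301 logic can be expressed in the PHP handler from the question, or as rewrite rules in the server config.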
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. With the site transition, we broke some of the old URLs, but it didn't seem to be too much of a concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data and used canonical tags as far as I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with the updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 404s per day (making it impossible for me to keep up). The sitemap.xml has the new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of 404s (in WMT) don't exist any longer - they all come from the old site. I attached a screenshot (wmt-massive-404s.png) showing the drastic increase in 404s. What could possibly cause this problem?
Intermediate & Advanced SEO | sonetseo
-
Can the template increase the loading time of the site?
Hi, my site was built with WordPress. Very recently I had it redesigned. The problem is that now it takes a long time to load. I have spoken with a web designer who checked my site and said that after it was rebuilt, the template that was created included a lot of hard coding. Can this be the reason why my site now takes a long time to load? Thank you for your help. Sal
P.S.: FYI, the site only has a few plug-ins and the server is a good one.
Intermediate & Advanced SEO | salvyy