Google Search Console: 404 and soft 404 errors without any backlinks. Redirect needed?
-
Hi Moz community,
We can see 404 and soft 404 errors in Google Webmaster Tools (Search Console). These are usually non-existent pages that Google has found somewhere on the internet. I can see that some of the reported URLs don't have any backlinks (checked with the Ahrefs tool). Do we need to redirect each and every URL reported here, or should we ignore them or mark them as fixed?
Thanks
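For context, the difference between the two error types can be sketched with a small, self-contained helper (the function name and error phrases are hypothetical, for illustration only): a hard 404 returns HTTP status 404, while a soft 404 returns status 200 but serves error-page content, which Google flags because the status code contradicts the content.

```python
def classify_response(status_code, body):
    """Classify a fetched page as 'ok', 'hard 404', or 'soft 404'.

    A hard 404 returns HTTP status 404. A soft 404 returns 200
    while serving error-page content; Google reports it because
    the status code says "this page exists" but the content says
    it doesn't.
    """
    # Phrases that suggest an error page; hypothetical examples.
    error_phrases = ("page not found", "404", "does not exist")
    if status_code == 404:
        return "hard 404"
    if status_code == 200 and any(p in body.lower() for p in error_phrases):
        return "soft 404"
    return "ok"
```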
-
Hey vtmoz, I'd recommend you resolve the issue *at the source*, meaning fix the broken links where they originate (on the pages that link to the missing URLs). You could 301 these 404s to another location on your site, but then you'd just have a bunch of expired links that 301. If these 404 URLs don't have backlinks, as you say, then there's no page authority to preserve.
You can find the pages linking to these URLs in GSC (Webmaster Tools). Let me know if you have any questions.
-
Hi vtmoz,
I recommend redirecting all of those 404 errors. In my opinion, a clean site without errors performs better on Google.
If there are too many to handle manually, you can install a plugin like:
https://es.wordpress.org/plugins/all-404-redirect-to-homepage/
https://es.wordpress.org/plugins/404-redirection/
https://es.wordpress.org/plugins/redirect-to-404/
Hope that helps.
Greetings!
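If the site is on an Apache server rather than WordPress, the same redirects can be done directly in an .htaccess file. A minimal sketch, assuming Apache with mod_alias and mod_rewrite enabled; the paths are hypothetical examples, not URLs from the question:

```apache
# 301-redirect a single removed page to its closest replacement.
Redirect 301 /old-page/ /new-page/

# Or redirect a whole removed section with mod_rewrite.
RewriteEngine On
RewriteRule ^discontinued-products/(.*)$ /products/ [R=301,L]
```

Per the first answer, this is only worth doing when the 404 URL has backlinks or traffic; URLs with neither can simply be left to return 404.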
Related Questions
-
Need only tens of pages indexed out of hundreds: Is robots.txt okay for Google to proceed with?
Hi all, We have 2 subdomains with hundreds of pages, of which only 50 important pages need to get indexed. Unfortunately, the CMS of these subdomains is very old and doesn't support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google suggests relying on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
Algorithm Updates | vtmoz
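The approach described in that question can be sketched in a robots.txt fragment. This is a hedged illustration with hypothetical paths: Google applies the most specific (longest) matching rule, so `Allow` entries can carve exceptions out of a blanket `Disallow`. Note the caveat the question itself raises: robots.txt blocks crawling, not indexing, so a blocked URL can still appear in the index (without content) if other sites link to it.

```text
# robots.txt for the subdomain (paths are hypothetical)
User-agent: *
# Block everything by default...
Disallow: /
# ...then allow the specific pages that should be crawled.
# Google honors the most specific matching rule.
Allow: /products/important-page-1
Allow: /products/important-page-2
```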
Big hit taken on Google Search in Jan - Any Ideas?
Hello, I manage a news site that gets new items posted daily. We had a pretty even keel with Google search rankings for some time, but on 9th Jan we took a massive drop and have not recovered, except for one big spike on 29th January. The only thing we had done differently was post less for about a week over Christmas while people were on holiday, but if that was the reason, posting has been back to normal since 6th Jan and nothing has recovered. The site is wjlondon.com - any ideas greatly appreciated. Thank you
Algorithm Updates | luwhosjack
Google Sitelinks question
Are Google sitelinks only ever shown for the top-ranking website? Or is it possible, for certain queries, for the site in position #2 or #3 to have sitelinks while the #1 position doesn't? If there are any guides, tips, or write-ups regarding sitelinks and their behavior and optimization, please share! Thanks.
Algorithm Updates | | IrvCo_Interactive0 -
Fetch as Google in GWT - Functionality
Hi, Some of the HTML Improvements notices in GWT report duplicate meta descriptions or titles for pages that have since been 301 redirected or had a canonical tag added. So my idea is to force Google to re-read them using "Fetch as Google", hoping it will now see the 301 redirect or the fix we have implemented. Does this work? How long does it take? Lastly, should I just click "Fetch as Google", or should I also click the "Submit to index" button? Thanks!
Algorithm Updates | bjs2010
Changes in Google "Site:" Search Algorithm Over Time?
I was wondering if anyone has noticed changes in how Google returns 'site:' searches over the past few years or months. I remember being able to do a search such as "site:example.com" and Google would return a list of pages ordered roughly by PageRank (from link building, etc.), with parent category pages higher up on the first page where relevant, as they tend to have higher PR naturally anyway. These days I can hardly find quality target pages with higher PageRank on the first page of Google's site: search results. Is this just me, or has Google perhaps purposely scrambled the SERPs for site: searches so as not to give away its ranking secrets?
Algorithm Updates | OrionGroup
Sitewide links - nofollow or delete?
Hi, I've just come across a website with 3 sitewide links. I was thinking of adding nofollow, but then wondered: is that the best option? From an SEO perspective, is it better to actually delete those links? Thanks in advance for your feedback.
Algorithm Updates | McTaggart
Flash vs HTML links in SEO
For a small Flash slideshow that has text and links on various slides, is that text and those links as easily indexable (or indexable at all) compared to static HTML text on a webpage?
Algorithm Updates | heritageseo