404s clinging on in Search Console
-
What is a reasonable length of time to expect 404s to be resolved in Search Console? A mass of 404s built up from directory changes and filtering URLs. These have all been fixed, but of course some slipped the net. How long is it reasonable to expect the old 404s, which don't have any links pointing at them, to drop out of Search Console? New 404s are still being reported over 4 months later, and 'First detected' always shows a date later than the date the 404 was fixed.
Is this normal? I've never seen 404s be this resilient and refuse to clean up. We manually fix these 404s and, like popcorn, more turn up.
Just to add: the bulk of the 404s came into existence around a year ago and were left unaddressed for around 8 months.
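One way to separate what Search Console is still reporting from what the server currently serves is a quick status-code audit of the old URL list. A minimal Python sketch (not part of the original thread; the URLs are hypothetical, and the real input would be the URL export from Search Console):

```python
from urllib import request, error

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status for url. urlopen follows redirects,
    so a URL fixed with a 301 to a live page will come back as 200."""
    req = request.Request(url, method="HEAD")
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

def classify(status_by_url: dict) -> dict:
    """Bucket URLs by whether they now resolve, are still gone, or error."""
    buckets = {"resolved": [], "gone": [], "other": []}
    for url, status in status_by_url.items():
        if 200 <= status < 400:
            buckets["resolved"].append(url)
        elif status in (404, 410):
            buckets["gone"].append(url)
        else:
            buckets["other"].append(url)
    return buckets
```

Anything in the "gone" bucket genuinely still 404s and needs a fix or redirect; anything in "resolved" is fixed on the server side and is just waiting for Google to recrawl and drop it, which commonly takes weeks to months.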
-
Hey Michael,
I've found some other threads about the same issue. You can check them; maybe they could help:
https://moz.com/community/q/how-long-will-404-errors-show-in-webmaster-tools-search-console
https://productforums.google.com/forum/#!topic/webmasters/hupstYVfBzo

Best, Martin
Related Questions
-
Google Search Operators Acting Strange
Hi Mozers, I'm using search operators to count how many pages have been indexed for each section of the site. I was able to download the first 1000 pages from Google Search Console, but there are more than 1000 pages indexed, so I'm using operators for a count (even if I can't get the complete list of indexed URLs). [Although, if there is a better way, PLEASE let me know!] Anyway, in terms of search operators: from my understanding, the more general the URL, the more results should come up. However, when I put in the domain site:www.XXX it gives me FEWER results than when I put in site:www.XXX/. When I add the trailing slash to the end of the domain, it gives me MORE results. And when I put in site:www.AAA/BBB/CC it gives me MORE results than when I put in site:www.AAA/BBB. What's with this? Yael
Intermediate & Advanced SEO | yaelslater1
-
To index or de-index internal search results pages?
Hi there. My client uses a CMS/E-Commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO Friendly" feature in the sense that it creates hundreds of new indexed pages to send to search engines that reflect various terminology used by existing visitors of the site. In many cases, these pages have proven to outperform our optimized static pages, but there are multiple issues with them:

1. The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
2. The query typed in by the site visitor always becomes part of the Title tag / Meta description on Google. If the customer's internal search query contains any less than ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots and lots of ugly terminology floating around on Google that we can't affect.

I am scared to do a blanket de-indexation of all /search/ results pages because we would lose the majority of our rankings and traffic in the short term, while trying to improve the ranks of our optimized static pages. The ideal is to really move up our static pages in Google's index, and when their performance is strong enough, to de-index all of the internal search results pages - but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
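The staged de-indexation described here (keep the winning search pages indexable until the static pages catch up, then retire the rest) can be expressed as a small per-URL rule. A hedged Python sketch, assuming the CMS serves results under a /search/ path as the post mentions, and assuming a hand-maintained allowlist of terms whose pages should stay indexed for now (both are assumptions, not details from the CMS):

```python
from urllib.parse import urlparse

# Assumption: internal search results live under this path prefix.
INTERNAL_SEARCH_PREFIX = "/search/"

def robots_meta(url: str, keyword_allowlist: set) -> str:
    """Return the robots meta content a page at `url` should serve.

    Non-search pages always stay indexable. Search-result pages stay
    indexable only while their term is allowlisted, so de-indexation
    can be rolled out gradually instead of as one blanket change.
    "noindex,follow" drops the page from the index while still letting
    crawlers follow its links."""
    path = urlparse(url).path
    if not path.startswith(INTERNAL_SEARCH_PREFIX):
        return "index,follow"
    term = path[len(INTERNAL_SEARCH_PREFIX):].strip("/")
    return "index,follow" if term in keyword_allowlist else "noindex,follow"
```

As static pages start outranking their search-page counterparts, terms get removed from the allowlist until it is empty.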
Intermediate & Advanced SEO | FPD_NYC0
-
Best way to remove low quality paginated search pages
I have a website that has around 90k pages indexed, but after doing the math I realized that I only have around 20-30k pages that are actually high quality, the rest are paginated pages from search results within my website. Every time someone searches a term on my site, that term would get its own page, which would include all of the relevant posts that are associated with that search term/tag. My site had around 20k different search terms, all being indexed. I have paused new search terms from being indexed, but what I want to know is if the best route would be to 404 all of the useless paginated pages from the search term pages. And if so, how many should I remove at one time? There must be 40-50k paginated pages and I am curious to know what would be the best bet from an SEO standpoint. All feedback is greatly appreciated. Thanks.
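On "how many should I remove at one time": one cautious approach is to retire the low-quality search-term URLs in fixed-size batches rather than 404ing all 40-50k at once, so crawl behavior and traffic can be watched between waves. A sketch (the batch size of 1,000 is an arbitrary assumption for illustration, not a Google-documented limit):

```python
def retirement_batches(all_urls, keep, size=1000):
    """Split retired search-term URLs into fixed-size batches.

    `all_urls` is every indexed search-term URL; `keep` is the set of
    URLs judged high quality enough to stay. Everything else is queued
    for removal (404/410) one batch at a time."""
    retired = [u for u in all_urls if u not in keep]
    return [retired[i:i + size] for i in range(0, len(retired), size)]
```

Each batch would then be made to return 404 (or 410, which some crawlers treat as a stronger "permanently gone" signal) before moving on to the next.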
Intermediate & Advanced SEO | WebServiceConsulting.com0
-
Does putting a Google custom search box on my site make Google think my users are bouncing?
I added a Google custom search box to my pages, that's doing an advanced Google search. A lot of people are using it. So users are coming to my site from a Google search, and then often performing another Google search on my site. Should I be worried that Google may interpret the resultant user behavior as a bounce or pogo-stick? Or will the fact that the second search occurred on my site, using custom search, and with advanced parameters signal to Google that this is not a dissatisfied user returning to Google? Thanks
Intermediate & Advanced SEO | GilReich0
-
Are prices shown in search results good for e-commerce sites?
Hello here. I own an e-commerce website (virtualsheetmusic.com), and since we implemented structured data for our product pages, our search results on Google appear with pricing information, whereas most of our competitors don't have that information displayed (yet). I am wondering: do you think that is good? What side effects could it cause? Lower CTR? Lower bounce rate? Less traffic? Any thoughts on this issue are very welcome. Thanks!
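The price shown in the snippet comes from schema.org Product/Offer structured data of the kind the poster describes. A minimal sketch of that JSON-LD, generated in Python (the product name and price here are made up for illustration):

```python
import json

def product_jsonld(name: str, price: float, currency: str = "USD") -> str:
    """Build a minimal schema.org Product/Offer JSON-LD block, the
    markup that lets search engines show price in the result snippet."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": currency,
            "availability": "https://schema.org/InStock",
        },
    }, indent=2)
```

The returned string would go inside a `<script type="application/ld+json">` tag on the product page. Once it is live, competitors can add the same markup, so the CTR edge it gives is real but not defensible.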
Intermediate & Advanced SEO | fablau0
-
What is the best way to hide duplicate, image embedded links from search engines?
Hello! Hoping to get the community's advice on a technical SEO challenge we are currently facing. [My apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I also provided enough detail.]

Context: I manage a human anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. While we tried our best to design our new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user interaction metrics. SEOMoz's Website Crawler helped us uncover that we now may have too many links on our pages and that this could be at least part of the reason behind the lower traffic; i.e. we are not making optimal use of links and are potentially 'leaking' link juice now. Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:

1. Clickable links embedded via JavaScript in our images. This allows users to explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on muscles of the arm and hand and you want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
2. Anatomy Terms lists (to the left of the image) that list all the different parts of the body on the image. This is for users who might not know where on the arm the biceps actually is. This user could simply click on the term "Biceps" and get to our biceps page that way.

Since many sections of the body have hundreds of smaller parts, this means many of our pages have 150 links or more each. And to make matters worse, in most cases, the links in the images and in the terms lists go to the exact same page.

My Question: Is there any way we could hide one set of links (preferably the anchor text-less image based links) from search engines, such that only one set of links would be visible? I have read conflicting accounts of different methods, from using JavaScript to embedding links into HTML5 tags. And we definitely do not want to do anything that could be considered black hat. Thanks in advance for your thoughts! Eric
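Before deciding what to hide, it may help to measure how many of the 150+ links per page are true duplicates (image link and terms-list link pointing at the same URL), since duplicate links to the same target are a much smaller problem than the raw count suggests. A small audit sketch using Python's standard html.parser (the sample markup in the test is hypothetical, not taken from the site):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect every <a href> target on a page, so total vs distinct
    link counts can be compared to spot pages where the image map and
    the terms list double every link."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def link_stats(html: str):
    """Return (total links, distinct link targets) for one page's HTML."""
    parser = LinkCounter()
    parser.feed(html)
    return len(parser.hrefs), len(set(parser.hrefs))
```

A page where total is roughly double distinct confirms the duplication described above; crawl-tool link warnings are usually about totals, so knowing the distinct count helps decide whether any hiding is needed at all.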
Intermediate & Advanced SEO | Eric_R0
-
How to get your company on the Google +, Right Hand side box of search results?
http://www.searchenginejournal.com/google-plus-content-replaces-ads/41452/ We have a Google Plus page, but the results aren't coming up there. Do you need a certain number of people in your circles? What are the criteria to get your brand here? Any links?
Intermediate & Advanced SEO | xoffie0
-
Search results all going to home page
I'm an author, and after doing a search for one of my books I realized that no matter what was searched, the user was being led to the homepage. Please see the attached pictures. How do I fix this, and is this hurting my SEO? (Attachments: Capture.JPG, Capture1.JPG)
Intermediate & Advanced SEO | StreetwiseReports0