Google Search Console Showing 404 errors for product pages not in sitemap?
-
We have some products whose URLs changed over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct NEW URLs).
Is this expected? Will these errors eventually go away/stop being monitored by Google?
-
@woshea Implement 301 redirects from the old URLs to the new ones. This tells search engines that the old page has permanently moved to a new location. It also ensures that visitors who click on old links are redirected to the correct content.
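One way to sketch this on a Python stack, assuming Flask (the product paths below are hypothetical placeholders; on Apache or Nginx the equivalent rule would live in the server config instead):

```python
# Minimal 301 redirect sketch, assuming a Flask app.
# OLD_TO_NEW and the /products/ route are hypothetical placeholders.
from flask import Flask, redirect

app = Flask(__name__)

# Retired product URLs mapped to their replacements.
OLD_TO_NEW = {
    "/products/old-widget": "/products/new-widget",
}

@app.route("/products/<slug>")
def product(slug):
    old_path = f"/products/{slug}"
    new_path = OLD_TO_NEW.get(old_path)
    if new_path:
        # 301 = permanent move; crawlers transfer signals to the new URL.
        return redirect(new_path, code=301)
    return f"Product page: {slug}"
```

For a large catalog, a server-level redirect map is usually preferable, but the effect on crawlers is the same: a 301 status plus a Location header pointing at the new URL.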
-
Yes, it is not uncommon for Google to show 404 errors for products with URL changes, even if the correct new URLs are listed in the sitemap. This is because Google's crawlers may take some time to recrawl and update their index with the new URLs.
Typically, these 404 errors should eventually go away and stop being monitored by Google once the search engine has fully indexed and recognized the new URLs. However, the time this process takes can vary with the frequency of Googlebot's crawls and the size of your website. I also ran into this issue on my site, a flyer maker app, and resolved it using the techniques below.
-
Ensure that your sitemap is up to date and includes the correct URLs for all of your products (a quick spot-check script is sketched after this list).
-
Check for any internal links on your website that may still point to the old URLs, and update them to the new ones (see the second sketch after this list).
-
Use 301 redirects from each old URL to its new URL, for example from a product's old URL to its new one. This tells Google and other search engines that the content has permanently moved to a new location (the Flask sketch under the first answer above shows one way to set this up).
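A minimal sketch for the sitemap check, assuming the `requests` package is installed; the sitemap URL is a hypothetical placeholder:

```python
# Sketch: verify every URL listed in a sitemap returns HTTP 200.
# SITEMAP_URL is a hypothetical placeholder; assumes `requests`.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url: str) -> None:
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).text)
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            # Anything else (404, 301, 500...) deserves a look.
            print(f"{status}  {url}")

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```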
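And a minimal sketch for the internal-link check, assuming `requests` and `beautifulsoup4` are installed; the page URL and old URL prefix are hypothetical placeholders:

```python
# Sketch: find <a> tags whose href still matches a retired URL pattern.
# PAGE_URL and OLD_PREFIX are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"
OLD_PREFIX = "/products/old-"

def find_stale_links(page_url: str, old_prefix: str) -> list[str]:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Collect every link that still references the old location.
    return [a["href"] for a in soup.find_all("a", href=True)
            if old_prefix in a["href"]]

if __name__ == "__main__":
    for href in find_stale_links(PAGE_URL, OLD_PREFIX):
        print("update this link:", href)
```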
-
Related Questions
-
What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
Hey guys! I work as a content creator for Zavza Seal, a contractor out of New York, and we're targeting 36+ cities in the Brooklyn and Queens areas with several services for home improvement. We got about 340 pages into our multi-location strategy, targeting our target cities with each service we offer, when we noticed that 200+ of our pages were "Crawled but not indexed" in Google Search Console. Here's what I think we may have done wrong. Let me know what you think...
- We used the same page template for all pages (we changed the content and sections, formatting, targeted keywords, and entire page strategy for areas with unique problems, trying to keep the user experience as unique as possible to avoid duplicate content or looking like we didn't care about our visitors).
- We used the same featured image for all pages (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher).
- We didn't use rel canonicals to tell search engines that these pages were made specifically for the areas.
- We didn't use alt tags until about halfway through.
- A lot of the URLs don't use the target keyword exactly.
- The NAP info and Google Maps embed is in the footer, so we didn't use it on the pages.
- We didn't use any content about the history of the city or anything like that (some pages did use content about historic buildings, low water tables, flood-prone areas, etc., if the area was known for that).
We were thinking of redoing the pages, starting from scratch and building unique experiences around each city, with testimonials, case studies, and content about problems that are common for property owners in the area, but I think they may be fixable with a rel canonical, city-specific content, and unique featured images on each page. What do you think is causing the problem? What would be the easiest way to fix it? I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear that duplicate content would creep in, because you can only say so much about, for example, "basement crack repair". Please let me know your thoughts. Here is one of the pages that is indexed, as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ I appreciate your time and concern. Have a great weekend!
Local SEO | | everysecond0 -
Best practices for retiring 100s of blog posts?
Hi. I wanted to get best practices for retiring an enterprise blog with hundreds of old posts whose subject matter won't be repurposed. What would be the best course of action to retire those old blog pages while maintaining the value of any SEO authority they hold? Is it enough to move the old posts into an archive subdirectory and let Google deprioritize them over time? Or would a mass redirect of old blog posts to the new blog's home page be acceptable (even though the old post content isn't being specifically replaced)? Or would Google basically say that, without 1:1 replacement URLs, those redirects would be seen as soft 404s and treated like 404s?
White Hat / Black Hat SEO | | David_Fisher0 -
Subdirectory site / 301 Redirects / Google Search Console
Hi There, I'm a web developer working on an existing WordPress site (Site #1) that has 900 blog posts accessible from this URL structure: www.site-1.com/title-of-the-post. We've built a new website for their content (Site #2) and programmatically moved all blog posts to the second website. Here is the URL structure: www.site-1.com/site-2/title-of-the-post. Site #1 will remain as a normal company site without a blog, and Site #2 will act as an online content membership platform. The original 900 posts have great link juice that we, of course, would like to maintain. We've already set up 301 redirects that take care of this process (i.e. the original post gets redirected to the same URL slug with '/site-2/' added). My questions: Do you have a recommendation about how to best handle this second website in Google Search Console? Do we submit this second website as an additional property in GSC (which shares the same top-level domain as the original)? Currently, the sitemap.xml submitted to Google Search Console has all 900 blog posts with the old URLs. Is there any benefit or drawback to submitting another sitemap.xml from the new website, which has all the same blog posts at the new URLs? Your guidance is greatly appreciated. Thank you.
Intermediate & Advanced SEO | | HimalayanInstitute0 -
Search Console Missing field 'mainEntity'
Hello,
SEO Tactics | | spaininternship
I have a problem: on my site I added an FAQ with schema structure (https://internships-usa.eu/faq/), but the following problem is appearing in Search Console:
Missing field 'mainEntity' ["WebPage","FAQPage"],"@id":"https://internships-usa.eu/faq/#webpage","url":"https://internships-usa.eu/faq/","name":"Help Center - Internships USA","isPartOf":{"@id":"https://internships-usa.eu/#website"},"datePublished":"2022-05-31T14:43:15+00:00","dateModified":"2022-06-01T08:07:13+00:00","breadcrumb":{"@id":"https://internships-usa.eu/faq/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https://internships-usa.eu/faq/"]}]}, What do I have to do to solve this?0 -
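For context, this warning typically means the FAQPage node is emitted without the nested Question/Answer items that Google expects under mainEntity. A minimal sketch of a complete block, built in Python with placeholder question and answer text:

```python
# Sketch of FAQPage JSON-LD including the mainEntity field Google expects.
# The question/answer text is placeholder content.
import json

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Example question?",
            "acceptedAnswer": {"@type": "Answer", "text": "Example answer."},
        }
    ],
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
print(json.dumps(faq_schema, indent=2))
```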
Unsolved /%25s
Hi Community, has anyone else had a 404 error reported by Moz where the end of the URL is /%25s? The error comes from my blog home page, https://kaydee.net/blog/. But when I look at the source code, I can't see anything that has a space at the end of the URL. I wonder if it is to do with the WordPress search? Thanks in advance for any insight.
Moz Pro | | kaydeeweb0 -
"Moz encountered an error on one or more pages on your site" Error
I have been receiving this error for a while: "Moz encountered an error on one or more pages on your site". It's a multi-lingual WordPress website, the robots.txt is set to allow crawlers on all links, and I have followed the same process as on other websites I've built, yet I'm receiving this error for this site.
Technical SEO | | JustinZimri0 -
Google webmaster errors
If you know what these Google Webmaster errors mean, and you can explain them to me in simple English and tell me how I can locate the problem, I would really appreciate it! The error types are: Server error, Soft 404, Access denied, Not found, Not followed.
Technical SEO | | Joseph-Green-SEO
I have many of these errors. Is it harming SEO? Yoseph
Notice of DMCA removal from Google Search
Dear Mozers, today I got from Google Webmaster Tools a "Notice of DMCA removal". I'll paste the notice here to get your opinions: "Hello, Google has been notified, according to the terms of the Digital Millennium Copyright Act (DMCA), that some of your materials allegedly infringe upon the copyrights of others. The URLs of the allegedly infringing materials may be found at the end of this message. The affected URLs are listed below: http://www.freesharewaredepot.com/productpages/Ultimate_Spelling__038119.asp" So I performed these steps: 1. Removed the page from the site (it now returns 404). 2. Removed it from the database (no longer listed in the directory, sitemap.xml, or RSS). 3. Filled out the "Google Content Removed Notification" form, detailing the removal of the page. My question: do I now have to do anything else, such as filing a site reconsideration request, or do I only have to wait? Thank you for your help. Claudio
Technical SEO | | SharewarePros0