Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12 December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Google Search Console Showing 404 errors for product pages not in sitemap?
-
We have some products whose URLs changed over the past several months. Google Search Console is reporting 404 errors for the old URLs even though they are not in the sitemap (the sitemap lists the correct new URLs).
Is this expected? Will these errors eventually go away/stop being monitored by Google?
-
@woshea Implement 301 redirects from the old URLs to the new ones. This tells search engines that the old page has permanently moved to a new location. It also ensures that visitors who click on old links are redirected to the correct content.
-
Yes, it is not uncommon for Google to show 404 errors for products with URL changes, even if the correct new URLs are listed in the sitemap. This is because Google's crawlers may take some time to recrawl and update their index with the new URLs.
Typically, these 404 errors should eventually go away and stop being reported by Google once the search engine has fully indexed and recognized the new URLs. However, how long that takes can vary with the frequency of Googlebot's crawls and the size of your website. I ran into the same issue on my own site and resolved it using the techniques below.
-
Ensure that your sitemap is up-to-date and includes all the correct URLs for your products.
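One way to verify this step is to parse the sitemap and diff it against the URLs you expect to be listed. A minimal sketch using only the standard library; the URLs below are hypothetical placeholders, not real pages:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; in practice you would fetch your live sitemap.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/products/new-widget</loc></url>
  <url><loc>https://example.com/products/other-item</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the set of <loc> values in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)}

# URLs we expect the sitemap to contain (hypothetical).
expected = {"https://example.com/products/new-widget"}
missing = expected - sitemap_urls(sitemap_xml)
print(sorted(missing))  # an empty list means every expected URL is present
```

Running this against your real sitemap (and the full list of new product URLs) quickly surfaces any pages the sitemap is missing.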
-
Check for any internal links on your website that may still be pointing to the old URL and update them to the new URL.
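Stale internal links can be found by scanning each page's HTML for anchors that still point at retired URLs. A minimal sketch with the standard library's `html.parser`; the paths are hypothetical examples:

```python
from html.parser import HTMLParser

# Hypothetical old product paths we no longer want to link to internally.
OLD_URLS = {"/products/old-widget"}

class LinkAudit(HTMLParser):
    """Collect href values that still point at retired product URLs."""
    def __init__(self):
        super().__init__()
        self.stale = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href in OLD_URLS:
                self.stale.append(href)

# A sample page fragment; in practice you would feed each crawled page.
page = '<p><a href="/products/old-widget">Widget</a> <a href="/products/new-widget">New</a></p>'
audit = LinkAudit()
audit.feed(page)
print(audit.stale)  # any hrefs listed here should be updated to the new URL
```

Run the audit over every page in a site crawl and update each reported link to the new URL.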
-
Use 301 redirects from the old URL to the new URL. For example, set up a 301 redirect from each product's old URL to its new URL. This tells Google and other search engines that the content has permanently moved to a new location.
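With many changed products, it can help to generate the redirect rules from a single old-to-new mapping rather than writing them by hand. A small sketch that emits Apache `Redirect 301` lines (mod_alias syntax) for an `.htaccess` file; the paths are hypothetical placeholders:

```python
# Hypothetical mapping of old product paths to their new locations.
REDIRECTS = {
    "/products/old-widget": "/products/new-widget",
    "/products/old-gadget": "/products/new-gadget",
}

def htaccess_rules(mapping):
    """Emit one Apache 'Redirect 301' line per old/new pair."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(htaccess_rules(REDIRECTS))
```

Paste the generated lines into `.htaccess` (or translate them to your server's equivalent, e.g. nginx `return 301`), then spot-check a few old URLs to confirm they return a 301 status.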
-