Unsolved Google Search Console Still Reporting Errors After Fixes
-
Hello,
I'm working on a website that had become bloated with content. We deleted many pages and set up redirects to newer ones. We also resolved a large number of 400 errors on the site.
I also removed several ancient sitemaps that listed content deleted years ago that Google was crawling.
According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails.
What could be going on here? How can we resolve these errors in GSC?
-
Here are some potential explanations and steps you can take to resolve the errors in GSC:
Caching: Sometimes, GSC may still be using cached data and not reflecting the recent changes you made to your website. To ensure you're seeing the most up-to-date information, try clearing your browser cache or using an incognito window to access GSC.
Delayed Processing: It's possible that Google's systems have not yet processed the changes you made to your website. Although Google typically crawls and indexes websites regularly, it can take some time for the changes to be fully reflected in GSC. Patience is key here, and you may need to wait for Google to catch up.
Incorrect Implementation of Redirects: Double-check that the redirects you implemented are correctly set up. Make sure they are functioning as intended and redirecting users and search engines to the appropriate pages. You can use tools like Redirect Checker to verify the redirects.
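If you'd rather spot-check redirects yourself than rely on a third-party checker, a short script can do it. This is a minimal sketch using only the Python standard library; the URLs are hypothetical placeholders, and the rule it encodes is the one that matters for GSC: a healthy fix is exactly one permanent hop (301 or 308) that lands on the intended target with a 200, not a 302 and not a chain.

```python
# Sketch: verify a redirect is a single permanent hop to the right target.
# All URLs here are made-up examples; substitute your own old/new pages.
import urllib.request

class HopRecorder(urllib.request.HTTPRedirectHandler):
    """Records each redirect hop as (status_code, new_url)."""
    def __init__(self):
        self.hops = []

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        self.hops.append((code, newurl))
        return super().redirect_request(req, fp, code, msg, headers, newurl)

def chain_is_clean(hops, final_status, final_url, expected_target):
    # One permanent hop (301/308), ending in a 200 on the intended page.
    return (
        len(hops) == 1
        and hops[0][0] in (301, 308)
        and final_status == 200
        and final_url == expected_target
    )

def check_redirect(old_url, expected_target):
    recorder = HopRecorder()
    opener = urllib.request.build_opener(recorder)
    with opener.open(old_url, timeout=10) as resp:
        return chain_is_clean(recorder.hops, resp.status,
                              resp.geturl(), expected_target)
```

Running `check_redirect("https://example.com/old-page", "https://example.com/new-page")` against each deleted URL will flag 302s and redirect chains, both of which can slow Google down in recognizing a fix as permanent.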
Check Robots.txt: Ensure that your website's robots.txt file is not blocking Googlebot from accessing the necessary URLs. Verify that the redirected and fixed pages are not disallowed in the robots.txt file.
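That check can also be scripted with the standard library's robots.txt parser. The robots.txt content and URLs below are made-up examples; in practice you would load your live file (e.g. with `RobotFileParser.set_url(...)` and `.read()`) and test the URLs that keep failing validation.

```python
# Sketch: confirm Googlebot is allowed to fetch a given URL under a
# robots.txt policy. The policy and URLs here are illustrative only.
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, url):
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

robots = """User-agent: *
Disallow: /old-archive/
"""

print(googlebot_allowed(robots, "https://example.com/old-archive/page"))  # False: blocked
print(googlebot_allowed(robots, "https://example.com/new-page"))          # True: crawlable
```

If a redirected page's destination is disallowed, Googlebot can never confirm the fix, and validation will keep failing even though crawlers like Screaming Frog (which may ignore robots.txt, depending on configuration) report it as resolved.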
Verify Correct Domain Property: Ensure that you have selected the correct domain property in GSC that corresponds to the website where you made the changes. It's possible that you might be validating the wrong property, leading to repeated failures.
Inspect URL Tool: Utilize the "Inspect URL" tool in GSC to manually check specific URLs and see how Google is currently processing them. This tool provides information about indexing status, crawling issues, and any potential errors encountered.
Re-validate the Fixes: If you have already submitted the fixes for validation in GSC and they failed, try submitting them again. Sometimes, the validation process can encounter temporary glitches or errors.
If you have taken the appropriate steps and the validation failures persist in GSC, it may be worth reaching out to Google's support team for further assistance. They can help troubleshoot the specific issues you are facing and provide guidance on resolving the errors.
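Since stale sitemaps were part of the original problem, it's also worth auditing what your current sitemaps actually submit. A minimal sketch (with made-up sitemap content) for pulling out every listed URL, so each one can then be status-checked with the redirect helper above or any HTTP client:

```python
# Sketch: extract the <loc> URLs a urlset sitemap submits to Google.
# The sample sitemap is illustrative; fetch your real one in practice.
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL in a urlset sitemap, in document order."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/new-page</loc></url>
  <url><loc>https://example.com/deleted-page</loc></url>
</urlset>"""

print(sitemap_urls(sample))
# ['https://example.com/new-page', 'https://example.com/deleted-page']
```

Any URL in the output that returns a 404, a 302, or a redirect chain is a candidate for why validation keeps failing: Google retests the URLs it originally flagged, not the ones your crawler sees today.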
-
We're facing the same redirect errors even after fixing them on our website, https://ecomfist.com/.
-
@tif-swedensky It usually takes between a week and three months for the correct results to show, so don't worry about that. If you've fixed the issues, you're good to go.
-
Hi! Google Search Console has this issue; I'd recommend not paying much attention to it. If you know that everything is correct on the website, then you don't need to worry just because of Search Console reports.
-
In this case, it's likely that Googlebot crawled your site before you fixed the errors and hasn't yet recrawled to detect the changes. To address this, you may want to invest in premium SEO tools such as Ahrefs or Screaming Frog that can audit your website both before and after you make changes. Once you have them in place, take screenshots of the findings from before and after fixing the issues and send those to your client so they can see the improvements that have been made.
To give you an example, I recently encountered a similar issue while working with a medical billing company named HMS USA LLC. After running some SEO audits and making various fixes, the GSC errors had been cleared. However, it took a few attempts to get it right as the changes weren't detected on the first recrawl.
Hopefully, this information is useful and helps you understand why your GSC issues may still be showing up after being fixed. Good luck!
-
Hi,
We have had a similar problem before. We are an e-commerce company with the brand name VANCARO. As you know, user experience is very important for an e-commerce company, so we take the problems reported by GSC very seriously. But sometimes GSC updates can be delayed, so you may need to wait and observe a little longer. I can also share another tool: https://pagespeed.web.dev/. Hope it can help you.
Related Questions
-
Duplicate content homepage - Google canonical 'N/A'?
Hi, I redesigned a client's website and launched it two weeks ago. Since then, I have 301 redirected all old URLs in Google's search results to their counterparts on the new site. However, none of the new pages are appearing in the search results and even the homepage has disappeared. Only old site links are appearing (even though the old website has been taken down), and in GSC it states: "Page is not indexed: Duplicate, Google chose different canonical than user". However, when I try to understand how to fix the issue and see which URL it is claiming to be a duplicate of, it says: "Google-selected canonical: N/A". It says that the last crawl was only yesterday - how can I possibly fix it without knowing which page it says it's a duplicate of? Is this something that just takes time, or is it permanent? I would understand if it was just Google taking time to crawl and index the pages, but it seems adamant that it's not going to show any of them at all.
Technical SEO | goliath910
-
Subdirectory site / 301 Redirects / Google Search Console
Hi There, I'm a web developer working on an existing WordPress site (Site #1) that has 900 blog posts accessible from this URL structure: www.site-1.com/title-of-the-post We've built a new website for their content (Site #2) and programmatically moved all blog posts to the second website. Here is the URL structure: www.site-1.com/site-2/title-of-the-post Site #1 will remain as a normal company site without a blog, and Site #2 will act as an online content membership platform. The original 900 posts have great link juice that we, of course, would like to maintain. We've already set up 301 redirects that take care of this process (i.e., the original post gets redirected to the same URL slug with '/site-2/' added). My questions: Do you have a recommendation about how to best handle this second website in Google Search Console? Do we submit this second website as an additional property in GSC (which shares the same top-level domain as the original)? Currently, the sitemap.xml submitted to Google Search Console has all 900 blog posts with the old URLs. Is there any benefit or drawback to submitting another sitemap.xml from the new website, which has all the same blog posts at the new URLs? Your guidance is greatly appreciated. Thank you.
Intermediate & Advanced SEO | HimalayanInstitute
-
Backlinks on Moz not on Google Search Console
Moz is showing thousands of backlinks to my site that are not showing up on Google Search Console - which is good, because those links were created by some spammer in Pakistan somewhere. I haven't yet submitted a disavow report to Google of well over 10K links because the list keeps growing every day with new backlinks that have been rerouted to a 404 page. I have asked Google to clarify and they put my question on their forum for an answer, which I'm still waiting for - so I thought I'd try my luck here. My question... If Moz does not match Google Search Console, and backlinks are important to results, how valid is the ranking that Moz creates to let me know how I'm doing in this competition and whether I'm improving or not? If the goal is to get Google to pay attention and I use Moz to help me figure out how to do this, how can I do that if the backlink information differs - by literally over 10,000 backlinks created by some spammer doing odd things? They've included the URL from their deleted profile on my site with hundreds of other URLs, including Moz.com, and are posting them everywhere with their preferred anchor text. Moz's ranking counts the thousands of spam backlinks I can't get rid of, while Google ignores or disavows them. So aren't the rankings, data, and graphs apples and bananas? How can I know what my site's real strength is, and whether I'm improving, if the data doesn't match? Complete SEO novice, Shannon Peel
Link Building | MarketAPeel
-
Unsolved Have we been penalised?
Hey Community, we need help! Have we been penalised, or is there some technical SEO issue that is stopping our service pages from being properly read? Website: www.digitalnext.com.au
In July 2021, we suffered a huge drop in coverage for both short and long-tail keywords. We thought that this could have been because of the link spam, Core Web Vitals, or core update around that time period. SEMRush: https://gyazo.com/d85bd2541abd7c5ed2e33edecc62854c GSC: https://gyazo.com/c1d689aff3506d5d4194848e625af6ec
There is no manual action within GSC and we have historically ranked page 1 for super-competitive keywords. After waiting some time thinking it was an error, we took the following actions: launched a new website; rewrote all page content (except blog posts); ensured each page passes Core Web Vitals; submitted a backlink detox; removed a website that was spoofing our old one; introduced a strong pillar-and-cluster internal link structure.
After 3 months of the new website, none of our core terms has come back and we are struggling for visibility. We still rank for some super-long-tail keywords, but this is the lowest visibility we have had in over 5 years. Every time we launch a blog post it ranks for competitive keywords, yet the old keywords are still completely missing. It almost feels like any URLs that used to rank for core terms are being penalised. So I am wondering whether this is a penalisation (and if so, by which algorithm), or whether there is something wrong with the structure of our service pages that stops them ranking. Look forward to hearing from you, Steven
Technical SEO | StevenLord
-
Unsolved Almost every new page become Discovered - currently not indexed
Almost every new page that I create becomes "Discovered - currently not indexed". It started a couple of months ago; before that, all pages were indexed within a couple of weeks. Now there are pages that have not been indexed since the beginning of September. From a technical point of view, the pages are fine and acceptable to Googlebot. The pages are in the sitemap and have content - basically, texts of 1,000+ or 2,000+ words. I've tried adding new content to pages and even transferring content to a new page with a different URL, but this way I managed to get only a couple of pages indexed. Could it be that, until September of this year, I hadn't added new content to the site for several months? Has anyone encountered a similar problem? Please help, I am already losing heart.
Product Support | roadlexx
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' is showing a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section is showing 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the indexed master URL is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed - but how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' is showing a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so - fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks, Dan
Technical SEO | Dan-Lawrence
-
Page disappeared from Google index. Google cache shows page is being redirected.
My URL is: http://shop.nordstrom.com/c/converse Hi. The week before last, my top Converse page went missing from the Google index. When I "fetch as Googlebot" I am able to get the page and "submit" it to the index. I have done this several times and still cannot get the page to show up. When I look at the Google cache of the page, it comes up with a different page. http://webcache.googleusercontent.com/search?q=cache:http://shop.nordstrom.com/c/converse shows: http://shop.nordstrom.com/c/pop-in-olivia-kim Back story: As far as I know we have never redirected the Converse page to the Pop-In page. However the reverse may be true. We ran a Converse based Pop-In campaign but that used the Converse page and not the regular Pop-In page. Though the page comes back with a 200 status, it looks like Google thinks the page is being redirected. We were ranking #4 for "converse" - monthly searches = 550,000. My SEO traffic for the page has tanked since it has gone missing. Any help would be much appreciated. Stephan
Technical SEO | shop.nordstrom
-
Error msg 'Duplicate Page Content', how to fix?
Hey guys, I'm new to SEO and am getting the error message 'Duplicate Page Content'. Of course I know what it means, but my question is: how do you delete the old pages that have duplicate content? I used to run my website through Joomla! but have since moved to Shopify. I see that the duplicated site content is still from the old Joomla! site, and I would like to learn how to delete this content (or the best practice in this situation). Any advice would be very helpful! Cheers, Peter
Technical SEO | pjuszczynski