Google Search Console Still Reporting Errors After Fixes
-
Hello,
I'm working on a website that had become bloated with content. We deleted many pages and set up redirects to newer pages, and we also resolved an unusually large number of 400 errors on the site.
I also removed several ancient sitemaps that listed content deleted years ago, which Google was still crawling.
According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails.
What could be going on here? How can we resolve these errors in GSC?
-
Here are some potential explanations and steps you can take to resolve the errors in GSC:
Caching: Sometimes, GSC may still be using cached data and not reflecting the recent changes you made to your website. To ensure you're seeing the most up-to-date information, try clearing your browser cache or using an incognito window to access GSC.
Delayed Processing: It's possible that Google's systems have not yet processed the changes you made to your website. Although Google typically crawls and indexes websites regularly, it can take some time for the changes to be fully reflected in GSC. Patience is key here, and you may need to wait for Google to catch up.
Incorrect Implementation of Redirects: Double-check that the redirects you implemented are correctly set up. Make sure they are functioning as intended and redirecting users and search engines to the appropriate pages. You can use a tool like Redirect Checker to verify them, or script the check yourself (see the first sketch below).
Check Robots.txt: Ensure that your website's robots.txt file is not blocking Googlebot from accessing the necessary URLs. Verify that the redirected and fixed pages are not disallowed in the robots.txt file (the second sketch below shows a quick way to test this).
Verify Correct Domain Property: Ensure that you have selected the correct domain property in GSC that corresponds to the website where you made the changes. It's possible that you might be validating the wrong property, leading to repeated failures.
Inspect URL Tool: Use the "Inspect URL" tool in GSC to manually check specific URLs and see how Google is currently processing them. It reports indexing status, crawling issues, and any errors encountered; for checking many URLs at once, see the API sketch below.
Re-validate the Fixes: If you have already submitted the fixes for validation in GSC and they failed, try submitting them again. Sometimes, the validation process can encounter temporary glitches or errors.
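On the redirect point: if you'd rather script the check than rely on a third-party checker, here is a minimal sketch in Python using the requests library. It assumes single-hop 301 redirects, and the URL pairs are placeholders for your own old and new pages:

```python
import requests

# Placeholder pairs: old URL -> the new URL it should 301 to.
REDIRECTS = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/deleted-page": "https://example.com/replacement-page",
}

for old_url, expected in REDIRECTS.items():
    # allow_redirects=False returns the redirect response itself,
    # so we can inspect the status code and Location header directly.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")  # may be relative on some servers
    if resp.status_code == 301 and location == expected:
        print(f"OK: {old_url} -> {location}")
    else:
        print(f"CHECK: {old_url} returned {resp.status_code}, Location: {location}")
```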
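For the robots.txt point, Python's standard library ships a robots.txt parser, so a quick check needs no extra installs. A small sketch, where the domain and paths are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt for the site being checked.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Placeholder paths: the fixed/redirected pages you want Google to reach.
for path in ["/new-page", "/replacement-page"]:
    url = f"https://example.com{path}"
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'allowed' if allowed else 'BLOCKED for Googlebot'}")
```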
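And for inspecting URLs in bulk rather than one at a time in the GSC interface, Google also exposes a URL Inspection API through the Search Console API. A sketch using the google-api-python-client library, assuming a service account that has been granted access to your GSC property (the key file name, site, and URL are placeholders):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder key file; the service account must be added as a user
# on the GSC property for this to work.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Ask Google how it currently sees one of the fixed URLs.
response = service.urlInspection().index().inspect(
    body={
        "inspectionUrl": "https://example.com/new-page",
        "siteUrl": "https://example.com/",  # must exactly match the GSC property
    }
).execute()

result = response["inspectionResult"]["indexStatusResult"]
print(result.get("verdict"), "-", result.get("coverageState"))
```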
If you have taken the appropriate steps and the validation failures persist in GSC, it may be worth reaching out to Google's support team for further assistance. They can help troubleshoot the specific issues you are facing and provide guidance on resolving the errors.
-
I'm facing the same error with redirects, even after the fix, on our website https://ecomfist.com/.
-
@tif-swedensky It usually takes between a week and three months to show the right results, so do not worry about that. If you've fixed the issues, you're good to go.
-
Hi! Google Search Console has this issue, and I would recommend not paying much attention to it. If you know that everything is correct on the website, then you don't need to worry just because of Search Console reports.
-
In this case, it's likely that Googlebot crawled your site before you fixed the errors and hasn't yet recrawled it to detect the changes. To verify the fixes, it helps to use premium SEO tools such as Ahrefs or Screaming Frog that can audit your website both before and after you make changes. Once you have them in place, take screenshots of the findings before and after fixing the issues and send those to your client so they can see the improvements that have been made; if the budget doesn't stretch that far, a rough snapshot can also be scripted (see the sketch below).
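A minimal sketch of that scripted alternative, assuming Python with the requests library: it writes a dated CSV of status codes so a run from before the fixes can be diffed against a run from after (the URL list is a placeholder):

```python
import csv
import datetime
import requests

# Placeholder list: the URLs you fixed or redirected.
URLS = [
    "https://example.com/old-page",
    "https://example.com/deleted-page",
]

# Write a dated snapshot so "before" and "after" runs can be compared.
stamp = datetime.date.today().isoformat()
with open(f"status-snapshot-{stamp}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status", "final_url"])
    for url in URLS:
        resp = requests.get(url, timeout=10)  # follows redirects by default
        writer.writerow([url, resp.status_code, resp.url])
```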
To give you an example, I recently encountered a similar issue while working with a medical billing company named HMS USA LLC. After running some SEO audits and making various fixes, the GSC errors had been cleared. However, it took a few attempts to get it right as the changes weren't detected on the first recrawl.
Hopefully, this information is useful and helps you understand why your GSC issues may still be showing up after being fixed. Good luck!
-
Hi,
We have had a similar problem before. We are an e-commerce company with the brand name VANCARO. As you know, user experience is very important for an e-commerce company, so we take the problems reported by GSC very seriously. But sometimes GSC's updates can be delayed, so you may need to observe for a little more time. I can also share another tool: https://pagespeed.web.dev/. Hope it can help you.
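That tool also has a public API, so you can pull the performance score programmatically if you want to track it over time. A small sketch in Python (the URL is a placeholder, and an API key is only needed for heavier usage):

```python
import requests

# Public PageSpeed Insights v5 endpoint.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(
    API,
    params={"url": "https://example.com/", "strategy": "mobile"},
    timeout=60,
)
resp.raise_for_status()
data = resp.json()

# Lighthouse reports the performance score on a 0-1 scale.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```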
Related Questions
-
GSC problem: how to solve?
Hi all, Google Search Console gives me an error on these pages: info:https://www.varamedia.be/?utm_content=bufferbaaa4&utm_medium=social&utm_source=plus.google.com&utm_campaign=buffer info:https://www.varamedia.be/?utm_content=bufferece3f&utm_medium=social&utm_source=plus.google.com&utm_campaign=buffer I see there's UTM tracking in the URLs from Google+. We do have an account there, but I don't see how this might cause an error. Is this hurting our ranking score? How can we solve this?
Reporting & Analytics | Varamedia
-
Strange - Search Console page indexing "../Detected" as 404
Anyone seen this lately? All of a sudden, Google Search Console is insisting in its Page indexing report that there is a 404 for a page that has never existed on our client's site: https://........com.au/Detected We've noticed this across a number of sites, precisely in this way, with a capitalised "/Detected". To me it looks like something spammy is being submitted to the SERPs (somehow) and Google is trying to index that and then getting a 404. Naturally Moz isn't picking it up, because the page simply never existed - it's just happening in Search Console. It comes and goes in the 404 alerts in Console and is really annoying. I reckon it started happening in late 2022.
Reporting & Analytics | DanielDL
-
Backlinks on Moz not on Google Search Console
Moz is showing thousands of backlinks to my site that are not showing up in Google Search Console - which is good, because those links were created by some spammer in Pakistan somewhere. I haven't yet submitted a disavow report to Google of well over 10,000 links, because the list keeps growing every day with new backlinks that have been rerouted to a 404 page. I have asked Google to clarify and they put my question on their forum for an answer, which I'm still waiting for - so I thought I'd try my luck here. My question: if Moz does not match Google Search Console, and backlinks are important to results, how valid is the ranking that Moz creates to let me know how I'm doing in this competition and whether I'm improving or not? If the goal is to get Google to pay attention, and I use Moz to help me figure out how to do this, how can I do that when the backlink information differs by literally over 10,000 backlinks created by some spammer doing odd things? They've included the URL from their deleted profile on my site with hundreds of other URLs, including Moz.com, and are posting them everywhere with their preferred anchor text. Moz's ranking counts the thousands of spam backlinks I can't get rid of, while Google ignores or disavows them. So aren't the rankings, data, and graphs apples and bananas? How can I know what my site's real strength is, and whether I'm improving, if the data doesn't match? Complete SEO Novice, Shannon Peel, Brand Storyteller, MarketAPeel
Link Building | MarketAPeel
-
Google Not Indexing Pages (WordPress)
Hello, recently I started noticing that Google is not indexing our new pages or our new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late 90s and the quality of the new content is neither duplicate nor "low quality". We started noticing this happening around February. We also do not have many pages - maybe 500 maximum? I have looked at all the obvious answers (allowing for indexing, etc.), but just can't seem to pinpoint a reason why. Has anyone had this happen recently? It is getting very annoying having to manually go in and request indexing for every page, and it makes me think there may be some underlying issues with the website that should be fixed.
Technical SEO | Hasanovic
-
I submitted sitemaps from AIO SEO to Google Search Console; if I now delete the AIO plugin, do my sitemaps become invalid?
I use Yoast as the SEO plugin for my new WordPress website https://www.satisfiedshoes.com/; however, I couldn't get the sitemaps working with Yoast, as it was giving me a 404 error, and regardless of what I tried, it wasn't working. So I then installed All In One SEO while still having Yoast installed, easily generated the AIO sitemaps, and submitted them successfully to Google Search Console. My question is: now that Google has the sitemaps, if I delete AIO (since I'd rather use Yoast), will the sitemaps given to Google become invalid? There is no point keeping both SEO plugins active, right? Thank you
Technical SEO | iamzain16
-
WMT "Index Status" vs Google search site:mydomain.com
Hi - I'm working for a client with a manual penalty. In their WMT account they have 2 pages indexed. If I search for "site:myclientsdomain.com" I get 175 results, which is about right. I'm not sure what to make of the 2 indexed pages - any thoughts would be very appreciated.
Technical SEO | JohnBolyard
-
How to fix Google index after fixing site infected with malware.
Hi All, I upgraded a Joomla site for a customer a couple of months ago that was infected with malware (it wasn't flagged as infected by Google). The site is fine now, but I'm still noticing search queries for "cheap adobe" etc. with links to http://domain.com/index.php?vc=201&Cheap_Adobe_Acrobat_xi in Webmaster Tools (about 50 in total). These URLs redirect back to the home page and seem to be remaining in the index (I think Joomla is doing this automatically). Firstly, what sort of effect would these be having on their rankings? Would they be seen by Google as duplicate content for the homepage (Moz doesn't report them as such, as there are no internal links)? Secondly, what's my best plan of attack to fix them? Should I set up 404s for them and then submit them to Google? Will resubmitting the site to the index fix things? Would appreciate any advice or suggestions on the ramifications of this and how I should fix it. Regards, Ian
Technical SEO | iragless
-
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with thousands of articles, some going back to the early 2000s. There is some content that was duplicated from other sites, does not have any external links to it, and gets little or no traffic. As we weed these out, we set them to 410 to let the Goog know that this is not an error - we are getting rid of them on purpose, and so the Goog should too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools. (Question) I have been going through and "Marking as Fixed" in GWT to clear these pages out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 way I intended, and I am betting Google means "fixed" as in they now show a 200 (if that makes sense). Any opinions on the best way to handle this? Thx!
Technical SEO | CleverPhD