Google Search Console Crawl Errors?
-
We are using Google Search Console to monitor Crawl Errors. It seems Google is listing errors that are not actual errors. For instance, it shows this as "Not found":
https://tapgoods.com/products/tapgoods__8_ft_plastic_tables_11_available
So the page does not exist, but we cannot find any pages linking to it. The report has a "Linked from" tab, but when I look at the source of those pages, the link is not there. In this case it shows the front page (listed twice, once for http and once for https). Also, one of the pages it lists as linking to the non-existent page above is itself a non-existent page.
We marked all the errors as fixed last week, and this week they came up again. Two-thirds are the same pages we marked as fixed last week.
Is this an issue with Google Search Console? Are we being penalized for a non-existent issue?
-
Agreed with Chris. When you have a lot of pages and your code is a bit more complex than basic markup, Google Search Console has a habit of reporting phantom errors like this. What I have also seen in the past is that Google picks up parts of your tracking code and tries to find URL structures within it, reporting URLs that don't really exist on the site but do appear as fragments in the code.
Nothing to really worry about. If you run a monthly or quarterly crawl to check for weird URL structures on your site and these URLs don't pop up there, you should be fine. As mentioned, just mark them as fixed so the real issues move up again.
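To illustrate the point above: a crawler scanning raw page source can mistake string literals inside JavaScript for links. The tracking snippet below is hypothetical, but it shows how a path fragment in the code can surface as a "Not found" URL; a minimal sketch in Python:

```python
import re

# Hypothetical tracking JavaScript. The string literals below are not real
# links, but a crawler scraping the raw source may still extract them.
tracking_js = """
ga('send', 'pageview', '/products/' + productSlug + '_' + availableCount);
var endpoint = "/products/tapgoods__8_ft_plastic_tables_11_available";
"""

# Roughly how a parser might pull path-like strings out of the source.
path_pattern = re.compile(r"[\"'](/[\w/.\-]+)[\"']")
candidates = path_pattern.findall(tracking_js)
print(candidates)
# → ['/products/', '/products/tapgoods__8_ft_plastic_tables_11_available']
```

If a quarterly crawl (or a quick grep like this over your templates and tag-manager snippets) never surfaces the reported URL as an actual link, it is almost certainly one of these phantom extractions.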
-
Hello,
You are not being penalized for these crawl errors, but it's important to keep monitoring them. Continue to mark them as fixed, and double-check to make sure none of them are genuinely broken. Many people have encountered the same issue you are describing; it appears to be an inaccuracy within Google's reporting. Another option is to 301 these 'fake' URLs, though that may be time-consuming for you. I would also double-check your sitemap and make sure these links are not listed there.
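The sitemap check can be scripted. The sitemap content below is a made-up stand-in; in practice you would fetch your live sitemap.xml and test each reported URL against it. A rough sketch in Python:

```python
import xml.etree.ElementTree as ET

# Stand-in sitemap content; in practice, fetch your real sitemap.xml instead.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://tapgoods.com/</loc></url>
  <url><loc>https://tapgoods.com/products/a-real-product</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
listed = {loc.text for loc in root.findall("sm:url/sm:loc", NS)}

# The phantom URL reported by Search Console.
phantom = "https://tapgoods.com/products/tapgoods__8_ft_plastic_tables_11_available"
print(phantom in listed)  # → False: the sitemap is not feeding Google this URL
```

If the phantom URLs never show up in the sitemap or in a crawl of your internal links, you can mark them fixed with a clear conscience.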
Hope this helps.
Chris
Related Questions
-
Google Docs
Hi Mozers, I was wondering what you guys think about indexing Google Docs files as Documents or Spreadsheets? Can you do that, and does it help if you want to get some content on the first page of Google? Also, can Google see that content and its links? When I deactivate JavaScript in Chrome, I can't see any of the content. Thanks
Intermediate & Advanced SEO | VeeamSoftware
-
Google Search Console
abc.com www.com http://abc.com http://www.abc.com https://abc.com https://www.abc.com
Intermediate & Advanced SEO | brianvest
-
Unnatural Links Warning Disappeared from Search Console Account
Hello all, In 2013 I had an Unnatural Links Warning message in my GWT account. I believe that it was a result of the work of an old SEO company. When I received the warning I was working with an SEO. He helped me clean up some links. He also uploaded a disavow file for me. He did not file a reconsideration request. He told me that it was not necessary at the time. The message disappeared from my account. A few months ago a similar message appeared in the manual accounts section of my account. I gathered inbound links from GWT, Majestic, etc. I went through them myself and tried to contact lots and lots of webmasters. I got many links cleaned up. I spent several months on this project. I just logged into my Search Console account this afternoon and clicked through everything and guess what... that manual penalty message is gone. So... what does that mean? I assume that I should still upload the disavow file for the sites that did not respond to me that are spammy. Should I still try to file a reconsideration request even though there doesn't seem to be a manual penalty? How should I proceed? Thanks. Melissa
Intermediate & Advanced SEO | pajamalady
-
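For reference, if you do upload a disavow file for the sites that did not respond, it is a plain UTF-8 text file with one URL or domain per line (the domains below are placeholders):

```text
# Links we contacted webmasters about but could not get removed
http://spam.example.com/some-page-linking-to-us.html

# Disavow every link from an entire domain
domain:spam.example.net
```

Lines starting with # are comments; the `domain:` prefix covers all pages on that domain.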
Crawl diagnostic issue?
I'm sorry if my English isn't very good, but this is my problem at the moment: on two of my campaigns I get a weird error in Moz Analytics: 605 Page Banned by robots.txt, X-Robots-Tag HTTP Header, or Meta Robots Tag. Moz Analytics points to a URL that starts with http://None/www.????.com. We don't understand how Moz indexed this non-existing page that starts with None, and how can we solve this error? I hope that someone can help me.
Intermediate & Advanced SEO | nettt
-
Site: inurl: Search
I have a site that allows multiple filter options, and some of these URLs have been indexed. I am in the process of adding the noindex, nofollow meta tag to these pages, but I want an idea of how many of these URLs have been indexed so I can monitor when they have been re-crawled and dropped. The structure for these URLs is: http://www.example.co.uk/category/women/shopby/brand1--brand2.html The unique identifier for the multiple-filter URLs is --, however I've tried using site:example.co.uk inurl:-- but this doesn't seem to work. I have also tried using regex but still no success. I was wondering if there is a way around this so I can get a rough idea of how many of these URLs have been indexed? Thanks
Intermediate & Advanced SEO | GrappleAgency
-
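Since `inurl:` chokes on the double hyphen, a rough count can also be had by exporting a URL list (from a crawl, server logs, or Search Console) and filtering it locally. A minimal sketch in Python, with placeholder URLs standing in for the export:

```python
# In practice this list would come from an exported file, read one URL per
# line (e.g. with open("url_list.txt")); these entries are placeholders.
urls = [
    "http://www.example.co.uk/category/women/shopby/brand1--brand2.html",
    "http://www.example.co.uk/category/women/shopby/brand3.html",
    "http://www.example.co.uk/category/men/shopby/brand1--brand4.html",
]

# The double hyphen only appears in multi-filter URLs, so it is a safe marker.
filtered = [u for u in urls if "--" in u]
print(len(filtered))  # → 2
```

Comparing this local count against what a `site:` search returns over time gives a rough sense of how many filtered URLs are dropping out of the index.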
Does having all client websites on same server/same Google Analytics red flag Google?
If you have several clients, and they are all on the same server and under ONE Google Analytics account, will that have a negative impact with Google? They all have different content and addresses; some have the same template, but with different images.
Intermediate & Advanced SEO | BBuck
-
Google Places
If you rank in Google Places, I have noticed that you do not rank on the front page as well. I have a site that ranks on the front page for its keywords; however, because it is (1) in Google Places, it doesn't show up when someone is localized to that area. It shows up in Google Places but not on the front page. If you turn off localization, it is first in the SERPs. How can I get around this? Two separate sites? One for Google+ (Places) and one for the SERPs?
Intermediate & Advanced SEO | JML1179
-
Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
I work on a large news site that is constantly being crawled by Google. Googlebot is hitting the homepage every 5-10 minutes. We are in the process of moving to a new CMS which has left our sitemap nonfunctional. Since we are getting crawled so often, I've met resistance from an overwhelmed development team that does not see creating sitemaps as a priority. My question is, are they right? What are some reasons that I can give to support my claim that creating an xml sitemap will improve crawl efficiency and indexing if we are already having new stories appear in Google SERPs within 10-15 minutes of publication? Is there a way to quantify what the difference would be if we added a sitemap?
Intermediate & Advanced SEO | BostonWright
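For context on what the development team would actually need to produce: a news sitemap entry is small. A minimal example follows (the URL, publication name, and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/article/breaking-story.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2015-06-01T12:00:00Z</news:publication_date>
      <news:title>Breaking Story</news:title>
    </news:news>
  </url>
</urlset>
```

Even on a frequently crawled site, a sitemap gives Google an explicit, machine-readable list of publication times, which is a stronger signal than waiting for the homepage crawl to discover each new story.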