404s in GWT - Not sure how they are being found
-
We have been getting multiple 404 errors in GWT that look like this: http://www.example.com/UpdateCart.
The problem is that this URL is not part of our structure; it is only a piece of one. The actual URL has a query string on the end, and if you take the query string off, the page does not work.
I can't figure out how Google is finding these pages. Could it be removing the query string?
Thanks.
-
Kelli - the first thing I thought was what garfield_disliker asks: have you set up Google Webmaster Tools to ignore these parameters that are important for the cart page to load?
That said, Google Webmaster Tools is run by a team that's separate from the primary search team, so it's possible that GWT is flagging something that isn't an actual issue for search. Run a search in Google for "site:yourdomain.com/UpdateCart" and see which URLs Google has indexed. If the 404ing URL is among them, that's not good. If only the correct URLs appear, this may just be a Google Webmaster Tools reporting quirk.
-
Hi,
Are you using the /updateCart URL in goal tracking, or pushing events to analytics with this URL? I have seen GWT pick up 404s from virtual (non-existent) page views that we pushed to analytics for goal tracking. Just a thought.
-
First, you can never be sure there are no external links. Open Site Explorer's index (like any other link analysis tool's) is not a full picture, and Google doesn't always report all of the inbound links to your site. The junkier the scraper, the less likely you are to see the link.
Secondly, could you provide a concrete example of this?
Where is the page (with parameters) linked from on your site? How is your site appending those parameters to the URL? Does it send users through a redirect to get to that URL? It might be useful to run your own crawl (with Screaming Frog or any other crawling software) of the site and take a look at all the internal links and the response codes.
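Screaming Frog will do this out of the box, but the first half of such a crawl — pulling the internal links out of a page — can be sketched with Python's standard library. This is only a sketch (the status-code check is left out, and the markup and URLs below are hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def internal_links(html: str, base_url: str) -> list:
    """Return links on the page that point at the same host as base_url."""
    collector = LinkCollector(base_url)
    collector.feed(html)
    host = urlsplit(base_url).netloc
    return [u for u in collector.links if urlsplit(u).netloc == host]

# Hypothetical page markup: one internal link, one external link.
page = '<a href="/UpdateCart?itemId=1">Add</a> <a href="http://other.com/">x</a>'
print(internal_links(page, "http://www.example.com/shop"))
# ['http://www.example.com/UpdateCart?itemId=1']
```

Feeding every crawled page through something like this, then requesting each collected URL and recording the response code, is essentially what the crawling tools do for you.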
Also have you set up Google WMT to ignore any parameters?
It's certainly possible that Google's crawlers are stripping parameters on their own.
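For what it's worth, what a parameter-stripping crawler would end up requesting is easy to reproduce with Python's standard library (the cart URL below is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_query(url: str) -> str:
    """Return the URL with its query string and fragment removed."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

# A full cart URL and what a crawler would request if it dropped the
# parameters -- the bare path that 404s.
full_url = "http://www.example.com/UpdateCart?itemId=123&qty=2"
print(strip_query(full_url))  # http://www.example.com/UpdateCart
```

If the bare path matches the URLs showing up in GWT, parameter stripping (wherever it is happening) would explain the reports.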
-
We do not dynamically inject canonicals into the page. They are also not old URLs because they have never been valid URLs.
They are all linked from internal pages, but when I look at those pages, the URL with the query string is the only URL that is being pointed to, not the partial URL. There are no external links.
Thanks,
Kelli -
In WMT click on the URL that is 404'd and then select "linked to from". It will show you where Google is picking up the 404 error.
Are these 404 pages being linked to from an external site? Sometimes the 404s that appear in WMT are from links pointing to your domain from an external site, often one that has scraped your site.
-
Does your website dynamically inject canonical links into the page? Some content management systems will automatically generate canonicals that strip parameters from the URL. If that's the case then that might be why you wouldn't see it in your ordinary site structure.
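One quick way to check for this is to fetch a cart page with its parameters and look at the canonical tag it returns; if the canonical drops the query string, that would explain the bare URLs. A rough sketch (the markup and URLs are hypothetical, and the regex assumes rel appears before href in the tag):

```python
import re
from urllib.parse import urlsplit

def canonical_from_html(html: str):
    """Pull the href out of a <link rel="canonical"> tag, if present."""
    match = re.search(r'<link[^>]+rel="canonical"[^>]+href="([^"]+)"', html)
    return match.group(1) if match else None

# A CMS that builds the canonical from the bare path would emit markup
# like this even when the page was requested with parameters:
html = '<head><link rel="canonical" href="http://www.example.com/UpdateCart"></head>'
requested = "http://www.example.com/UpdateCart?itemId=123"
canonical = canonical_from_html(html)
print(canonical != requested and urlsplit(canonical).query == "")  # True
```

If that comparison comes back True on your real pages, the CMS is telling Google that the parameterless URL is the preferred one.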
It's also possible that it's an old URL that Google indexed which is no longer on your site or something that is linked externally somewhere, so the crawlers are finding it somewhere off site.
Related Questions
-
Spam pages being redirected to 404s but still indexed
Client had a website that was hacked about a year ago. Hackers went in and added a bunch of spam landing pages for various products. This was before the site had installed an SSL certificate. After the hack, the site was purged of the hacked pages and an SSL certificate was implemented. Part of that process involved setting up a rewrite that redirects http pages to the https versions. The trouble is that the spam pages are still being indexed by Google, even months later. If I do a site: search I still see all of those spam pages come up before most of the key "real" landing pages. The thing is, the listings on the SERP are for the http versions, so they redirect to the https version before serving a 404. Is there any way I can fix this without removing the rewrite rule?
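One option that keeps the rewrite rule for everything else is to match the spam paths first and answer them directly before the https redirect fires. A hedged Apache sketch, assuming mod_rewrite is in use and using a hypothetical spam path (the [G] flag returns 410 Gone, which tells crawlers the page is permanently removed):

```apache
RewriteEngine On

# Answer known spam paths with 410 Gone before the https redirect runs.
# (/cheap-widgets/ is a hypothetical example of a hacked landing page.)
RewriteRule ^cheap-widgets/ - [G,L]

# Everything else still gets the http -> https redirect.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Because the 410 rule comes first with [L], the spam URLs never reach the redirect, so Google sees a clean removal signal at the http URL it has indexed.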
Technical SEO | | SearchPros1 -
Search Console has found over 18k 404 errors on my site, should I redirect?
Most of them were old URLs pointed to from a really old domain that we have just shut down. If the pages didn't receive any traffic, should we redirect? If I follow this https://moz.com/learn/seo/http-status-codes, we shouldn't.
Technical SEO | | pablo_carrara0 -
What is the best way to correct GWT telling me I have mobile usability errors in image directories?
In GWT, I wish to remove/resolve the following errors: Mobile Usability > Viewport not configured; Mobile Usability > Small font size; Mobile Usability > Touch elements too close. The domain www.sandpiperbeacon.com is responsive, and passes the mobile usability test. A new issue I noticed is that GWT is reporting 200+ errors just for image index pages such as http://www.sandpiperbeacon.com/images/special-events/ for example. Website users cannot access these pages (without editing the URL manually), so I don't consider these usability issues. BUT, I hate to see 200+ errors, especially when Google itself says "Websites with mobile usability issues will be demoted in mobile search results." I could set the image directories to disallow in robots.txt, but I do not want the images to stop appearing in image search, so this seems like a flawed solution. I cannot be the only person experiencing this, but I have been unable to find any suggestions online.
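If those /images/.../ pages are auto-generated directory listings rather than real pages, disabling directory indexing removes the listing pages at the source instead of hiding them with robots.txt (which would risk the images dropping out of image search). A sketch for Apache, assuming mod_autoindex is what generates the listings and using a hypothetical filesystem path:

```apache
<Directory "/var/www/html/images">
    # Stop Apache from generating browsable index pages; requests for
    # /images/special-events/ then return 403 instead of a listing,
    # while the image files themselves remain fetchable and indexable.
    Options -Indexes
</Directory>
```

The listing URLs would then drop out of the mobile usability report once Google recrawls them, and image search is unaffected because the files are still served normally.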
Technical SEO | | RobertoGusto0 -
404s affecting crawl rate?
We made a change to our site and all of a sudden we are creating a large number of 404 pages. Is this affecting the crawl/indexing rate? Currently we've submitted 3.4 million pages, have over 834K indexed, but have over 330K pages not found. Since the large increase in 404s we've noticed a decrease in pages crawled per day. I found this Q&A in Webmasters (http://googlewebmastercentral.blogspot.com/2011/05/do-404s-hurt-my-site.html) but it seems like the 404s should not have an effect. Is this article out of date? What do you think, fellow Moz-ers? Is this a problem?
Technical SEO | | JoshKimber0 -
Duplicate title error in GWT over capitalization in URL
Hi, How do I resolve a duplicate title error in GWT caused by different capitalization in the URL? Title of post: Minneapolis Median Home Sales Price Up 16 Percent. Not sure how this happened, but I have two URL versions showing up. Even with a 301 redirect, both remain errors in GWT. /real-estate-blog/Minneapolis-median-home-sales-price-up-16-percent and /real-estate-blog/minneapolis-median-home-sales-price-up-16-percent
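If the server is Apache, one common way to stop both casings from resolving is a RewriteMap that 301-redirects any uppercase path to its lowercase twin, so only one version can ever be crawled. This is a sketch, not a tested config, and RewriteMap must be declared in the main server or virtual host configuration (it is not allowed in .htaccess):

```apache
# In the main server or virtual host configuration:
RewriteEngine On
RewriteMap lowercase int:tolower

# 301 any request path containing uppercase letters to its lowercase form.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```

With that in place, the mixed-case URL answers with a 301 at the server level, and GWT should stop reporting the pair as duplicates once both are recrawled.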
Technical SEO | | jessential0 -
Lots of backlinks from Woorank reported by GWT
Hello, we just saw a lot of links (138) from the Woorank website reported in "Links to your site" in Google Webmaster Tools. Do you think we should submit that website for disavow in Google Webmaster Tools? Regards
Technical SEO | | helpgoabroad0 -
Time to deindexing: WMT Request vs. Server not found
Google indexed some subdomains (13!) that were never supposed to exist, but they apparently returned a 200 code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed just from this "server not found" error. I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. I'm not even sure, with this being the case, whether I can get WMT to recognize and remove these subdomains like that. Should I fret about this, or will the "server not found" message get Google to remove these soon enough?
Technical SEO | | erin_soc0 -
Removing GWT Geo-Targeting
Hi, if I have a website in Google Webmaster Tools > Geographic Target set to United States, but I want to promote it in other specific countries, 1. Do I have to change the geo-targeting in GWT so it does not target a specific country? (discarding the option of adding sub-domain to GWT) 2. If I remove the GWT geo-targeting would it affect my rankings in the US? Thank you
Technical SEO | | andresgmontero0