Webmaster Tools 404 Errors for Pages That Were Never Created
-
Recently, 196 404 errors appeared in my WMT account for pages that were never created on my site.
Question: Any thoughts on how they got there (e.g. a WMT bug, or a tactic by a competitor)?
Question: Thoughts on the impact, if any?
Question: Thoughts on how to resolve them?
-
Run an SEOmoz campaign and look there for the referring URLs?
-
Thanks for the responses. Unfortunately, I can't see the referring URL.
-
Most of the time in GWT you can see the page the referring link is coming from. The crawl errors in GWT are based on links pointing to your site, so if people link to you with an invalid URL, it will show up as a 404 error.
In my experience, when this happens en masse, some spammy site goes a little nuts and tries to link to a bunch of your pages but goofs something up so the URLs 404, such as putting an extra space at the end of the URL in the href, which renders as %20.
If it's a common mistake, you might consider adding a rewrite rule to 301 redirect those 404 URLs to valid pages on your site. You could also contact the webmaster of that site and ask them to fix the invalid URL or URLs (especially if it would be a valuable link). Lastly, you can do nothing and ignore the errors.
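To make that concrete, here is a minimal sketch of such a rewrite rule, assuming an Apache server with mod_rewrite enabled; the exact pattern depends on how the broken inbound links actually look, and this version only handles the trailing-space/%20 case described above.

```apache
# Sketch only: .htaccess rule assuming Apache + mod_rewrite.
# A stray space at the end of the linking site's href arrives as %20 and is
# decoded to a literal space, so strip trailing whitespace and 301 to the clean URL.
RewriteEngine On
RewriteRule ^(.*?)\s+$ /$1 [R=301,L]
```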
-
You should be able to see the referring URL for the 404 pages in your GWT account. If someone links to you and makes a typo at the end of the URL, or puts your URL in parentheses and the final ) gets hyperlinked, that can cause a 404.
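If the stray closing parenthesis turns out to be the culprit, a similar hedged sketch (again assuming Apache with mod_rewrite; adjust the pattern to your own broken URLs) would 301 the mangled URL back to the real one:

```apache
# Sketch only: redirect any URL that ends in ")" to the same URL without it
RewriteEngine On
RewriteRule ^(.*)\)$ /$1 [R=301,L]
```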
Related Questions
-
Links in Webmaster Tools that aren't really linking to us
I've noticed that there is a domain in WMT that Google says is linking to our domain from 173 different pages, but it actually isn't linking to us at all on ANY of those pages. The site is a business directory that seems to be automatically scraping business listings and adding them to hundreds of different categories. Low quality crap that I've disavowed just in case. I have hand-checked a bunch of the pages that WMT is reporting with links to us by viewing source, but there are no links to us. I've also used crawlers to check for links, but they turn up nothing. The pages do, however, mention our brand name. I find it very odd that Google would report links to our site when there aren't actually any links to our site. Has anyone else ever noticed something like this?
Technical SEO | Philip-DiPatrizio
-
Duplicate Page Title Error passing a php variable
Hi, I've searched and read about this but I can't get my head around it and could really do with some help. I have a lot of contact buttons which all lead to the same enquiry form, and depending on where the visitor has come from, it pre-fills the enquiry field on the contact form. For example, if you are on the airport transfers page it will carry the value so it's pre-filled in (.php?prt=Airport Transfers). The problem is it's coming up as a duplicate page even though it's really just the one page. I have this problem with quite a few sites and really need to combat this issue. Any help would be very much appreciated. airport-transfers.php
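One common way to handle this kind of parameter-only duplication, offered here as a sketch (the form's filename isn't given in the question, so contact.php and the domain are placeholders), is to have the enquiry page declare a single canonical URL no matter what prt value it was opened with:

```html
<!-- Sketch: goes in the <head> of the enquiry form page (filename and domain are hypothetical).
     Variants like contact.php?prt=Airport%20Transfers then all point search engines
     at one canonical version instead of being treated as separate pages. -->
<link rel="canonical" href="https://www.example.com/contact.php" />
```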
Technical SEO | i7Creative
-
Is creating backlinks to Google places pages worth the time and money involved?
I have worked on a website and organically it is starting to do fine. The website itself is on the right track. Now, the Places page could use a little improvement. I did make sure it has the right categories, has all unique pictures and videos, it has a good amount of reviews and even citations from other local directories, and the website links to it. It does show up for some local searches but I would like it to dominate more. I've heard that if I build links to that Google Places page from other sources, it will rank higher and perform better. Is that true? Any other tips and tricks to make it perform better? Thank you
Technical SEO | Boogily
-
How to remove crawl errors in Google Webmaster Tools
In my Webmaster Tools account it says that I have almost 8,000 crawl errors, most of which are HTTP 403 errors. The URLs are http://legendzelda.net/forums/index.php?app=members&section=friends&module=profile&do=remove&member_id=224 and http://legendzelda.net/forums/index.php?app=core&module=attach&section=attach&attach_rel_module=post&attach_id=166 and similar URLs. I recently blocked crawl access to my members folder to remove duplicate errors, but I'm not sure how I can block access to these kinds of URLs since it's not really a folder. Any ideas on how to do this?
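One hedged option, sketched from the URL patterns in the question, is to disallow those parameterized paths in robots.txt. Note that this only stops them being crawled going forward; it doesn't instantly erase errors already reported.

```
# robots.txt sketch based on the example URLs above
User-agent: *
# profile/friend-management URLs that return 403 to crawlers
Disallow: /forums/index.php?app=members
# attachment-handler URLs
Disallow: /forums/index.php?app=core&module=attach
```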
Technical SEO | NoahGlaser78
Duplicate page content errors in SEOmoz
Hi everyone, we just launched this new site and I just ran it through SEOmoz and I got a bunch of duplicate page content errors. Here's one example -- it says these 3 are duplicate content: http://www.alicealan.com/collection/alexa-black-3inch http://www.alicealan.com/collection/alexa-camel-3inch http://www.alicealan.com/collection/alexa-gray-3inch You'll see from the pages that the titles, images and small pieces of the copy are all unique -- but there is some copy that is the same (after all, these are pretty much the same shoe, just a different color). So, why am I getting this error, and is there a best way to address it? Thanks so much!
Ketan
Technical SEO | ketanmv
-
All of my incoming links to my site are gone in Webmaster Tools!?
I just checked webmaster tools and noticed that all of the links I have acquired over the last few months are gone except for 1 website. Did something change just recently? Is this a glitch? http://www.petmedicalcenter.com Thanks in advance for your help! Brant
Technical SEO | PMC-312087
-
Why are my pages getting duplicate content errors?
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page: http://www.mapsalive.com/Features/audio.aspx and http://www.mapsalive.com/Features/Audio.aspx The only difference is the capitalization. We don't have two versions of the page, so I don't understand what I'm missing or how to correct this. Does anyone have any thoughts on what to look for?
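Since the only difference is capitalization, one possible fix, sketched here on the assumption that the site runs ASP.NET on IIS with the URL Rewrite module installed (the .aspx extension suggests that, but the question doesn't confirm it), is to 301 any URL containing uppercase letters to its lowercase form so only one casing gets crawled:

```xml
<!-- web.config sketch: assumes IIS with the URL Rewrite module available -->
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect paths containing an uppercase letter to their lowercase form,
           e.g. /Features/Audio.aspx 301s to /features/audio.aspx -->
      <rule name="LowercaseUrls" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```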
Technical SEO | jkenyon