External 404 vs Internal 404
-
Which one is bad?
External - when someone adds an incorrect link to your site, e.g. makes a typo when linking to an inner page. The page never existed on your site, but Google still reports it as a 404 in Webmaster Tools.
Internal - a page existed, Google indexed it, and then you deleted it without adding a 301.
Internal 404s are within the webmaster's control, and I can understand Google getting upset when it sees a 404 for a URL that existed before. However, surely an "externally created" 404 shouldn't cause any harm, because that page never existed; someone simply inserted an incorrect link to your site.
-
What exactly do you mean by "bad"?
They are both undesired. You certainly won't be penalized for having a 404 specifically, but you may see a decrease in ranking if it affects your site structure or if the pages you lost were passing along page authority to other pages.
You should do everything you can to limit both. Both are equally bad.
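For the internal case, the practical fix is a 301 from each deleted URL to its closest live replacement, so any accumulated authority is passed along. A minimal sketch of such a redirect map (the URLs and the `resolve` helper are hypothetical, purely to illustrate the idea):

```python
# Hypothetical map of deleted URLs to their closest replacements.
REDIRECTS = {
    "/old-product": "/new-product",
    "/2011/summer-sale": "/promotions",
}

def resolve(path):
    """Return (status, location) for a request path.

    Deleted pages with a known replacement get a 301 so link equity
    is passed along; unknown paths fall through to a plain 404.
    """
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```

The same lookup could live in an .htaccess file or your CMS's redirect plugin; the point is simply that every indexed-then-deleted URL should resolve to 301 rather than 404.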
Let me know if I understood the question correctly. Cheers!
Related Questions
-
404s and Ecommerce - Products no longer for sale
Hi, we regularly have products which are no longer sold and have been discontinued. As we have such a large site, Webmaster Tools regularly picks up new 404s. These 404 pages aren't linked to from anywhere on the site any longer; however, WMT will still report them as errors. Does this affect site authority? Thank you
Intermediate & Advanced SEO | BeckyKey -
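For what it's worth, a common pattern for discontinued products is to 301 to the closest category page when one exists and return a 410 (Gone) otherwise, since a 410 signals a deliberate, permanent removal and tends to drop out of error reports faster than a plain 404. A rough sketch, with hypothetical product records and a made-up helper name:

```python
def response_for_discontinued(product, category_urls):
    """Pick a response for a product page that is no longer sold.

    If a relevant category exists, 301 there to keep link equity;
    otherwise return 410 to tell crawlers the removal is permanent.
    `product` is a hypothetical dict from the store's catalog.
    """
    if product.get("discontinued"):
        target = category_urls.get(product.get("category"))
        if target:
            return 301, target
        return 410, None
    return 200, product["url"]
```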
Redirect wordpress.com and internal link ?
Hi Moz fans, First of all, thanks to everyone who answered my previous post. Today I have another question, similar to that one. Our blog uses WordPress.org as its CMS, so it is easy to redirect to the new site by changing the site URL setting. However, each of our blog articles also contains anchor-text internal links pointing to other blog posts, which means those links will be automatically redirected to the new URLs by our 301s. Once we tell Google about the move through Webmaster Tools and Googlebot re-crawls the site, what will happen when it crawls those internal links again? Do we need to change those links as well, keep them the same and rely on the 301 redirects, or will nothing happen?
Intermediate & Advanced SEO | ASKHANUMANTHAILAND -
Site wide external links analysis tool?
Hi guys, I just got a link-removal email from someone asking us to remove their link. What website or tool is best for seeing ALL of the external links on your website, site-wide? And as a bonus, with columns for "nofollow" and "follow". Thank you!
Intermediate & Advanced SEO | Shawn124 -
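In the absence of a dedicated tool, a site-wide crawl plus a small parser can produce exactly those two columns. A rough sketch of the per-page extraction step using only the standard library (the class name and sample HTML are made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkParser(HTMLParser):
    """Collect (href, followed) pairs for links pointing off-site."""

    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        # Only keep links whose host differs from our own site.
        if host and host != self.site_host:
            rel = attrs.get("rel") or ""
            self.links.append((href, "nofollow" not in rel))

# Hypothetical page snippet: one followed external link,
# one nofollowed external link, one internal link.
html = ('<a href="https://example.org/a">x</a>'
        '<a rel="nofollow" href="https://example.net/b">y</a>'
        '<a href="/internal">z</a>')
parser = ExternalLinkParser("example.com")
parser.feed(html)
```

Run that over every crawled page and you have a site-wide external-link report with a followed/nofollowed flag per URL.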
Internal Search Results Appear in Google SERPS
My friend is running an ecommerce store selling apparel. How can we make internal search results appear in Google SERPs and rank them? For example, the query is "peplum dress". You type the query into the internal search box and it returns a set of results; in this case, a product listing. How can we optimize it so it ranks and appears in the Google SERPs? Do we do it the traditional way, in terms of links? Say the URL is: http://www.asos.com/search/peplum-top?q=peplum+top&r=2 And we build links to it? Some of you may ask why we don't create a dedicated page for this; the reason is that we'd have too many categories if we created one for each query. Thoughts?
Intermediate & Advanced SEO | WayneRooney -
Can internal links from a blog harm the ranking of a page?
Here is the situation: A site was moved from its original domain to its new domain, and at the same time the external wordpress.com blog was moved to a subdirectory, making it an onsite blog. The two pages that rank the highest on the site have virtually no links from the blog and no external links, while all the other pages are linked extensively from the blog and have backlinks. Their targeted keywords are not so much easier to rank for that this could be the sole cause. To confuse the matter even more, there was a manual penalty affecting incoming links, which was removed last month. The old site, which has many backlinks to the new site, is still in Google's index. The old blog, however, has been redirected page by page and is not in Google's index. Most of the blog posts are short, one-paragraph company updates and are potentially considered low-quality content because of that (?). The common denominator among the two highest-ranked pages (I'm talking top 3 in the SERP vs. page 3 or 4) seems to be either the lack of external backlinks or the lack of internal links from the blog. Could there be an issue with the blog such that internal links from it are detrimental rather than helpful?
Intermediate & Advanced SEO | kimmiedawn -
How should I go about repairing 400,000 404 error pages?
My thinking is to make a list of the most-linked-to and most-trafficked error pages and just redirect those, but I don't know how to get all that data, because I can't even download all the error pages from Webmaster Tools. And even then, how would I get backlink data except by checking each link manually? Are there any detailed step-by-step instructions on this that I missed in my Googling? Thanks for reading!!
Intermediate & Advanced SEO | DA2013 -
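One low-effort way to build the "most trafficked" half of that list is to count 404 hits straight from your server access logs instead of Webmaster Tools. A sketch, assuming combined-log-format lines (the function name is made up):

```python
import re
from collections import Counter

# Matches the request path in a combined-format log line
# whose response status was 404.
LOG_404 = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" 404 ')

def top_404s(log_lines, limit=10):
    """Count 404 hits per path and return the most-trafficked ones,
    i.e. the first candidates to 301."""
    counts = Counter()
    for line in log_lines:
        m = LOG_404.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(limit)
```

Redirect the top of that list first; with 400,000 errors, a handful of patterns (old category structures, dropped file extensions) usually account for most of the traffic and can be handled with pattern-based rewrite rules rather than one redirect per URL.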
Is there a tool that lists all external followed URLs?
Is there a tool that lists all external followed URLs? Or maybe one that separates nofollowed and followed external URLs?
Intermediate & Advanced SEO | MangoMan16 -
Geo-targeted homepage for users vs crawlers
Hello there! This is my first post here on SEOmoz. I'll get right into it then... My website is housingblock.com, and the homepage runs entirely off geo-targeting the user's IP address to immediately display the most relevant results to them, potentially saving them a search or three. That works great. However, when crawlers visit the site, they are obviously geo-targeted by their IP address, too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed on the homepage (Mountain View, CA and Clearwater, MI are a couple). This poses an issue because I'm worried that crawlers will not be able to properly index the homepage, since the location, and ultimately all the content, keeps changing. And/or we will be indexed for a specific location when we are in fact a national website (I do not want my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]). Of course, my initial instinct is to create a separate landing page for the crawlers, but for obvious reasons I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go long-term). Any ideas on the best way to approach this while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do: give our users the most relevant content in the least amount of time? It seems that in doing so, I am improperly ranking my website in the eyes of the search engines. Thanks everybody! Marc
Intermediate & Advanced SEO | THB
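For reference, one compromise some sites use is to serve known crawlers the neutral national homepage while regular visitors keep the GeoIP-based view; whether that counts as acceptable differentiation or borderline cloaking is worth researching for your own situation. A hypothetical sketch (the function name and token list are illustrative only, and `geoip_region` is assumed to come from a GeoIP lookup upstream):

```python
# Substrings found in the user agents of major crawlers (illustrative list).
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def homepage_region(user_agent, geoip_region):
    """Pick which region's listings the homepage shows.

    Known crawlers get the neutral national view, so the indexed
    homepage isn't tied to whichever datacenter IP fetched it;
    everyone else gets their GeoIP-detected region.
    """
    ua = (user_agent or "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return "national"
    return geoip_region or "national"
```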