404 Error Complications
-
Hello Moz World!
I am receiving a 404 error on one of my webpages. When I type the URL directly into my address bar I get a 404. However, when I navigate to the same page from a link on my website I don't get an error; the page loads with no issues, and the address shown in the URL bar is the same one that returns the 404, e.g. www.mywebsite.com/services.
Does anyone know how I should go about troubleshooting this issue? Any suggestions on how I can resolve it? I would think that if the link isn't broken when I'm directed to it from the website, it shouldn't be broken when I enter the URL directly into the address bar. Right?
Any info/advice is appreciated.
B/R
Will
-
AHA! I had a feeling it was something like that. Glad to help!
-
Hello Logan,
Thank you for the response. The URI Valet Tool was helpful, and I found my error. In every instance except one I had www.mywebsite.com/My-Awesome-Services.html, and in one instance www.mywebsite.com/my-awesome-services.html.
I did not realize how important consistency is when it comes to capitalization in a website's URLs. Lesson learned. Thanks again, Logan!
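For anyone else who runs into this, one way to keep casing consistent going forward is to 301-redirect any mixed-case path to its lowercase version, so a stray capitalized link can never 404. A rough sketch of the idea, assuming a Python/Flask app and the hypothetical URLs above; the same thing can be done with Apache rewrite rules or your CMS's redirect settings:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_lowercase_path():
    """301-redirect any path containing uppercase letters to its lowercase form,
    so /My-Awesome-Services.html and /my-awesome-services.html resolve to one URL."""
    if request.path != request.path.lower():
        query = request.query_string.decode()
        target = request.path.lower() + (f"?{query}" if query else "")
        return redirect(target, code=301)

@app.route("/my-awesome-services.html")
def services():
    return "My Awesome Services"
```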
B/R
Will H.
-
Hi Will,
It's hard to say without seeing exactly what the format of the broken and working URLs is. When you click the link and get the working URL, do you copy that exact URL when entering it directly, or are you typing it in by hand?
URI Valet might help you troubleshoot this further. Or, if you want to DM me the URL, I can take a look and see if I can identify the problem.
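If you want to check from a script as well, something like this prints each redirect hop and the final status code for a URL, which makes a case or trailing-slash mismatch obvious. A minimal sketch, assuming Python with the requests library installed; the mywebsite.com URLs are placeholders:

```python
import requests

def check_url(url):
    """Print each redirect hop and the final status code for a URL."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)

# Compare the exact URL copied from the working link with the one typed by hand
check_url("http://www.mywebsite.com/services")
check_url("http://www.mywebsite.com/Services")
```

Running it on both the clicked and the typed version of the URL should show exactly where the two diverge.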
Related Questions
-
News Errors In Google Search Console
Years ago a site I'm working on was publishing news as one form of content on the site. Since then, it has stopped publishing news, but it still has a Q&A forum, blogs, articles... all kinds of stuff. Now it triggers "News Errors" in GWT under crawl errors. These errors are "Article disproportionately short", "Article fragmented" on some Q&A forum pages, "Article too long" on some longer Q&A forum pages, and "No sentences found". Since there are thousands of these forum pages and the problem seems to be a news-specific critique, I'm wondering what I should do about it. It seems to be holding these non-news pages to a news standard: https://support.google.com/news/publisher/answer/40787?hl=en For instance, is there a way, and would it be a good idea, to get the hell out of Google News, since we don't publish news anymore? Would there be possible negatives worth considering? What's baffling is that these are not designated news URLs. The ones we used to have were /news/title-of-the-story per https://support.google.com/news/publisher/answer/2481373?hl=en&ref_topic=2481296 Or does this really not matter and should I just blow it off as a problem? The weird thing is that we recently went from http to https and the Google News interface still has us as http and gives the option to add https, which I am reluctant to do since we aren't really in the news business anymore. What do you think I should do? Thanks!
Intermediate & Advanced SEO -
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk Looking into it, it seems to be giving a 503 error to Googlebot. I can see the site fine, I have checked the source code, and I have checked robots. It did have a sitemap param, but I removed it for testing. GWMT shows 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance
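One quick check for this kind of thing is to request the page once with a normal browser user agent and once identifying as Googlebot; if only the Googlebot request comes back 503, the block is tied to the user agent (it could also be IP-based or a firewall/hosting rule, in which case this won't catch it). A rough sketch, assuming Python with the requests library installed:

```python
import requests

URL = "http://www.thethreehorseshoespub.co.uk/"

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, user_agent in USER_AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    # If only the Googlebot request returns 503, the block is user-agent based
    print(label, response.status_code)
```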
Intermediate & Advanced SEO -
How to fix Invalid Product Page registering as Soft 404
Somehow, with our site architecture, Google is crawling URLs for products we no longer carry (there are no links to those pages, so I am still trying to figure out how Google is finding them). Those URLs are being redirected to our invalid product page. That invalid product page is returning a 200 OK code, but according to Google it should be a 404, so we get a soft 404 error. Google is seeing all of the URLs that redirect to that page as soft 404s as well. The first solution I can think of is to create a custom 404 page that looks just like our site, says we don't have the page/product they are looking for, has a search bar, sends a 404 code, etc. Is this the right way to go? And since it will probably take some time to implement, is there a quick fix we could do first?
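The custom 404 page idea is the usual fix: a friendly "we don't carry this product" template is fine, it just has to be returned with a real 404 status code instead of 200. A minimal sketch of that idea, assuming a Python/Flask app with hypothetical product slugs and template names:

```python
from flask import Flask, render_template

app = Flask(__name__)

PRODUCTS = {"blue-widget": "Blue Widget"}  # hypothetical catalog lookup

@app.route("/products/<slug>")
def product(slug):
    if slug not in PRODUCTS:
        # Render the friendly "we no longer carry this" template,
        # but send it with a real 404 status code instead of 200
        return render_template("product_not_found.html"), 404
    return render_template("product.html", name=PRODUCTS[slug])
```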
Intermediate & Advanced SEO -
Www vs. non-www differences in crawl errors in Webmaster tools...
Hey All, I have been working on an eCommerce site for a while that, to no avail, continues to make me want to hang myself. To make things worse, the developers just do not understand SEO, and it seems every change they make just messes up work we've already done. Job security, I guess. Anywho, most recently we realized they had some major sitemap issues, as almost 3000 pages were submitted but only 20 or so were indexed. Well, they updated the sitemap, and although all the pages are now properly indexing, I have 5000+ "not found" crawl errors in the non-www version of WMT and almost none in the www version of the WMT account. Anyone have insight as to why this would be?
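Hard to say without seeing the site, but one common contributor is the non-www hostname resolving on its own instead of 301-redirecting to www, so Google builds a separate (and stale) picture of that host. Consolidating everything onto one hostname usually cleans this up over time. A rough sketch of a host-level 301 at the application layer, assuming a Python/Flask app and a hypothetical hostname; in practice this usually lives in the web server or CDN config instead:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.example-store.com"  # hypothetical canonical hostname

@app.before_request
def force_canonical_host():
    """301-redirect requests on any other hostname (e.g. the bare domain) to the www host."""
    if request.host != CANONICAL_HOST:
        path = request.full_path.rstrip("?")  # keep the path and any query string
        return redirect(f"https://{CANONICAL_HOST}{path}", code=301)
```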
Intermediate & Advanced SEO -
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. What happened is he had a major issue with broken dynamic URLs that would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of useless pages to be indexed. These useless pages were all bringing up a 404 page that didn't send a 404 server response (it sent a 200 response), which created a ton of duplicate content and bad links (relative linking). Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors, which is a great start. I fixed all site errors that caused infinite redirects, cleaned up the sitemap, and submitted it. When I search site:www.(domainname).com I am still getting an insane number of pages that no longer exist. My question: how does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over it. It's a slow process getting Google to remove these pages that are returning a 404, and he is still dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google just index the pages in the sitemap.
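There's no way to force Google to drop tens of thousands of URLs at once, but one option some people use to speed things up is returning 410 (Gone) instead of 404 for the known-dead URL patterns, since 410 is generally treated as a more definitive removal signal. A rough sketch, assuming a Python/Flask app and a hypothetical path pattern shared by the junk URLs:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical pattern shared by the junk dynamic URLs that should disappear
@app.route("/old-dynamic/<path:anything>")
def gone(anything):
    # 410 tells crawlers the page is gone on purpose, not just temporarily missing
    abort(410)
```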
Intermediate & Advanced SEO -
Access denied errors in webmaster tools
I noticed today that I have 2 access denied errors. I checked the help, which says: Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.) Therefore I think it may be because I have added a login page for users and Googlebot can't access it. I'm using WordPress and presume I need to amend the robots.txt to remove the requirement for Google to log in, but how do I do that? Unless I'm misunderstanding the problem altogether!
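One thing worth noting: robots.txt can't log Googlebot in; it can only tell crawlers not to request the login-gated URLs at all, which makes the access denied errors go away at the cost of those URLs not being crawled. If that trade-off is acceptable, a Disallow rule for the members area is all it takes. A small sketch, assuming Python's standard urllib.robotparser and hypothetical paths, to sanity-check what a proposed robots.txt would block:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the login-gated area from being crawled.
# Note: this stops Googlebot from requesting those URLs; it cannot "log it in".
ROBOTS_TXT = """\
User-agent: *
Disallow: /members/
Disallow: /wp-login.php
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for path in ["/members/premium-guide/", "/blog/a-public-post/"]:
    allowed = parser.can_fetch("Googlebot", "http://example.com" + path)
    print(path, "allowed" if allowed else "blocked")
```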
Intermediate & Advanced SEO -
Crawl errors in GWT!
I have been seeing a large number of access denied and not found crawl errors. I have since fixed the issues causing these errors; however, I am still seeing them in Webmaster Tools. At first I thought the data was outdated, but the data is tracked on a daily basis! Does anyone have experience with this? Does GWT really re-crawl all those pages/links every day to see if the errors still exist? Thanks in advance for any help/advice.
Intermediate & Advanced SEO -
404'd pages still in index
I recently launched a site and shortly after performed a URL rewrite (not the greatest idea, I know). The developer 404'd the old pages instead of setting up permanent 301 redirects, which caused a mess in the index. I have tried to use Google's removal tool to remove these URLs from the index. The pages were being removed, but now I am finding them in the index as bare URLs pointing to the 404'd page (i.e. no title tag or meta description). Should I wait this out, or go back now and 301 redirect the old URLs (that are 404'd now) to the new URLs? I am sure this is the reason for my lack of ranking, as the rest of my site is pretty well optimized and I have some quality links.
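Going back and 301-redirecting the old URLs to their new equivalents is generally the better move than waiting, since it consolidates the old pages' signals onto the new URLs and gives Google a clear instruction instead of a dead end. A rough sketch of a simple redirect map, assuming a Python/Flask app and hypothetical old/new paths:

```python
from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Hypothetical mapping from the old (now-404) URLs to their new equivalents
REDIRECT_MAP = {
    "/old-services.html": "/services/",
    "/old-about.html": "/about/",
}

@app.route("/<path:legacy_path>")
def legacy(legacy_path):
    target = REDIRECT_MAP.get(request.path)
    if target:
        return redirect(target, code=301)  # permanent redirect carries the old page's signals
    abort(404)  # anything unmapped stays a genuine 404
```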
Intermediate & Advanced SEO