4xx fix
-
Hi
I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now return 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?!
Thanks
Gavin
-
OK, thanks Dean. I'll update the sitemap and look into rectifying the errors identified by Screaming Frog.
Thanks for your assistance!
-
No, I would recommend that you fix the underlying issue. I can see from your sitemap that you still have the URLs with commas in them.
Personally, I would use Screaming Frog (screamingfrog.co.uk) to find your crawl errors, as you will not need to wait a week for the next report.
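If you don't want to wait for any crawler at all, a short script can do the same check. Below is a minimal sketch in Python; the sitemap URL is a placeholder rather than the real site's, and it assumes the requests library is installed. It flags sitemap entries that still contain commas and reports any URL that answers with a 4xx status:

```python
# Minimal sketch: flag sitemap URLs that still contain commas and
# report any that return a 4xx status. The sitemap URL is a placeholder.
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if "," in url:
        print(f"Comma still in sitemap: {url}")
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if 400 <= status < 500:
        print(f"{status}: {url}")
```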
-
I was waiting for the next crawl, as I thought the 4xx errors would be removed from the crawl diagnostics; however, I received a new crawl report today and they are still listed in the report.
I think the simplest way to remove the 4xx errors would be to create 301s for the URLs. Would you agree?
-
So, since you tidied the URLs, has Moz crawled your site again, or are you waiting for the next crawl?
-
Where are you seeing the errors being reported? If you have corrected the problem with the error URLs and there are no links to these URLs, then there should not be a problem.
If, however, you are seeing these URLs in the search results, then yes, a 301 redirect would be appropriate.
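For what it's worth, the redirects themselves are only a few lines on most stacks. Here is a minimal sketch assuming a Flask app; the old/new URL pairs are invented for illustration, not taken from the site in question:

```python
# Minimal sketch of a 301 layer in Flask. The URL pairs below are
# hypothetical stand-ins for the real comma URLs.
from flask import Flask, redirect, request

app = Flask(__name__)

# Map each old comma-style path to its cleaned replacement.
REDIRECTS = {
    "/plants/red,pink,white-heather": "/plants/red-pink-white-heather",
    "/category/shrubs,trees": "/category/shrubs-trees",
}

@app.before_request
def redirect_old_urls():
    new_path = REDIRECTS.get(request.path)
    if new_path:
        # 301 marks the move as permanent, so search engines transfer
        # the old URL's equity to the new one and drop the 4xx.
        return redirect(new_path, code=301)
```

The same mapping can equally be expressed as Redirect 301 lines in an Apache .htaccess file or as rewrite rules in nginx; the part that matters is the 301 status code.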
Related Questions
-
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up in Not Found errors. Should we just mark them as fixed? Or what is the best way to deal with them?
A parameter was not sent, so the link was read as null/city or null/country instead of cityname/city.
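This kind of bad link usually comes from interpolating a missing value straight into a URL template. A minimal sketch of the guard that prevents it, in Python with hypothetical names:

```python
# Minimal sketch, hypothetical names: only build the link when the
# parameter is present, so a missing value never renders as /null/city.
from typing import Optional

def city_link(city_name: Optional[str]) -> Optional[str]:
    if not city_name:
        return None  # emit no link at all rather than a broken one
    return f"/{city_name}/city"

assert city_link("pune") == "/pune/city"
assert city_link(None) is None
```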
Technical SEO | Lybrate0606
-
What is best practice for fixing URLs that have duplicate content, non-static URLs, and other issues?
Hi, I know there are several good answers regarding duplicate content issues on this website already; however, I have a question that involves the best way to avoid negative SEO impacts if I change the URLs for an ecommerce site. Basically, a new client has the following website, http://www.gardenbeauty.co.uk, and I notice that it suffers from duplicate content due to the http://www version and the non-www version of the pages; this seems quite easy to fix using the guidance on this website. However, I notice that the product page URLs are far from ideal in that they have several issues, including:
(a) they are mostly too long
(b) they don't include the keyword terms (in terms of best practice)
(c) they don't use static URLs
An example of one of these product URLs would be http://www.gardenbeauty.co.uk/plant-details.php?name=Autumn Glory&p_genus=Hebe&code=heagl&category=hebe. I'd like to address these issues, but the pages rank highly for the products themselves, so my question is: what would you recommend I do to fix the URLs without risking the high positions that many of these product pages hold? Thanks, Ben
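As a sketch of the kind of rewrite being described here (the slug scheme is invented for illustration; the real fix would also need 301s from every old URL to its new one), the dynamic query string can be collapsed into a short, keyword-bearing static path:

```python
# Minimal sketch: derive a static, keyword-bearing path from the old
# dynamic URL. The /plants/<category>/<name> scheme is invented.
from urllib.parse import parse_qs, urlsplit

old = ("http://www.gardenbeauty.co.uk/plant-details.php"
       "?name=Autumn Glory&p_genus=Hebe&code=heagl&category=hebe")

params = parse_qs(urlsplit(old).query)
slug = params["name"][0].lower().replace(" ", "-")
print(f"/plants/{params['category'][0]}/{slug}")  # /plants/hebe/autumn-glory
```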
Technical SEO | bendyman
-
Google Fetch and Render - does this fix penalties?
I ran Fetch and Render and came up with two "issues". My specific question is: how likely is it that a link to Quantcast (which blocks access via robots.txt) would really hurt us if Fetch and Render shows it preventing rendering, which it is not actually doing? Thoughts and comments are much appreciated.
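As an aside, it is easy to confirm what a given user agent may fetch under a site's robots.txt. A minimal sketch using only the standard library; the asset path being tested is hypothetical:

```python
# Minimal sketch: ask a site's robots.txt whether a crawler may fetch
# a URL. The path being tested is a hypothetical example.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.quantcast.com/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", "https://www.quantcast.com/some/asset.js"))
```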
Technical SEO | robertdonnell
-
Best Way to Fix Dupe Content
We have some internal pages which we have discovered may be causing a duplicate content problem. Does anyone have a recommendation on the best way to fix this?
Main page: http://bit.ly/ViYqqn
Dupe pages: http://bit.ly/116uzXe
http://bit.ly/WxyyoW
http://bit.ly/TNxPVm
http://bit.ly/VMnbuY
Thanks in advance!
Technical SEO | darkgreenguy
-
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason WMT continues to report them. By last August, we had cleared over 10k 404s on our site, but this lasted only about 2 months and they started coming back. The "linked from" field gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap, and I've confirmed that they're not really being linked to from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks, Juanita
Technical SEO | VoxxiVoxxi
-
How to find and fix 404 and broken links?
Hi, my campaign is showing me many 404 problems, and other tools are also showing me broken links, but the links they show me do work, and I can't seem to find the broken links or the cause of the 404s. Can you help?
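One quick way to hunt these down is to re-check the links yourself rather than trusting the report. A minimal sketch in Python, assuming the requests and beautifulsoup4 libraries and with a placeholder start URL:

```python
# Minimal sketch: list the links on one page and report any that
# return an error status. The start URL is a placeholder.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"

soup = BeautifulSoup(requests.get(START, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    url = urljoin(START, a["href"])
    if not url.startswith("http"):
        continue  # skip mailto:, javascript:, fragments, etc.
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"{status}: {url} (linked from {START})")
```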
Technical SEO | Joseph-Green-SEO
-
What is "canonical." And what do I need to do to fix it?
I'm seeing about 450 warnings on this. What does "Using rel=canonical suggests to search engines which URL should be seen as canonical" mean, and what do I need to do to fix it?
Technical SEO | KimCalvert
-
Why is 4XX (Client Error) shown for valid pages?
My Crawl Diagnostics Summary says I have 5,141 errors of the 4XX (Client Error) variety, yet when I view the list of URLs they all resolve to valid pages. Here is an example:
http://www.ryderfleetproducts.com/ryder/af/ryder/core/content/product/srm/key/ACO 3018/pn/Wiper-Blade-Winter-18-Each/erm/productDetail.do
These pages are all dynamically created from search or browse using a database where we offer 36,000 products. Can someone help me understand why these are errors?
Technical SEO | jimaycock
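One detail worth noticing in the example, offered as an observation rather than a confirmed diagnosis: the path contains an unencoded space in "ACO 3018". A space is not a legal URL character; browsers silently repair it, but crawlers may request it literally and record the failure as a 4xx. A minimal sketch of what the properly encoded form looks like:

```python
# Minimal sketch: spaces must be percent-encoded in URLs. Browsers fix
# this silently; crawlers may not.
from urllib.parse import quote

path = ("/ryder/af/ryder/core/content/product/srm/key/ACO 3018"
        "/pn/Wiper-Blade-Winter-18-Each/erm/productDetail.do")
print(quote(path))  # .../key/ACO%203018/pn/Wiper-Blade-Winter-18-Each/...
```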