4xx fix
-
Hi
I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now return 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?!
Thanks
Gavin -
OK, thanks Dean. I'll update the sitemap and look into rectifying the errors identified by Screaming Frog.
Thanks for your assistance!
-
No, I would recommend that you fix the underlying issue. I can see from your sitemap that you still have the URLs with commas in them.
Personally, I would use screamingfrog.co.uk to find your crawl errors, as you will not need to wait a week for the next report.
-
I was waiting for the next crawl as I thought the 4xx errors would be removed from the crawl diagnostics; however, I received a new crawl report today and they are still listed in it.
I think the simplest way to remove the 4xx errors would be to create 301s for the URLs. Would you agree? -
So, since you tidied the URLs, has Moz crawled your site again, or are you waiting for the next crawl?
-
Where are you seeing the errors reported? If you have corrected the problem with the error URLs and there are no links to these URLs, then there should not be a problem.
If, however, you are seeing these URLs in the search results, then yes, a 301 redirect would be appropriate.
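If it does come to redirecting, the rules can be very simple. A rough sketch, assuming the site runs on Apache and redirects are managed in .htaccess; the URLs below are made-up examples, not your actual pages:
# One explicit 301 per cleaned-up URL is the safest approach
Redirect 301 /widgets/big,red,widget /widgets/big-red-widget
Redirect 301 /widgets/small,blue,widget /widgets/small-blue-widget
# Or a pattern rule that swaps one comma for a hyphen per request
# (a URL with several commas will go through a short chain of redirects)
RewriteEngine On
RewriteRule ^(.*),(.*)$ /$1-$2 [R=301,L]
Either way, this is only worth doing for URLs that still appear in the search results or receive traffic; old URLs with no links pointing at them can simply be left to return 404.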
Related Questions
-
How to fix Google index filled with redundant parameters
Hi All. This follows on from a previous question (http://moz.com/community/q/how-to-fix-google-index-after-fixing-site-infected-with-malware) that, on further investigation, has become a much broader problem. I think this is an issue that may plague many sites following upgrades from CMS systems.
First, a little history. A new customer wanted to improve their site ranking and SEO. We discovered the site was running an old version of Joomla and had been hacked. URLs such as http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate redirected users to other sites, and the site was ranking for "buy adobe" or "buy microsoft". There was no notification in Webmaster Tools that the site had been hacked. So an upgrade to a later version of Joomla was required, and we implemented SEF URLs at the same time. This fixed the hacking problem; we now had SEF URLs, fixed a lot of duplicate content, and added new titles and descriptions.
The problem is that after a couple of months things aren't really improving. The site is still ranking for adobe and microsoft and a lot of other rubbish, and URLs like http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate are still sending visitors, but to the home page, as are a lot of the old redundant URLs with parameters in them. I think it is default behavior for a lot of CMS systems to ignore parameters they don't recognise, so http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate displays the home page and gives a 200 response code.
My theory is that Google isn't removing these pages from the index because it's getting a 200 response code from the old URLs, and it is possibly penalizing the site for duplicate content (which doesn't show up in Moz because there aren't any links on the site to these URLs). The index in Webmaster Tools shows over 1,000 URLs indexed when there are only around 300 actual URLs. It also shows thousands of URLs for each parameter type, most of which aren't used.
So my question is how to fix this. I don't think 404s or similar are the answer, because there are so many and trying to find each combination of parameters would be impossible. Webmaster Tools advises not to make changes to parameters, but even so, I don't think resetting or editing them individually is going to remove them; it would only change how Google indexes them (if anyone knows different, please let me know). I appreciate any assistance and also any comments or discussion on this matter. Regards, Ian
Technical SEO | iragless -
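One way to deal with indexed parameter URLs like these, sketched on the assumption that the site runs on Apache and that the only remaining index.php requests carrying the old vc parameter are the legacy spam URLs (the parameter name is taken from the example URL in the question):
RewriteEngine On
# Legacy index.php URLs carrying a vc parameter return 410 Gone instead of a
# 200 home page, which signals to Google that the pages were removed deliberately
RewriteCond %{QUERY_STRING} (^|&)vc= [NC]
RewriteRule ^index\.php$ - [G,L]
A pattern like this covers every combination of values for that parameter, so there is no need to list each old URL individually; the same condition can be repeated for any other legacy parameter names.
-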
How do you fix soft 404 errors?
I'm getting soft 404 errors for a gallery on my site (http://chrle.us/MBKs), yet when I visit the URL http://www.chrisboar.com/html/galleries/114/portfolio/weddings-1/0 the image/URL is there. Not sure how to fix this. Note: we do redirect the site to the Flash site and only use those URLs for SEO purposes. Maybe that is what is causing it. Thanks
Technical SEO | callmeed -
How do I fix a 301 Redirect Loop?
On Saturday I was correcting some duplicate titles, including nofollowing tags, etc. (my main problem was duplicate titles due to tags and categories being indexed). Now this morning I see that one of my pages refuses to load, citing a 301 redirect loop: http://www.incredibleinfant.com/feeding/switching-baby-formula/ Originally, the page was posted under the wrong category: http://www.incredibleinfant.com/uncategorized/switching-baby-formula I resaved it under the correct category (feeding) and now it won't load. Can someone help me figure out how to correct this mess? Thanks so much Heather
Technical SEO | Gotmoxie -
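A redirect loop almost always means two rules are pointing at each other, so the first step is to find whatever is sending the /feeding/ URL back to the old /uncategorized/ location and remove it. As a rough sketch, assuming the redirects live in .htaccess rather than in a plugin, only one rule should remain:
# Keep the redirect from the old slug to the new one, and make sure no other
# rule or plugin setting redirects the /feeding/ URL back the other way
Redirect 301 /uncategorized/switching-baby-formula /feeding/switching-baby-formula/
-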
How can we fix duplicate title tags like these being reported in GWT?
Hi all, I posted this in the GWT Forum on Monday and still no answers, so I will try here. Our URL is http://www.ccisolutions.com. We have over 200 pages on our site being flagged by GWT as having duplicate title tags. The majority of them look similar to this: Title: JBL EON MusicMix 16 | Mixer | CCI Solutions. GWT is reporting these URLs as all having the same title:
/StoreFront/product/R-JBL-MUSICMIX.prod
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Category
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Footer
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Header
/StoreFront/product/R-JBL-MUSICMIX.prod?origin=..
/StoreFront/product/R-JBL-MUSICMIX.prod?origin=GoogleBase
These are all the same page. There was a time when we used these origin codes, but we stopped using them over a year ago. We also added canonical tags to every page to prevent duplicate content issues. However, these origin codes are still showing up in GWT. Is there anything we can do to fix this problem? Do we have a technical issue with our site code and the way Google is seeing our dynamic URLs? Any suggestions on how we can fix this problem? The same is true in our report for meta descriptions. Thank you,
Dana Tan
Technical SEO | danatanseo -
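Since the canonical tags alone have not cleared these out, one option is to 301 every Origin-tagged URL back to the clean product URL. A rough sketch, assuming the storefront runs on Apache and that the Origin parameter no longer carries anything the pages actually need:
RewriteEngine On
# Any request with an Origin/origin query parameter is redirected to the same
# path with the query string stripped (the trailing "?" drops it)
RewriteCond %{QUERY_STRING} (^|&)origin= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
-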
Google doesn't rank the best page of our content for keywords. How to fix that?
Hello, We have a strange issue, which I think is due to legacy. Generally, we are a job board for students in France: http://jobetudiant.net (jobetudiant == studentjob in French). We rank quite well (2nd or 3rd) on "Job etudiant [city]", with the right page (the one that lists all job offers in that city). So this is great. Now, for some reason, Google systematically puts another of our pages in front of that: the page that lists the job offers in the 'region' of that city. For example, check this page. The first link is a competitor, the 3rd is the "right" link (the job offers in Annecy), but the 2nd link is the list of jobs in Haute Savoie (which is the 'departement', equiv. to county) in which Annecy is... that's annoying. Is there a way to indicate to Google that the 3rd page makes more sense for this search? Thanks
Technical SEO | jgenesto -
Found a Typo in URL, what's the best practice to fix it?
WordPress 3.4, Yoast, Multisite. The URL is supposed to be "www.myexample.com/great-site" but I just found that it's "www.myexample.com/gre-atsite". It is a relatively new site, but we have already pointed several internal links to "www.myexample.com/gre-atsite". What's the best practice to correct this? Which option is more desirable?
1. Creating a new page. I found that Yoast has a "301 redirect" option in the Advanced tab. Can I just create a new page (an exact copy), put noindex, nofollow on it, and redirect it to http://www.myexample.com/great-site? OR
2. An .htaccess redirect rule. Simply change the URL to http://www.myexample.com/great-site, update it, and add:
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^http://www.myexample.com/gre-atsite$ [NC]
RewriteRule ^(.*)$ http://www.myexample.com/great-site$1 [R=301,L]
Technical SEO | joony2008 -
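Note that the RewriteCond in option 2 tests %{HTTP_HOST}, which only ever contains the hostname, so that condition would never match a path and the redirect would never fire. A rough sketch of a path-based version, assuming Apache and the placeholder domain from the question, typically placed above the standard WordPress rewrite block in .htaccess:
RewriteEngine On
# 301 only the misspelled slug to the corrected one
RewriteRule ^gre-atsite/?$ /great-site/ [R=301,L]
Updating the internal links to point at the correct URL is still worth doing either way.
-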
4XX Errors - Adding %5c%5c to Links
Hi all 😃 Hope someone can help me with this. The internal links on my hubby's business site occasionally break and add %5c%5c%5c endlessly to the end of the URL, like this: site.com/about/hours-of-operation/\\\\\\\\% I cannot for the life of me figure out why it is doing this, and while it has happened to me from time to time, I can't recreate it. My crawl diagnostics here in my SEOmoz campaign show 19-20 URLs doing this - it's nuts. Any insight? Thank you!! Jennifer ~PotPieGirl
Technical SEO | potpiegirl -
4xx Client Error
I have 2 pages showing as errors in my Crawl Diagnostics, but I have no idea where these pages have come from; they don't exist on my site. I have done a site-wide search for them and they don't appear to be referenced or linked to from anywhere on my site, so where is SEOmoz pulling this info from? The two links are: http://www.adgenerator.co.uk/acessibility.asp http://www.adgenerator.co.uk/reseller-application.asp The first link has a spelling mistake and the second link should have an "s" on the end of "application".
Technical SEO | IPIM