4xx fix
-
Hi,
I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now return a 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?!
Thanks,
Gavin -
Where are you seeing the errors being reported? If you have corrected the problem with the error URLs and there are no links to these URLs, then there should not be a problem.
If, however, you are seeing these URLs in the search results, then yes, a 301 redirect would be appropriate.
-
So since you tidied the URLs, has Moz crawled your site again, or are you waiting for the next crawl?
-
I was waiting for the next crawl as I thought the 4xx errors would be removed from the crawl diagnostics; however, I received a new crawl report today and they are still listed in the report.
I think the simplest way to remove the 4xx errors would be to create 301s for the URLs. Would you agree? -
No, I would recommend that you fix the underlying issue. I can see from your sitemap that you still have the URLs with commas in them.
Personally, I would use screamingfrog.co.uk to find your crawl errors, as you will not need to wait a week for the next report.
-
OK, thanks Dean. I'll update the sitemap and look into rectifying the errors identified by Screaming Frog.
Thanks for your assistance!
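If 301s do turn out to be needed, the mapping from the old comma-laden URLs to the cleaned ones can usually be generated rather than written by hand. Here is a minimal sketch in Python; the cleaning rule (stripping commas and similar punctuation from the path) is an assumption about how the URLs were tidied, and `example.com` is a placeholder:

```python
import re
from urllib.parse import urlsplit, urlunsplit

def clean_path(path: str) -> str:
    """Strip commas and other URL-unfriendly punctuation from a path,
    then collapse any repeated hyphens left behind."""
    cleaned = re.sub(r"[,'()]", "", path)     # drop the problem characters
    cleaned = re.sub(r"-{2,}", "-", cleaned)  # collapse repeated hyphens
    return cleaned

def redirect_map(old_urls):
    """Yield (old, new) pairs for every URL whose cleaned form differs."""
    for old in old_urls:
        parts = urlsplit(old)
        new = urlunsplit(parts._replace(path=clean_path(parts.path)))
        if new != old:
            yield old, new

# Example: emit Apache-style Redirect lines for the changed URLs only.
for old, new in redirect_map(["http://example.com/red,shoes",
                              "http://example.com/hats"]):
    print(f"Redirect 301 {urlsplit(old).path} {urlsplit(new).path}")
```

Feeding it the sitemap's URL list would produce one `Redirect 301` line per renamed page, while unchanged URLs are left alone.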
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky -
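GSC validation fails if any sampled URL still returns an unexpected status, so before resubmitting it is worth checking every old URL against the intended fix. A small sketch of that check in Python — the function and the status plan are illustrative assumptions, and the `observed` map would come from however you crawl the URLs (e.g. a Screaming Frog export):

```python
# Status each old URL should now return: {301, 308} for pages that were
# redirected, {404, 410} for pages deleted on purpose.
EXPECTED = {"redirected": {301, 308}, "deleted": {404, 410}}

def still_broken(observed, plan):
    """Return the URLs whose observed status doesn't match the fix plan.

    observed: {url: status_code} from a crawl of the old URLs
    plan:     {url: "redirected" or "deleted"}
    Any URL returned here is one GSC's validation run can still trip over.
    """
    return sorted(
        url for url, status in observed.items()
        if status not in EXPECTED[plan[url]]
    )

# e.g. /old-page was supposed to redirect but still answers 200
report = still_broken(
    {"/old-page": 200, "/moved": 301, "/deleted": 410},
    {"/old-page": "redirected", "/moved": "redirected", "/deleted": "deleted"},
)
```

Here `report` flags `/old-page` as the likely cause of the failed validation.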
Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
We're a SaaS company and have a pretty extensive help center resource on a subdomain (help.domain.com). This has been set up and managed over a few years by someone with no knowledge of SEO, meaning technical things like 404 links, bad redirects and http/https mixes have not been paid attention to. Every page on this subdomain is set to NOT be indexed in search engines, but we do sometimes link to help pages from indexable posts on the main domain. After spending time fixing problems on our main website, our site audits now flag almost solely errors and issues on these non-indexable help center pages every week. So my question is: is it worth my time fixing technical issues on a help center subdomain that has all its pages non-indexable in search engines? I don't manage this section of the site, and so getting fixes done is a laborious process that requires going through someone else - something I'd rather only do if necessary.
Technical SEO | mglover1988 -
How to fix: Attribute name not allowed on element meta at this point.
Hello, the HTML validator reports "Attribute name not allowed on element meta at this point" for all my meta tags. Yet, as I understand it, it is essential to keep the meta description for SEO, for example. I read a couple of articles on how to fix this, and one of them suggests using an HTML5 custom data attribute instead of name. Do you think I should try to validate my page? I will appreciate your advice very much!
Technical SEO | kirupa -
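That validator message usually means a `name` value is not one of the registered metadata names; `description` is registered and safe, so the complaint tends to point at custom names, which HTML5 expects to be `data-*` attributes instead. A hedged sketch of that check using Python's stdlib parser — the allowed-name set here is a small illustrative subset, not the full registry:

```python
from html.parser import HTMLParser

# Illustrative subset of registered meta names; the real registry is larger.
REGISTERED = {"description", "keywords", "author", "viewport", "robots", "generator"}

class MetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspect = []  # meta name= values a validator would reject

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            name = dict(attrs).get("name")
            if name and name.lower() not in REGISTERED:
                self.suspect.append(name)

checker = MetaChecker()
checker.feed('<meta name="description" content="ok">'
             '<meta name="my-widget" content="custom">')
# checker.suspect now holds ["my-widget"]; rewriting that tag with a
# data-* attribute (e.g. <meta data-my-widget="custom">) would validate.
```

The key point is that the fix applies only to the unregistered names, not to the SEO-relevant `description` tag.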
How to fix duplicate content caused by tags?
I use SEMrush, and the issue it is finding is that I have 30 duplicate content issues. All seem to be caused by the tags I add to my portfolio pieces. I have looked at my SEO settings (taxonomies, etc.) in the WordPress site and don't know what I am doing wrong... any advice on how to fix this? I have attached a screenshot: VsYv2wY
Technical SEO | cschwartzel -
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason WMT continues to report them. By last August, we had cleared over 10k 404s on our site, but this lasted only about 2 months and they started coming back. The "linked from" field gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap, and I've confirmed that they're not really being linked from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
Technical SEO | VoxxiVoxxi -
Fixing Crawl Errors
Hi! I moved my WordPress blog back in August and lost much of my site traffic. I recently found over 1,000 crawl errors in Webmaster Tools because some of my redirects weren't transferred, so we are working on fixing the errors and letting Google know. I'm wondering how long I should expect it to take for Google to recognize that the errors have been fixed and for the traffic to start returning? Thanks! Jodi - momsfavoritestuff.com
Technical SEO | JodiFTM -
Our homepage currently uses a Meta refresh. Is it worth $1,000 to get it fixed?
Look at http://www.ccisolutions.com. After the meta refresh takes place, the homepage URL looks like this: http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain
I am trying to convince management that it is worth spending $1,000 with our current provider to get it fixed. It is my understanding that this meta refresh could be preventing the value of our homepage from being passed down to our category pages, etc. Can anyone give me something concrete that I can use to convince management that the fix is worth $1,000? Or is it not worth fixing?
Technical SEO | danatanseo -
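One concrete argument for management: a server-side 301 is seen at the HTTP layer before any HTML is parsed and passes link equity, whereas a meta refresh forces crawlers to fetch and parse the page first. A small sketch that spots the meta refresh pattern described above — the sample markup is a guess at the pattern, not CCI Solutions' actual page source:

```python
from html.parser import HTMLParser

class RefreshFinder(HTMLParser):
    """Spot <meta http-equiv="refresh" ...>, which redirects client-side
    instead of sending a 301 at the HTTP layer."""
    def __init__(self):
        super().__init__()
        self.refresh_target = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("http-equiv", "").lower() == "refresh":
            # content typically looks like "0; url=/some/destination"
            content = a.get("content", "")
            if "url=" in content.lower():
                self.refresh_target = content.split("=", 1)[1]

finder = RefreshFinder()
finder.feed('<meta http-equiv="refresh" content="0; url=/StoreFront/main">')
# finder.refresh_target is the client-side redirect destination
```

Running something like this across a crawl export would show exactly which pages rely on meta refresh instead of a proper server-side redirect.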
Google doesn't rank the best page of our content for keywords. How to fix that?
Hello, we have a strange issue, which I think is a legacy problem. Generally, we are a job board for students in France: http://jobetudiant.net (jobetudiant == student job in French). We rank quite well (2nd or 3rd) on "Job etudiant <city>", with the right page (the one that lists all job offers in that city). So this is great. Now, for some reason, Google systematically puts another of our pages in front of that: the page that lists the job offers in the 'region' of that city. For example, check this page: the first link is a competitor, the 3rd is the "right" link (the job offers in Annecy), but the 2nd link is the list of jobs in Haute-Savoie (which is the 'département', equivalent to a county) in which Annecy is located... that's annoying. Is there a way to indicate to Google that the 3rd page makes more sense for this search? Thanks
Technical SEO | jgenesto -