Should I fix my 404 errors?
-
We have about 250 404 errors due to changing a lot of page names throughout our site.
I've read some articles saying to leave them and that eventually they will go away. Normally I would do a 301 redirect.
What's the best solution?
-
I would do the 301 redirects. I personally don't think 250 404s is too many to knock out within a few hours. If you were in the thousands, it would be a different concern.
Use Excel to make this faster, and 301 them.
-
You can map each 404 error page against its corresponding correct page.
Since the number of pages is high, as you said, make an entry in Excel noting each 404 error page and the corresponding new page on the site.
Then simply apply a 301 redirect to each corresponding page.
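If the server runs Apache, a small script can turn that spreadsheet into redirect rules. Below is a minimal sketch, assuming you've exported the Excel mapping to a hypothetical two-column CSV (old path, new path) named redirects.csv; the filenames and column layout are placeholders for illustration.

```python
# Minimal sketch: turn a CSV of old -> new paths into Apache "Redirect 301" rules.
# Assumes a hypothetical redirects.csv with two columns, e.g. "/old-page,/new-page".
import csv

with open("redirects.csv", newline="") as src, open("redirects.conf", "w") as out:
    for old_path, new_path in csv.reader(src):
        out.write(f"Redirect 301 {old_path.strip()} {new_path.strip()}\n")
```

Include the generated file from your Apache config (or paste the lines into .htaccess); on nginx you'd write equivalent `rewrite ... permanent;` rules instead.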
-
If it's at all possible to redirect to a relevant page, I would do that. If the 404s are for pages that are no longer relevant, or old content you really don't want people to be able to find any more, then a 404 is just fine.
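Once the redirects are live, it's worth confirming that each old URL actually answers with a single 301 to the intended target rather than a 404 or a redirect chain. Here's a quick stdlib sketch, reusing the hypothetical redirects.csv mapping from above and a placeholder domain:

```python
# Sketch: confirm each old path now answers 301 and points at the new page.
import csv
import urllib.error
import urllib.request

BASE = "https://www.example.com"  # placeholder; use your real domain

class NoFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects; we want the raw 301 status

opener = urllib.request.build_opener(NoFollow)

with open("redirects.csv", newline="") as f:
    for old_path, new_path in csv.reader(f):
        try:
            status, location = opener.open(BASE + old_path.strip()).status, ""
        except urllib.error.HTTPError as e:  # unfollowed redirects surface here
            status, location = e.code, e.headers.get("Location", "")
        # The Location header may be absolute or relative, so compare by suffix.
        ok = status == 301 and location.endswith(new_path.strip())
        print(f"{'OK' if ok else 'CHECK'} {old_path} -> {location or '(none)'} [{status}]")
```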
Related Questions
-
I have 6 URL errors in GSC showing a 500 error code. How do I fix them?
I am not sure how to fix some errors that are popping up in Google Search Console. The response codes showing are all 500. I need some advice as to how to fix these. What are my options?
Intermediate & Advanced SEO | pmull
-
My site shows a 503 error to Googlebot, but I can see the site fine. It's not indexing in Google. Help!
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot, though I can see the site fine. I have checked the source code and checked robots. It did have a sitemap parameter, but I removed it for testing. GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance.
Intermediate & Advanced SEO | SolveWebMedia
-
Getting too many links in Google search results; how do I fix this?
I'm a total newbie, so I apologize for what I am sure is a dumb question. I recently followed Moz suggestions for increasing visibility on my site for a specific keyword by including that keyword in more verbose page descriptions for multiple pages. This worked TOO well, as that keyword now brings up too many results in Google for these different pages on my site. Is there a way to compile them into one result with subpages, like, for instance, the attached image of a search for Apple? Do I need to change something in my robots.txt file to direct these to my main page? Basically, I am a photographer, and a search for my name now brings up each of my different photo gallery pages as multiple results; it's a little over the top. Thanks for any and all help!
Intermediate & Advanced SEO | jason5454
-
How to find all 404 dead links - Webmaster Tools only allows 1000 to be downloaded...
Hi guys, I have a question. I am currently working on a website that was hit by a spam attack. The website was hacked, and thousands of adult censored pages were created on the WordPress site. The hosting company cleared all of the dubious files, but this has left thousands of dead 404 pages. We want to fix the dead pages, but Google Webmaster Tools only shows, and allows you to download, 1000. There are a lot more than 1000. Does anyone know of any good tools that allow you to identify all 404 pages? Thanks, Duncan
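For what it's worth, when exports are capped like this, one workaround is to crawl the site yourself and log every internal URL that answers 404. Here's a minimal stdlib sketch with a placeholder domain, assuming a reasonably small site; a real project would likely add politeness delays and robots.txt handling, or use a dedicated crawler:

```python
# Minimal sketch: crawl internal links from the home page and report 404s.
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

BASE = "https://www.example.com"  # placeholder domain

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

seen, queue, dead = set(), [BASE + "/"], []
while queue:
    url = queue.pop()
    if url in seen or not url.startswith(BASE):
        continue  # stay on-site and avoid revisits
    seen.add(url)
    try:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        if e.code == 404:
            dead.append(url)
        continue
    parser = LinkParser()
    parser.feed(html)
    for href in parser.links:
        # Resolve relative links and drop #fragments before queueing.
        queue.append(urldefrag(urljoin(url, href)).url)

print("\n".join(dead) or "No 404s found")
```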
Intermediate & Advanced SEO | CayenneRed89
-
Should I remove all meta descriptions to avoid duplicates, as a short-term fix?
I'm currently trying to implement Matt Cutts' advice from a recent YouTube video, in which he said that it was better to have no meta descriptions at all than duplicates. I know that there are better alternatives, but, if forced to make a choice, would it be better to remove all duplicate meta descriptions from the site than to have duplicates (leaving a lone meta description on the home page, perhaps)? This would be a short-term fix prior to making changes to our CMS to allow us to add unique meta descriptions to the most important pages. I've seen various blogs across the internet which recommend removing all the tags in these circumstances, but I'm interested in what people on Moz think of this. The site currently has one meta description which is duplicated across every page.
Intermediate & Advanced SEO | RG_SEO
-
Strange URLs, how do I fix this?
I've just checked Majestic and have seen around 50 links coming from one of my other sites. The links all look like this:
http://www.dwww.mysite.com
http://www.eee.mysite.com
http://www.w.mysite.com
The site these links are coming from is an HTML site. Any ideas what's going on, or a way to get rid of these URLs? When I visit the strange URLs, such as http://www.dwww.mysite.com, it shows the home page of http://www.mysite.com. Is there a way to redirect anything like this back to the home page?
Intermediate & Advanced SEO | JohnPeters
-
How to identify 404s that get links from external sites (but not search engines)?
One of our sites had poor site architecture, causing tens of thousands of 404s to now be reported in Google Webmaster Tools. Any idea how to easily detect, among these thousands of 404s, which ones are coming from links on external websites (filtering out 404s caused by links from our own domain and 404s from search engines)? Crawl bandwidth seems to be an issue on this domain. Is there anything that can be done to accelerate Google removing these 404 pages from their index? Due to the number of 404s, manual submission in GWT one by one is not an option. Or do you believe that Google will automatically stop crawling these 404 pages within a month or so, and no action needs to be taken? Thanks.
Intermediate & Advanced SEO | lcourse
-
Thousands of 404 pages indexed - recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. He had a major issue with broken dynamic URLs that would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of useless pages to be indexed. These useless pages all brought up a 404 page that didn't return a 404 server response (it returned a 200), which created a ton of duplicate content and bad links (relative linking). Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors; a great start. I fixed all site errors that caused infinite redirects, then cleaned up the sitemap and submitted it. When I search site:www.(domainname).com, I am still getting an insane number of pages that no longer exist. My question: how does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over it. It's a slow process getting Google to remove pages that return a 404, and he is continuously dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google index just the pages in the sitemap.
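As an aside, the soft-404 fix described above is easy to spot-check: request a URL that cannot exist and confirm the server now answers 404 rather than 200. A tiny sketch, with a placeholder domain and probe path:

```python
# Sketch: confirm the error page now sends a real 404 status, not a "soft 404" (200).
import urllib.error
import urllib.request

probe = "https://www.example.com/this-page-should-not-exist-xyz123"  # placeholder
try:
    status = urllib.request.urlopen(probe).status
except urllib.error.HTTPError as e:
    status = e.code
print("hard 404" if status == 404 else f"check server config (got {status})")
```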
Intermediate & Advanced SEO | BeTheBoss