Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
-
We are building URLs dynamically with Apache rewrite.
When we detect that a URL matches certain valid patterns, we serve a script, which may then find that the combination of parameters in the URL does not exist. If this happens, we issue a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but instead redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly?
Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
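For illustration, here is a minimal sketch of the alternative being asked about: having the script answer with a 404 status at the requested URL itself, rather than 301-redirecting to a separate error page. The framework (Flask), routes, and parameter names are hypothetical stand-ins for the actual Apache-rewritten script.

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of parameter combinations that actually exist.
VALID_COMBINATIONS = {("shoes", "red"), ("shoes", "blue")}

@app.route("/products/<category>/<color>")
def product_page(category, color):
    if (category, color) not in VALID_COMBINATIONS:
        # Respond with a 404 at the requested URL itself,
        # instead of 301-redirecting to a separate error page.
        abort(404)
    return f"Product page for {category} in {color}"

@app.errorhandler(404)
def not_found(error):
    # A friendly error page can still be shown; what matters is that the
    # response status is 404, so crawlers can drop the bad URL from the index.
    return "<h1>Page not found</h1>", 404
```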
-
In Webmaster Tools, 404s are gradually going down for desktop, but are now suddenly going up for mobile, for pages that have not been linked to in months.
-
I suspect you may be right.
How long ago did they appear?
Are they gradually going down, or up?
-
Thanks a lot, SilverDoor.
The huge number of errors was caused by a problem with the internal site architecture several months ago, which got thousands of pages into the index that should not exist (and that also have no related relevant pages). That architecture is fixed now. Still, for some reason the mobile crawler in Webmaster Tools is now suddenly returning lots of 404s for the mobile view, which I suspect are coming from old 404s in Google's cache.
-
Hi,
You should never redirect to a 404 page.
Search engines are going to see this as an error and may even think you are trying to manipulate users around your site.
I would redirect the pages to a relevant page on your site.
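As a hedged illustration of that advice (not the actual implementation; the framework and URL names are made up), the script could 301 each retired URL to its closest relevant live page and fall back to a plain 404 only when no sensible target exists:

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical map of retired URLs to their most relevant live pages.
RELEVANT_REPLACEMENTS = {
    "/old-category/widgets": "/widgets",
    "/old-category/gadgets": "/gadgets",
}

@app.route("/old-category/<path:slug>")
def retired_url(slug):
    target = RELEVANT_REPLACEMENTS.get(f"/old-category/{slug}")
    if target:
        # A permanent redirect sends users (and most link equity) to a real page.
        return redirect(target, code=301)
    # No relevant page exists: respond with a 404 at this URL, not a redirect.
    abort(404)
```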
With regard to the huge number of errors now appearing, I need some more details to answer the question:
- How many 404s would you expect there to be (give an estimate)?
- How many 301 redirects do you think would have been implemented with the script?
- Have you recently changed anything else with the HTTP status codes?
SilverDoor
Related Questions
-
404 Errors flaring on nonexistent or unpublished pages – should we be concerned for SEO?
Hello! We keep getting "critical crawler" notifications on Moz because 404 codes are firing. We've checked each page and know that we are not linking to them anywhere on our site, they are not published, and they are not indexed on Google. It has only happened since we migrated our blog to HubSpot, so we think it has something to do with the test pages their developers had set up and that they are just lingering in our code somewhere. However, we are still concerned that having these codes fire implies negative consequences for our SEO. Is this the case? Should we be concerned about these 404 codes despite the pages at those URLs not actually existing? Thank you!
Intermediate & Advanced SEO | DebFF | Chloe
-
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing ranking for existing USA pages due to issues like duplicate content etc. What is the best way to approach this? This is my first foray into this and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German Language Content: https://company.com/de/products/product-name/
URL for rest of the world: https://company.com/en/products/product-name/
Intermediate & Advanced SEO | Caro-O
-
Problem with redirects in ColdFusion
How do you redirect pages in ColdFusion? When using ColdFusion and mod_rewrite, the URL never gets redirected by mod_rewrite.
Intermediate & Advanced SEO | alexkatalkin
-
301 redirect subdirectory to new domain
I'm planning on using 301 redirects to spin out a subdirectory of my current website to be its own separate domain. For instance, I currently have a website www.website.com and my writers write tech news at www.website.com/news. Now I want to 301 redirect www.website.com/news to www.technews.com. Will this have any negative impact on SEO? What are some steps that I can take to minimize these impacts?
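As a rough sketch of that page-to-page mapping (using the example domains from the question; in practice this would more likely be a rewrite rule on the old server than a standalone app), each /news/ URL could be permanently redirected to the same path on the new domain:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

NEW_DOMAIN = "https://www.technews.com"  # example domain from the question

@app.route("/news/", defaults={"subpath": ""})
@app.route("/news/<path:subpath>")
def news_moved(subpath):
    # Map each article to the same path on the new domain, keeping the query
    # string, so the redirect is page-to-page rather than everything-to-homepage.
    target = f"{NEW_DOMAIN}/{subpath}"
    if request.query_string:
        target += "?" + request.query_string.decode()
    return redirect(target, code=301)
```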
Intermediate & Advanced SEO | Chris_Bishop
-
301 redirection pointing to noindexed pages
I have a rather unusual situation where a recently launched affiliate site does not have any unique content, as it's all syndicated content. For that reason we are currently using the noindex,nofollow meta tags to keep the pages out of the search engines' index until we create unique content for the pages. The problem is that, due to a very tight timeframe with rebranding, we are looking at 301 redirecting (on a page-to-page basis) another high-authority legacy domain to this new site before we have had a chance to add unique content to it and remove the noindex,nofollow tags. I would assume that any link authority normally passed through the 301 would be lost in this scenario, but I'm uncertain what the broader impact might be. Has anyone dealt with a similar scenario? I know this scenario is not ideal, and I would rather wait until the unique content is up and the noindex tags are removed before launching the 301 redirect of the legacy domain, but there are a number of competing priorities at play outside of SEO.
Intermediate & Advanced SEO | LosNomads
-
What are the effects of having Multiple Redirects for pages under the same domain
Dear Mozers, First of all, let me wish you all a very happy, prosperous, healthy, joyous and successful New Year! I'm trying to analyze one of the websites, Web Hosting UK Com Ltd., and during this process I've had this question running through my mind. This project has been live since 2003, and since then there have been changes made to the website (obviously). New pages have been added, and some pages have even been overwritten, with changes in the URL structures too. Now, coming back to the question: if I had a particular URL structure in the past when the site debuted, and to date the structure has been changed three times (for example) with a 301 redirect for every back-dated structure, would it impact the site's performance SEO-wise? And let's say that there are hundreds of such redirects under the same domain: don't you think that after a period of time we should remove the past pages/URLs from the server? That would certainly increase the 404 (page not found) errors, but that can be taken care of. How sensible is it to keep redirecting the bots from one URL to the other when they only visit a site for a short, stipulated time? To make it simple, let me explain it with a real-life scenario. Say I was staying at a place A, then switched to a different location in another county, say B, and then to C and so on, and finally got settled at a place G. When I move from one place to another, I leave a note of the next destination I'm moving to, so that any courier/mail etc. can be delivered to my current whereabouts. In such a case there's less chance that the courier would travel to all the destinations to deliver the package. Similarly, when a bot visits a domain and finds multiple redirects, don't you think it would lose efficiency in crawling the site? Of course, in my opinion the redirects are important, BUT they should be there (in .htaccess) only for a period of, say, 3-6 months. Once the search engine bots know about the latest pages, the past pages/redirects should be removed. What are your opinions about this?
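To make the place-A-to-G analogy concrete, here is a small sketch (URL names are hypothetical) of how chained rules can be flattened so that every legacy URL points straight at its final destination, allowing the intermediate hops to be retired:

```python
# Hypothetical chained redirects: A -> B -> C -> G.
REDIRECTS = {
    "/place-a": "/place-b",
    "/place-b": "/place-c",
    "/place-c": "/place-g",
}

def flatten(redirects):
    """Resolve each source URL to the final target of its redirect chain."""
    flat = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

print(flatten(REDIRECTS))
# {'/place-a': '/place-g', '/place-b': '/place-g', '/place-c': '/place-g'}
```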
Intermediate & Advanced SEO | eukmark
-
Can we retrieve all 404 pages of my site?
Hi, can we retrieve all the 404 pages of my site? Is there any syntax I can use in Google search to list just the pages that give a 404? Is there a tool or site that can scan all the pages in the Google index and give me this report? Thanks
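As far as I know there is no Google search syntax that filters by status code, so one hedged approach is to check a URL list yourself (for example, exported from your sitemap or analytics). The URLs below are placeholders, and the requests library is assumed to be installed:

```python
import requests

# Placeholder URLs; in practice, load these from a sitemap or analytics export.
URLS_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/some-old-page/",
]

def find_404s(urls):
    """Return the subset of URLs that currently respond with HTTP 404."""
    not_found = []
    for url in urls:
        try:
            response = requests.head(url, allow_redirects=True, timeout=10)
        except requests.RequestException:
            continue  # unreachable URLs are a separate problem from 404s
        if response.status_code == 404:
            not_found.append(url)
    return not_found

if __name__ == "__main__":
    for url in find_404s(URLS_TO_CHECK):
        print(url)
```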
Intermediate & Advanced SEO | mtthompsons
-
Is it ok to use both a 301 redirect and rel="canonical" at the same time?
Hi everyone, I'm sorry if this has been asked before. I just wasn't able to find a response in previous questions. To fix the problems on our website regarding duplication, I have the possibility to set up 301s and, at the same time, modify our CMS so that it automatically sets a rel="canonical" tag for every page that is generated. Would it be a problem to have both methods set up? Is it a problem to have a rel="canonical" tag on a page that is redirecting to another one? Is it advisable to have a rel="canonical" tag on every single page? Thanks for reading!
Intermediate & Advanced SEO | SDLOnlineChannel