Can you redirect a URL that returns a 410 (Gone) error? I see many 410s that should be redirected to an existing page.
-
We have 150,000 410 (Gone) errors, and many of them should be redirected to an existing URL. This is the result of a complete website redesign, including new navigation and a new web platform. I believe IT may have inadvertently marked many 404s as 410s. Can I fix this, or is a 410 error permanent?
Thank you for your help.
-
Oh! Thanks! I will pass that on. As you probably guessed, I am not as well versed in server errors as I would like to be. I am more of an SEO Analyst / Marketing person. I don't actually make changes at the server level. I am learning a lot from the problems that arose out of a website redesign. I am new and trying to get everything fixed.
Again, thank you for your help.
-
The 410 isn't set in stone anywhere; it's simply the status code the server returns when that URL is requested. A redirect rule in the .htaccess file is processed before the server builds that response, so the crawler is sent a 301 to the new page and never sees the 410.
It's just like a page that doesn't exist, which would normally return a 404. By adding a redirect in the .htaccess file, you point crawlers and visitors to a new URL and the 404 is never generated.
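I can't know your exact URLs, so the paths below are hypothetical placeholders, but a rule along these lines in .htaccess is roughly what I mean (this assumes an Apache server with mod_alias and .htaccess overrides enabled):

# Return a 301 to the new page instead of the 410 for a single retired URL
Redirect 301 /old-category/old-page.html /new-category/new-page.html
# The target can also be a full URL, e.g. http://www.example.com/new-category/new-page.html

One line like that per retired URL is enough when there's no pattern to exploit.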
-
I am just trying to figure out if they can be redirected. From what I have read, once something is labeled a 410 that's the end; you can't go back and change it into a redirect to a live page. Is that correct, or can we redirect them? I have software engineers who can do the redirects. I just want to make sure that I'm asking them to do something that is possible.
Again, thank you for your help.
-
There must be some kind of logic mapping your old URLs to your new ones? If you post examples, I can help with the redirect code.
Also, are these pages being linked to from anywhere? If you click on the errors in Google Webmaster Tools, it should show you where these pages are being linked from. If the links to these pages no longer exist, the errors will drop off by themselves eventually.
-
Thank you for your help.
I am trying to get the 410s that are listed in Google Webmaster Tools (under the Not Found crawl errors) redirected to an actual existing page. What I think happened is that soft 404s were improperly designated by IT as 410s. Almost all of them can be redirected to an actual webpage, and I would hate to lose that traffic.
There really isn't a common pattern.
This was the result of implementing a new design and layout for the website, with pages that were not properly redirected. Now they come in as soft 404 errors.
Again, thank you for your help.
-
Is there a common pattern to be found in the URLs that are generating the 410 errors? If so, it should be possible to 301 redirect those URLs via your .htaccess file. Also, Google is only crawling those URLs because links to them still exist somewhere. If you can remove the links to those pages, they should stop getting crawled.
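To give a rough idea of what that could look like, here is a hedged sketch with made-up paths (it assumes Apache with mod_alias available in .htaccess):

# Hypothetical pattern: every URL under the old /shop/ path moves to /products/
RedirectMatch 301 ^/shop/(.*)$ /products/$1

A handful of pattern rules like that can often cover tens of thousands of old URLs, which is far more manageable than listing 150,000 redirects one by one.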
Related Questions
-
Spike in server errors
Hi, we've recently changed shopping cart platforms. In doing so, a lot of our URLs changed, but I 301'ed all of the significant landing pages (as determined by Google Analytics) prior to the switch. However, WMT is now warning me about a spike in server errors for all the pages that no longer exist. They are only being crawled because they used to exist or are linked from pages that used to exist and no longer do. Is this something I should worry about, or should I let it run its course?
Technical SEO | absoauto0
-
HTTP Vary: User-Agent - Server or Page Level?
Looking for any insights regarding the usage of the Vary HTTP header, mainly around the idea that search engines will not like having a Vary HTTP header on pages that don't have a mobile version, which means the header will need to be implemented on a page-by-page basis. Additionally, does anyone have experience with the usage of the Vary HTTP header and CDNs like Akamai? Google still recommends using the header, even though it can present some challenges with CDNs. Thanks!
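To illustrate what I mean by page-by-page, here is a hedged sketch of how it could be done in an Apache .htaccess file; the filename pattern is purely hypothetical and it assumes mod_headers is available:

# Append Vary: User-Agent only on the templates that actually have a mobile variant
<FilesMatch "^(product|category)-.*\.php$">
    Header append Vary "User-Agent"
</FilesMatch>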
Technical SEO | burnseo0
-
How does this rank? - a page that is 301 redirected
How does a 301ed page rank in Google? In Google I searched for "ikea.ca", which is set up as a 301 redirect to www.ikea.com/ca/en, and was surprised to see the URL www.ikea.ca actually ranking. The result looked like this: IKEA Canada - ikea.ca/ - "IKEA Featuring Scandinavian modern style furniture and accessories. Include storage options, lighting, decor products, kitchen appliances and beds. Bedroom - Kitchen - Living Room - IKEA North York"
Technical SEO | Morris770
-
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site that has a home page containing 2 links to each category page. One of the links is a text link and one is an image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link that it spiders, with the anchor text/alt text of the second being ignored. This is not my question, however. My question is about the page rank that is passed to each category page. Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the 2nd identical link on a page, only one lot of this divided-up PR will be passed to each category page rather than 2 lots, hence horribly watering down the 'link juice' that is being passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | QubaSEO0
-
How to avoid 404 errors when taking pages down?
So... we are running a blog that was supposed to have great content. After working on its SEO for a while, I discovered there is too much keyword stuffing and a lot of WordPress SEO tricks that were supposed to make it rank better. In fact, that worked, but I'm not taking the risk of getting slapped by the Google puppy-panda. So we decided to restart our blog from zero and make a better attempt. Every page was already ranking in Google. SEOmoz hasn't crawled it yet, but I'm really sure the crawler would report a lot of 404 errors. My question is: can I avoid these errors with some tool in Google Webmaster Tools or the sitemaps, or should I set up some rel=canonicals or 301 redirects? Does Google penalize me for that? It seems kind of obvious to me that the answer is YES. Please help 😉
Technical SEO | ivan.precisodisso0
-
Is there an easier way from the server to prevent duplicate page content?
I know that using either a 301 or 302 will fix the problem of duplicate page content. My question would be: is there an easier way of preventing duplicate page content when it's an issue with the URL? For example: URL: http://example.com versus URL: http://www.example.com. My guess, as it says here, would be that it's a setting issue with the server. If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
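To illustrate the kind of server setting I mean, here is a hedged sketch of one common approach in .htaccess; example.com is just a placeholder, and it assumes an Apache server with mod_rewrite enabled:

# Force the www hostname with a single sitewide 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]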
Technical SEO | brianhughes2
-
Domain.com and domain.com/ redirect (error)
When I view my campaign report, I'm seeing duplicate content and duplicate meta flagged for mydomain.com and mydomain.com/ (with a trailing slash). I already applied a 301 redirect as follows: redirect 301 /index.php/ /index.php Where am I messing up here?
Technical SEO | cgman0
-
On-page 301 redirect for HTML pages
For PHP pages you've got:
<?php
// Send a permanent 301 redirect to the new URL
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.example.com" );
exit;
?>
Is there anything similar for HTML pages? Or is placing this line in the .htaccess file:
redirect 301 /old/old.htm http://www.you.com/new.php
the only way to properly 301 redirect HTML pages? Thanks!
Technical SEO | shupester