What to do about old URLs that don't logically 301 redirect to the current site?
-
Mozzers,
I have changed my site URL structure several times.
As a result, I now have a lot of old URLs that don't really logically redirect to anything in the current site.
I started out 404-ing them, but it seemed like Google was penalizing my crawl rate AND it wasn't removing them from the index even after they had been crawled several times. There are way too many (>100k) to use the URL removal tool, even at a directory level.
So instead I took some advice and changed them to return a 200, but with a "noindex" meta tag, and set them to render no content. I get fewer errors, but now I have a lot of pages like this.
Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 with noindex, or (c) is there something else I can do? A 410, maybe?
Thanks!
-
"So instead I took some advice and changed them to 200, but with a "noindex" meta tag and set them to not render any content. I get less errors but I now have a lot of pages that do this."
I would not recommend keeping it that way. You could mass-redirect them to the sitemap page if they are passing PageRank and/or some traffic and there is no other logical place to point them.
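If the old URLs follow recognizable patterns, a handful of prefix rules can cover the whole batch. Here is a minimal sketch of the idea, assuming a Python/Flask front end; the legacy prefixes and the /sitemap target are made-up examples, and the same logic could just as easily be expressed as .htaccess or nginx rewrite rules:

```python
# Minimal sketch, assuming a Flask app; the legacy prefixes and the
# /sitemap target are hypothetical examples.
from flask import Flask, redirect, request

app = Flask(__name__)

# Old URL structures that have no logical equivalent on the current site.
LEGACY_PREFIXES = ("/old-catalog/", "/archive-2012/")

@app.before_request
def redirect_legacy_urls():
    # One rule mass-redirects every URL under the retired sections
    # with a permanent (301) redirect to a single consolidated page.
    if request.path.startswith(LEGACY_PREFIXES):
        return redirect("/sitemap", code=301)
```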
404s are not really something that can hurt you, provided they are coming from external sources and you aren't linking from your own site to dead pages on your own site. If you do have such internal links, fix them at the source.
-
I don't think 404 errors hurt your site. If you have that many pages, Google is most likely crawling your site a lot anyway. Have you set the crawl frequency in your sitemap? On bigger sites that get frequent updates, we set the crawl frequency to daily rather than weekly.
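For reference, that crawl-frequency hint is the optional changefreq element in each sitemap entry. A quick illustration in Python (the URL is a made-up example):

```python
# Quick illustration of the <changefreq> hint in a sitemap <url> entry.
# The URL is a made-up example.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/category/widgets/"
ET.SubElement(url, "changefreq").text = "daily"  # "daily" rather than "weekly"

print(ET.tostring(urlset, encoding="unicode"))
```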
If possible, see whether there are any top-level directories you can submit a URL removal request for. Hopefully that can speed up the process of getting the URLs removed, because this is something Google can take a long time to deal with on its own. After changing websites, we still had 404 errors six months later, even after submitting the URL removal requests.
Another option is to have the pages return a 410 rather than a 404. A 410 tells the search engine that the page is gone and will not be coming back. If you are using some form of cart system or CMS, there may be a way to apply the status code to a large number of pages at once, rather than trying to code 100k pages manually.
"410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know–or has no facility to determine–whether or not the condition is permanent, the status code 404 (Not Found) should be used instead of 410 (Gone). This response is cacheable unless indicated otherwise."Worse case scenero, you could set them to no-index, or just leave them be. Even if they dont lead anywhere logically, they could still bring you traffic. Or redirect them to the closest thing that is on the site currently.
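If you go the 410 route, here is a minimal sketch of how one catch-all rule can cover a whole retired section, assuming a Python/Flask front end; the /products-v1/ prefix is a hypothetical example, and a CMS plugin or an .htaccess rule can achieve the same result:

```python
# Minimal sketch, assuming a Flask app; /products-v1/ is a hypothetical
# example of a retired URL pattern.
from flask import Flask, abort

app = Flask(__name__)

@app.route("/products-v1/<path:slug>")
def retired_product_pages(slug):
    # One catch-all rule covers every URL under the retired section,
    # so there is no need to touch 100k pages individually.
    abort(410)  # "Gone": the page was removed deliberately and permanently
```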
-
JC,
When you say you "...started out 404-ing them... seemed like Google was penalizing my crawl rate...", etc.: I have not seen evidence that Google, even algorithmically, has any real issue with 404s. If your site has 500K pages and 100K of them are 404'd, I do not think it would be a problem for Google per se. (You might have a searcher problem if these were pages that were bookmarked, had lots of links, etc.) My caution would be that if you have a lot of pages on the site with links that still go to the 404 pages, you could run into UX issues.
For me, I would go with the 404s. I think they will get removed over time.
Best
-
When necessary, redirect relevant pages to closely related URLs. Category pages are better than a general homepage.
If the page is no longer relevant, receives little traffic, and a better page does not exist, it's often perfectly okay to serve a 404 or 410 status code.
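As a rough sketch of that idea, assuming a Python/Flask front end (the retired prefixes and category targets are made-up examples), a small mapping table can send each old section to its closest current category instead of dumping everything on the homepage:

```python
# Rough sketch, assuming a Flask app; the old prefixes and the category
# targets are made-up examples.
from flask import Flask, redirect, request

app = Flask(__name__)

# Map each retired URL section to the closest current category page.
PREFIX_TO_CATEGORY = {
    "/mens-shoes-2014/": "/category/mens-shoes/",
    "/blog-archive/": "/blog/",
}

@app.before_request
def redirect_retired_sections():
    for old_prefix, category_url in PREFIX_TO_CATEGORY.items():
        if request.path.startswith(old_prefix):
            return redirect(category_url, code=301)
    # URLs with no sensible match fall through and can 404/410 as usual.
```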
-
You could redirect them to something even remotely relevant, even if that's the homepage at the end of the day. Whatever you do, it is going to take time and it's going to give you some sort of headache.
What would best suit a user who lands on an old link or otherwise reaches the page? Answering that is the best way to find a solution. A good 404 page or a sensible redirect tends to help here.
Best of luck though.