Tricky 301 question
-
A friend has relaunched a website, but his web guys (he didn't consult me!) didn't do any 301s, and now traffic has unsurprisingly tanked.
The old site and database no longer exist, and there are now 2,000+ 404s.
Any ideas on how to do the 301s from the old URLs to the new product URLs WITHOUT it being a massive manual job?
-
That's my point: you only need to worry about the pages that had external links.
Thanks
-
Thanks
-
Pages don't just get equity from external links, of course. If a category page has 10 links to it, the product pages linked to from that page benefit. The wholesale drop in rankings isn't because every page had an external link to it.
-
I don't know what you mean about link equity; if there is no link pointing to the page, then there is nothing lost.
As for search engines finding a lot of 404s, they will remove them from the index after a while. No problem there: you are returning the correct status code, which is what they want. This will allow them to clean up their index and stop crawling those pages.
-
If the majority of URLs follow no logic, then it makes things a bit tricky in regards to minimizing the amount of work.
I once had a very active and large website with about 500-1,000 single-line rewrite rules (one for each URL) in my .htaccess. Surprisingly, it did not slow the server down at any noticeable rate, unless you are very sensitive to milliseconds, and even then one trial could easily differ from the next because of regular internet congestion. My point is, nobody ever noticed.
Here are a few ways I would handle this job to get through it as quickly and effortlessly as possible.
The more aggressive and time-consuming approach:
I would export all the URLs that were changed from phpMyAdmin, or whatever MySQL administration tool you use, to a spreadsheet. In that spreadsheet, I would add the original URL next to each new one.
Then, with the old URL (column A) and the new URL (column B), I would write a formula to output the correct rewrite (column C) and simply copy and paste that formula down all the rows it applies to. You might need to break up the URLs to grab the right pieces for your formula. Of course, use regex where you can and keep your .htaccess rewrites to a minimum.
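As a rough sketch (the column layout and paths here are made up for illustration), a formula like ="Redirect 301 "&A1&" "&B1 dragged down the sheet would generate lines you can paste straight into your .htaccess:
# Each generated line maps one retired URL to its replacement (example paths only)
Redirect 301 /old-category/old-product-name /products/new-product-name
Redirect 301 /old-category/another-old-product /products/another-new-product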
If that is still too much work, hire someone to do it through elance.com
The somewhat sloppy pace-yourself approach:
Another approach you could take is to just monitor Google Webmaster Tools for all the page-not-found errors. Then, once a day or once a week, grab those URLs, create the rewrites, and mark them as fixed in Webmaster Tools.
The reason I say this is somewhat sloppy is that you might find you could have used regex in a lot of instances to handle all those missing URLs more efficiently.
But it may be a good way of staying on track with Google and handling the issues only as they arise, so it does not feel like such a mammoth task.
-
Thanks Alan, yes, they have good external links to many pages. They retail a very niche product and have a lot of forum, review, and social-type links. It might be, though, that if need be they just focus mostly on 301s for the pages with those links. As best practice I am in favour of 301'ing regardless of external links, as the link equity gets messed up and causes ranking issues, as in this case, and because thousands of 404s send a signal to the engines about the amount of crawl resource they will waste on the site.
-
Thanks Donna & Luis. Luis is right, I'm looking for a way for this not to be a mammoth manual task for their developer.
-
Thanks, the regex is a good idea and might be part of the solution for some URLs at least, but there seem to be some discrepancies in logic between the old and new product URLs, and some of the new product URLs are actually still the same as the old ones (which of course is fine).
-
Thanks Luis, unfortunately neither 1 nor 2 is ideal.
1. I don't think there is much logic in the change of URL structure between the old and new product URLs, which makes that idea impossible.
2. That's going to be a last resort.
Andy
-
Do you know if they had any external links?
If they don't have external links, then I would just let them 404.
Some people have some weird ideas about what 301s do. They simply redirect a request: a request to A is told to remake the request to B, so the crawler will follow it that way and award the PageRank to the new page, with a small loss on each redirect. If there are no external links, what is there to gain? Don't complicate your site with unnecessary redirects. There is a small argument that the pages may have been bookmarked at the old URL, but I think that argument is so weak I would not bother.
-
Yeah. I heard him. I guess I'm saying "probably not".
I like how you're keeping us honest though, Luis. I don't like it when people respond with what they want to say rather than with an answer to the specific question.
-
Donna,
Andy has been very specific about this: "WITHOUT it being a massive manual job." Hehe, thanks for supporting my answer.
Luis
-
It really depends on the nature, link, and traffic patterns of your site, Andy. If the vast majority of those 2,000+ 404s are coming from pages that should never have been indexed in the first place, you can probably get away with Luis's second suggestion. If they're differentiated, valuable, and show evidence of incoming links and traffic, you've got some work ahead of you.
You might be able to streamline the process by inventorying and grouping like pages, then doing group redirects. But I suggest you do some analysis first to determine whether the effort is warranted.
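For example, if a whole retired section maps to one new landing page, a single pattern can catch the entire group (a rough sketch assuming Apache, with made-up paths):
# Rough sketch: send everything under a retired section to its closest new equivalent
RedirectMatch 301 ^/old-category/.*$ /new-category/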
-
2,000+ is a lot of URLs to work through, but you can most likely get through them quickly with a few good regular-expression 301 redirects in your .htaccess.
If you have a pretty consistent form from the old URL to the new one, this will be a piece of cake.
ex:
old URL: this/was/cool
new URL: this/is/cool
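A single rule can then cover every URL of that shape; here's a rough sketch assuming Apache mod_rewrite, using the made-up paths above:
# Rough sketch: 301 any request under this/was/ to the same path under this/is/
RewriteEngine On
RewriteRule ^this/was/(.*)$ /this/is/$1 [R=301,L]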
However, if there is really no rhyme or reason to the newly formed URLs, this could end up taking a considerable amount of time.
I would look into writing 301 redirects with regular expressions in .htaccess (I'm assuming your server is Apache and uses .htaccess).
There are a number of resources for doing this, and even one here at moz.com
https://moz.com/learn/seo/redirection
-
Hello Andy,
1. Try this: http://webdesign.about.com/od/htaccess/ht/redirect-an-entire-site-using-htaccess.htm
2. Second (faster) solution: you could add this line of code to your .htaccess file, and all the users currently hitting 404s will be shown the homepage:
ErrorDocument 404 /
But pay attention... 404s are perfectly normal if the page no longer exists. For user experience, you should only ever use a 301 redirect if the page that no longer exists points to an equivalent page, i.e. about cars to cars, about rabbits to rabbits. Maybe the only solution is creating a specific 404 landing page for this (with links to different sections of your site).
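As a rough sketch, if you create such a page (say at /404.html — the filename here is just an example), the line becomes:
# Rough sketch: serve a custom 404 landing page; visitors still get a 404 status code
ErrorDocument 404 /404.html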
Hope this helps,
Luis