How do you explain the problem with several redirects to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he is moving pages around on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are making a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using several redirects?
What can I do to make sure that they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that the below is my best understanding of how the process works, and it is what I base my actions on. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database. That data is then processed by the search engine. If Page A is indexed, the URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then when the crawler discovers the 301 redirect, the search engine will update the URL in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database would be displayed as Page C in SERPs.
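The bookkeeping above can be sketched as a toy model. To be clear, the dict-based redirect map and the URLs are illustrative assumptions for the example; no search engine actually works off a simple dictionary like this:

```python
# Toy model of a search engine resolving a redirect chain to the final
# URL it would display in SERPs. The redirect map is hypothetical.

def resolve_final_url(url, redirects, max_hops=10):
    """Follow 301 entries in `redirects` until a URL with no redirect is found."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or too many hops")
        seen.add(url)
    return url

redirects = {
    "/page-a": "/page-b",  # Page A 301s to Page B
    "/page-b": "/page-c",  # later, Page B 301s to Page C
}

print(resolve_final_url("/page-a", redirects))  # /page-c
print(resolve_final_url("/page-b", redirects))  # /page-c
```

Both old URLs end up displayed as Page C, which matches the behavior described above.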
Despite the search engine's database having the correct URL to display in SERPs, crawlers are not provided this information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to normal factors such as nofollow, crawl budget, etc. If you modify the initial redirect to go from Page A straight to Page C, the crawler will detect the changed header and the search engine will update its records accordingly.
The above was shared with respect to the appearance of the URL in SERPs, but it should work identically for the backlinks as well. Rather than forwarding the backlinks from Page A to Page B and then on to Page C, those links would be forwarded directly to Page C.
"So instead of it redirecting from A to B then C, we write a new redirect for A to C. Is this better? If so, why?"
If you modify the existing redirect to go from Page A to Page C, it is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
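That "rewrite every old rule to point at the final destination" cleanup can itself be automated. Here is a minimal sketch, again assuming a hypothetical dict-based redirect map rather than any particular server's config format:

```python
# Flatten a redirect map so every old URL points directly at its final
# destination: A -> B -> C becomes A -> C and B -> C (one hop each).

def flatten_redirects(redirects):
    flat = {}
    for src in redirects:
        dst = redirects[src]
        seen = {src}
        while dst in redirects:  # walk the chain to its end
            if dst in seen:
                raise ValueError(f"redirect loop at {dst}")
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst          # record a single-hop rule
    return flat

chain = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(flatten_redirects(chain))  # {'/page-a': '/page-c', '/page-b': '/page-c'}
```

Running something like this against an exported rule list each month would catch new chains before they accumulate, whatever server or platform actually serves the redirects.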
-
Thanks Ryan,
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B then C, we write a new redirect for A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The consideration I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers I will use in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st link loses 10%, 2nd link loses 10%x2=20%, total 30% loss
Redirect #3 = 1st link loses 10%, 2nd link loses 20%, 3rd link loses 30%, total 60% loss
Redirect #4 = 100% loss.
Again, the numbers are likely not that dramatic, but it helps get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects doesn't lose much." We know the trust factors for a site diminish rapidly, in an amplified manner, a few links away from the source. We know PR evaporates almost completely four links deep into a site. Even top-PR sites like DMOZ and the Yahoo! Directory have pages that are not indexed because not enough PR is passed through their links to reach deep pages. It is logical to think the same concept applies to redirects; they are another form of following links.
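The illustrative loss model above can be written out in a few lines. To repeat the hedge: these per-hop percentages are the answer's own made-up teaching numbers, not anything published by a search engine:

```python
# Teaching model from the answer above: in a chain of n redirects, hop i
# is assumed to cost i * 10% of the original link juice, so losses
# compound much faster than "10% per hop" intuition suggests.

def chain_loss(num_redirects, per_hop_pct=10):
    """Return (per-hop losses, total loss capped at 100%) for a redirect chain."""
    losses = [per_hop_pct * i for i in range(1, num_redirects + 1)]
    total = min(sum(losses), 100)  # can't lose more than everything
    return losses, total

for n in range(1, 5):
    losses, total = chain_loss(n)
    print(f"{n} redirect(s): per-hop losses {losses}% -> total {total}% lost")
```

This reproduces the table above: 10% for one redirect, 30% for two, 60% for three, and a total wipeout at four.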