How do you explain the problem with multiple redirects to a client?
-
I have a client who has done a lot of link building and recently migrated his site from an old platform to a more SEO-friendly one, but now he is moving pages around on the new site:
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are creating a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using several redirects?
What can I do to make sure they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that what follows is my best understanding of how the process works; it is what I base my actions on. I feel my understanding is valid, but the examples could probably use a bit of work, and I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database, and that data is then processed by the search engine. If Page A is indexed, the URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then once the crawler discovers the 301 the search engine will update the URL shown in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database would be displayed as Page C in SERPs.
Despite the search engine's database having the correct URL to display in SERPs, crawlers are not given this information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to the normal factors such as nofollow, crawl budget, etc. If you modify the initial redirect to point from Page A straight to Page C, the crawler will detect the changed header and the search engine will update its records accordingly.
The above describes the URL shown in SERPs, but it should work the same way for backlinks. Rather than forwarding the backlinks from Page A to Page B and then on to Page C, those links would be forwarded directly to Page C.
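To make the chain concrete, here is a minimal sketch of how a crawler resolving chained 301s ends up at the final page. The URLs and the redirect map are hypothetical, and real crawlers are of course far more complex; this only illustrates the hop-by-hop following described above:

```python
# Hypothetical redirect map: each source URL 301-redirects to its target.
REDIRECTS = {
    "/page-a": "/page-b",  # original migration
    "/page-b": "/page-c",  # page moved again
}

def resolve(url, redirects, max_hops=10):
    """Follow redirects hop by hop until a final URL is reached, guarding against loops."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise RuntimeError("redirect loop or chain too long")
        seen.add(url)
    return url, hops

final, hops = resolve("/page-a", REDIRECTS)
print(final, hops)  # -> /page-c 2
```

Every extra hop in the chain is one more round trip a crawler (or a visitor's browser) has to make before reaching the content.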
So instead of it redirecting from A to B and then to C, we write a new redirect for A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A to Page C, it is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
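Operationally, "write a new redirect for A to C" means rewriting every rule so it points at the final destination instead of at the next hop. A sketch of that flattening step, using the same hypothetical map as the discussion (not any particular server's config format):

```python
def flatten(redirects):
    """Rewrite each redirect so every source points directly to its final target."""
    flat = {}
    for src in redirects:
        dst = redirects[src]
        seen = {src}
        # Walk the chain to its end, stopping if we revisit a URL (a loop).
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

chain = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(flatten(chain))  # every source now 301s straight to /page-c
```

Note that the intermediate rule (B to C) is kept, just repointed: any old links to Page B still resolve in a single hop.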
-
Thanks Ryan,
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B and then to C, we write a new redirect for A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The point I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers in the example below are simply how I explain it when asked; I don't have any solid math to back them up, and as we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st hop loses 10%, 2nd hop loses 10% x 2 = 20%, total 30% loss
Redirect #3 = 1st hop loses 10%, 2nd loses 20%, 3rd loses 30%, total 60% loss
Redirect #4 = 100% loss
Again, the numbers are likely not that dramatic, but the example helps get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects don't lose much." We know the trust factors for a site diminish rapidly, in an amplified manner, a few links away from the source. We know PageRank on a site evaporates almost completely four links deep. Even high-PR sites like DMOZ and the Yahoo Directory have deep pages that are not indexed because not enough PageRank is passed through their internal links to reach them. It is logical to think this same concept applies to redirects; they are another form of following links.
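The illustrative numbers above can be written as a tiny model. To be clear, the 10%-per-hop figure is the answerer's made-up teaching number, not a published search engine value; the point is only how quickly per-hop losses compound:

```python
def cumulative_loss(hops, per_hop=0.10):
    """Sum the per-hop losses in the example: hop n loses n * per_hop, capped at 100%."""
    total = sum(n * per_hop for n in range(1, hops + 1))
    return min(total, 1.0)

for hops in range(1, 5):
    print(hops, f"{cumulative_loss(hops):.0%}")
# 1 -> 10%, 2 -> 30%, 3 -> 60%, 4 -> 100%
```

Because each hop's loss grows, the total falls off far faster than the "a 301 only loses a drop" intuition suggests.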