How do you explain the problem with several re-directs to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he is moving pages around on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are creating a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using several redirects?
What can I do to make sure that they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that the below is my best understanding of how the process works, and it is what I base my actions upon. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database. That data is then processed by the search engine. If Page A is indexed, its URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then once the crawler discovers the redirect, the search engine will update the URL in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database would be displayed as Page C in SERPs.
Even though the search engine's database holds the correct URL to display in SERPs, crawlers are not given that information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to the usual factors such as nofollow, crawl budget, etc. If you modify the initial redirect to point from Page A to Page C, the crawler will detect the changed header and the search engine will update its records accordingly.
The above describes how the URL appears in SERPs, but the same process should apply to backlinks as well. Rather than forwarding the backlinks from Page A to Page B and then on to Page C, those links would be forwarded directly to Page C.
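To make the hop-by-hop behavior concrete, here is a minimal sketch (the URLs are hypothetical) of how a chain of 301s is followed. Each redirect is one extra round trip before the crawler reaches the final page:

```python
def resolve_redirects(url, redirects, max_hops=10):
    """Follow a chain of 301 redirects and return (final_url, hop_count)."""
    hops = 0
    # Keep following until we reach a URL that no longer redirects,
    # or hit a sanity limit (real crawlers also cap chain length).
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

# Page A was redirected to Page B, then Page B to Page C.
chain = {
    "/page-a": "/page-b",
    "/page-b": "/page-c",
}
print(resolve_redirects("/page-a", chain))  # ('/page-c', 2) -- two hops to land
```

A visitor (or crawler) starting at Page A needs two hops; if the first redirect were rewritten to point straight at Page C, it would need only one.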
So instead of redirecting from A to B and then to C, we write a new redirect from A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A to Page C, it is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
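In practice, flattening the chain is a small edit to the server config. A sketch for Apache's .htaccess (the paths are hypothetical; note the second rule is kept so old backlinks pointing at Page B still resolve in a single hop):

```apache
# Before: a chain -- reaching the final page takes two round trips.
Redirect 301 /page-a /page-b
Redirect 301 /page-b /page-c

# After: flatten the chain -- every old URL points straight at the final page.
Redirect 301 /page-a /page-c
Redirect 301 /page-b /page-c
```

Whenever a page moves again, update every existing rule to the new final destination instead of appending another hop to the chain.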
-
Thanks Ryan,
Great Answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of redirecting from A to B and then to C, we write a new redirect from A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The point I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st hop loses 10%, 2nd hop loses 10% × 2 = 20%; total 30% loss
Redirect #3 = 1st hop loses 10%, 2nd loses 20%, 3rd loses 30%; total 60% loss
Redirect #4 = 100% loss
Again, the real numbers are likely not that dramatic, but it helps get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects don't lose much." We know a site's trust factors diminish rapidly, in an amplified manner, a few links away from the source. We know PageRank within a site evaporates almost completely four links deep. Even top-PR sites like DMOZ and the Yahoo Directory have pages that are not indexed because not enough PR is passed through their links to reach deep pages. It is logical to think the same concept applies to redirects; they are just another form of following links.
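The escalating-loss model above can be written out in a few lines. To be clear, the 10%-per-hop figure is the answer's own illustrative number, not anything published by a search engine:

```python
def chain_loss(hops, per_hop=10):
    """Illustrative total link-juice loss (%) for a redirect chain.

    Uses the escalating model described above: hop n loses n * per_hop
    percent, and the losses accumulate, capped at 100%.
    """
    return min(100, sum(per_hop * n for n in range(1, hops + 1)))

for hops in range(1, 5):
    print(f"{hops} redirect(s): {chain_loss(hops)}% of link juice lost")
# 1 redirect(s): 10%, 2: 30%, 3: 60%, 4: 100%
```

The exact percentages are made up, but the shape of the curve is the argument: losses compound, so a fourth hop is far more damaging than the first.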