How do you explain the problem with multiple redirects to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he keeps moving pages around on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are creating a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using multiple redirects?
What can I do to make sure that they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that what follows is my best understanding of how the process works, and it is what I base my actions on. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database. That data is then processed by the search engine. If Page A is indexed, its URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then once the crawler discovers the 301 redirect, the search engine will update the URL shown in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database would be displayed as Page C in SERPs.
Even though the search engine's database has the correct URL to display in SERPs, crawlers are not handed this information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to the normal factors such as nofollow, crawl budget, etc. If you modify the initial redirect to point from Page A to Page C, the crawler will detect the new header and the search engine will update its records accordingly.
The above was written in terms of how the URL appears in SERPs, but it should work the same way for backlinks. Rather than forwarding the backlinks from Page A to Page B (and then on to Page C), those links would be forwarded directly to Page C.
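To make the crawler's view of this concrete, here is a minimal Python sketch that follows a redirect chain hop by hop, the same way any HTTP client (a crawler included) encounters it. The URLs are hypothetical, and this only illustrates the HTTP mechanics, not any search engine's internal logic.

```python
# Minimal sketch of what an HTTP client sees when it follows a redirect chain.
# The example URL is hypothetical; this is not any search engine's crawler code.
import requests

def trace_redirect_chain(url, max_hops=10):
    """Follow 301/302-style responses hop by hop and record each one."""
    hops = []
    for _ in range(max_hops):
        response = requests.head(url, allow_redirects=False, timeout=10)
        hops.append((url, response.status_code))
        location = response.headers.get("Location")
        if response.status_code in (301, 302, 307, 308) and location:
            # Location may be relative, so resolve it against the current URL.
            url = requests.compat.urljoin(url, location)
        else:
            break
    return hops

# A chain like Page A -> Page B -> Page C shows up as three separate requests.
for hop_url, status in trace_redirect_chain("https://example.com/page-a"):
    print(status, hop_url)
```

Flattening the redirect so Page A points straight at Page C turns those three requests into two: the single 301 and the final page.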
So instead of it redirecting from A to B and then to C, we write a new redirect for A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A to Page C, it is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
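As an illustration of what "write a new redirect for A to C" looks like in practice, here is a rough Python sketch that flattens a redirect map so every old URL points directly at the end of its chain. The URLs and the dict format are made up for the example; how you apply the result depends on your server's redirect configuration.

```python
# Rough sketch: flatten a redirect map so every old URL points straight at its
# final destination instead of chaining through intermediate pages.
def flatten_redirects(redirects):
    """redirects: dict of old URL -> new URL (which may itself be redirected).
    Returns a dict where every old URL maps directly to its final target."""
    flattened = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Walk the chain until we reach a URL that is not itself redirected
        # (the 'seen' set guards against accidental redirect loops).
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

chain = {
    "/page-a": "/page-b",
    "/page-b": "/page-c",
}
print(flatten_redirects(chain))
# {'/page-a': '/page-c', '/page-b': '/page-c'} -- one hop each instead of a chain
```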
-
Thanks Ryan,
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B and then to C, we write a new redirect for A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The point I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st link loses 10%, 2nd link loses 10% x 2 = 20%, total 30% loss
Redirect #3 = 1st link loses 10%, 2nd link loses 20%, 3rd link loses 30%, total 60% loss
Redirect #4 = 100% loss.
Again, the numbers are likely not that dramatic, but they help get site owners out of the mindset of "well, a 301 only loses a drop of link juice, so 3 or 4 redirects don't lose much." We know a site's trust factors diminish rapidly, in an amplified manner, a few links away from the source. We know PR on a site evaporates almost completely four links deep. Even top-PR sites like DMOZ and the Yahoo Directory have pages that are not indexed because not enough PR is passed through their links to reach the deeper pages on the site. It is logical to think the same concept applies to redirects; a redirect is just another form of following a link.
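If it helps to put this in front of a client, the toy calculation below reproduces the escalating numbers above and compares them with a simple flat per-hop loss. Both models are assumptions for explanation only; nobody outside the search engines knows the real figures.

```python
# Toy math only: these are the illustrative numbers from the answer above, not
# anything published by a search engine. "Escalating" mirrors the idea that each
# extra hop costs more than the last; "flat" is a constant per-hop loss for comparison.
def escalating_loss(hops, base_loss=0.10):
    """Hop n loses n * base_loss, so a 4-hop chain reaches a 100% loss."""
    return min(sum(base_loss * n for n in range(1, hops + 1)), 1.0)

def flat_loss(hops, per_hop=0.10):
    """Every hop loses the same fixed share of what is left."""
    return 1 - (1 - per_hop) ** hops

for hops in range(1, 5):
    print(f"{hops} redirect(s): escalating model {escalating_loss(hops):.0%}, "
          f"flat model {flat_loss(hops):.0%}")
```

Either way you model it, every extra hop in the chain only loses more, which is the point worth making to the client.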