How do you explain the problem with several redirects to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he is moving pages on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are making a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't be using several redirects?
What can I do to make sure that they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that the below is my best understanding of how the process works, and it is what I base my actions upon. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database. That data is then processed by the search engine. If Page A is indexed, the URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then when the crawler discovers the 301 redirect, the search engine will update the URL in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database would be displayed as Page C in SERPs.
Despite the search engine's database having the correct URL to display in SERPs, crawlers are not provided this information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to normal factors such as nofollow, crawl budget, etc. If you modify the initial redirect to point from Page A to Page C, the crawler will detect the new header change and the search engine will update its records accordingly.
The above information was shared with respect to the appearance of the URL in SERPs, but the process should be identical for backlinks as well. Rather than forwarding the backlinks from Page A to Page B, those links would be forwarded directly to Page C.
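If you want to see a chain the same way a crawler does, you can fetch a URL and inspect each 301 hop along the way. Below is a minimal sketch using Python's requests library; the URL is a hypothetical placeholder, not anything from the question above:

```python
# Minimal sketch: inspect a redirect chain the way a crawler would.
# The example URL is a hypothetical placeholder.
import requests

def show_redirect_chain(url):
    """Follow a URL and print every redirect hop the client observes."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # each intermediate 301/302 response
        print(f"{hop.status_code}: {hop.url} -> {hop.headers['Location']}")
    print(f"Final destination ({response.status_code}): {response.url}")

show_redirect_chain("http://example.com/page-a")
# If Page A -> Page B -> Page C, this prints two 301 hops; after the
# redirect is flattened to A -> C, it prints only one.
```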
So instead of it redirecting from A to B then to C, we write a new redirect for A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A directly to Page C, it is better because it is a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
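As an aside, the cleanup itself can be sketched as collapsing a redirect map so every old URL points straight at its final destination. This is my own illustration with hypothetical URLs, not a description of any particular server's rewrite rules:

```python
# Rough illustration: collapse redirect chains so each source URL
# points directly at its final target (A -> C instead of A -> B -> C).
# All URLs are hypothetical placeholders.

redirects = {
    "/page-a": "/page-b",
    "/page-b": "/page-c",
    "/old-home": "/page-a",
}

def flatten(redirects):
    flattened = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Walk the chain until we hit a URL that is not itself redirected.
        while target in redirects:
            if target in seen:  # guard against redirect loops
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

print(flatten(redirects))
# {'/page-a': '/page-c', '/page-b': '/page-c', '/old-home': '/page-c'}
```

The loop guard matters in practice: once a site has accumulated a few generations of moved pages, it is surprisingly easy to create an accidental redirect loop.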
-
Thanks Ryan,
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B then to C, we write a new redirect for A to C.
Is this better? If so, why?
-
Multiple redirects are a really bad idea and should be corrected whenever possible. The consideration I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers I use in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice.
Redirect #2 = 1st link loses 10%, 2nd link loses 10% x 2 = 20%; total 30% loss.
Redirect #3 = 1st link loses 10%, 2nd link loses 20%, 3rd link loses 30%; total 60% loss.
Redirect #4 = 100% loss.
Again, the numbers are likely not that dramatic, but they help get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects doesn't lose much." We know the trust factors for a site diminish rapidly, in an amplified manner, a few links away from the source. We know PR on a site evaporates almost completely four links deep. Even top PR sites like DMOZ and the Yahoo Directory have pages that are not indexed because not enough PR is passed through their links to the deep pages on their sites. It is logical to think this same concept applies to redirects; a redirect is just another form of following links.
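If it helps to see the arithmetic, here is that same illustrative model as a quick calculation. To be clear, the per-hop percentages are my teaching numbers, not real search engine math:

```python
# Illustrative model only -- NOT how any search engine actually works.
# Assumption from the example above: hop i in a chain loses i * 10%
# of link juice, and losses accumulate across the chain.

def cumulative_loss(hops, per_hop=0.10):
    """Total link juice lost across a chain of `hops` redirects."""
    return min(sum(per_hop * i for i in range(1, hops + 1)), 1.0)

for hops in range(1, 5):
    print(f"{hops} redirect(s): {cumulative_loss(hops):.0%} loss")
# 1 redirect(s): 10% loss
# 2 redirect(s): 30% loss
# 3 redirect(s): 60% loss
# 4 redirect(s): 100% loss
```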