How do you explain the problem with multiple redirects to a client?
-
I have a client who has done a lot of link building and just migrated his site from an old platform to a more SEO-friendly one, but now he keeps moving pages around on the new site.
Old Site --> (301 redirect) --> New Site --> (301 redirect) --> Changed Page --> (301 redirect) --> Changed Page Again, etc.
All his changes are making a lot of extra work for me every month, and I feel he is wasting a lot of link juice.
How would you explain to the client why they shouldn't chain several redirects?
What can I do to make sure they keep as much link juice as possible?
-
I have never worked for Google or any other search engine, so I want to make it clear that the below is my best understanding of how the process works, and it is what I base my actions on. I feel my understanding is valid, but the examples could probably use a bit of work. I am always willing to entertain other ideas.
Crawlers find and explore links. They capture data and record it in a database, and that data is then processed by the search engine. If Page A is indexed, its URL will show in SERPs as Page A. If you later 301 redirect Page A to Page B, then once the crawler discovers the redirect, the search engine will update the URL in SERPs to Page B. With me so far?
Later you decide to 301 redirect Page B to Page C. When the search engine recognizes the redirect (i.e. the crawler discovers it), the URL will once again be updated in SERPs, this time to Page C. Any instances of the Page A or Page B URLs in the search engine's database will be displayed as Page C in SERPs.
Despite the search engine's database having the correct URL to display in SERPs, crawlers are not provided this information. As long as a link exists and a crawler can find it, the crawler will attempt to follow it, subject to the normal factors such as nofollow, crawl budget, etc. If you modify the initial redirect to point from Page A directly to Page C, the crawler will detect the changed header and the search engine will update its records accordingly.
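To make that bookkeeping concrete, here is a minimal Python sketch of how a chain of 301s resolves to one final destination. The URLs and the dict-based redirect map are hypothetical stand-ins for real Location headers, purely for illustration:

```python
def resolve_final_url(url, redirects, max_hops=10):
    """Follow a chain of 301 redirects, modeled as a dict of
    source -> target, and return the final destination URL.
    Caps the hop count and detects loops, much as real crawlers
    stop following overly long redirect chains."""
    seen = set()
    for _ in range(max_hops):
        if url not in redirects:
            return url  # no further redirect: this is the final destination
        if url in seen:
            raise RuntimeError(f"redirect loop at {url}")
        seen.add(url)
        url = redirects[url]
    raise RuntimeError("too many redirect hops")

# Hypothetical chain: Page A -> Page B -> Page C
redirects = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(resolve_final_url("/page-a", redirects))  # prints: /page-c
```

Whether the crawler starts from Page A or Page B, it ends up recording Page C, which is why the SERP entry eventually shows Page C either way.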
The above was shared with respect to how the URL appears in SERPs, but it should work identically for backlinks. Rather than being forwarded from Page A to Page B and then on to Page C, those links would be forwarded directly to Page C.
So instead of it redirecting from A to B and then to C, we write a new redirect from A to C. Is this better? If so, why?
If you modify the existing redirect to go from Page A to Page C, it is better because it is now a single redirect. It is better for your servers (fewer redirects to process), better for users (quicker page loads), better for you (fewer redirects to manage and fewer opportunities for something to go wrong), and therefore better for search engines. You are rewarded for this improvement with a stronger flow of link juice.
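As a sketch of what "rewrite A to point straight at C" means when you have many legacy URLs, here is a hypothetical helper that collapses an entire redirect map so every old URL resolves in a single hop. Again, the URLs are made up for illustration:

```python
def collapse_redirects(redirects):
    """Rewrite a redirect map so every source URL points straight
    at its ultimate destination: A -> B -> C becomes A -> C and
    B -> C. Every legacy URL then resolves in a single hop."""
    def final(url):
        # Walk the chain from this URL until it stops redirecting.
        seen = set()
        while url in redirects:
            if url in seen:
                raise RuntimeError(f"redirect loop at {url}")
            seen.add(url)
            url = redirects[url]
        return url

    return {src: final(src) for src in redirects}

chain = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(collapse_redirects(chain))
# prints: {'/page-a': '/page-c', '/page-b': '/page-c'}
```

Running something like this against your redirect rules each time a page moves keeps the chains from accumulating in the first place.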
-
Thanks Ryan,
Great answer and illustration!
A follow-up question: what happens if you go back and change the old 301 redirects?
So instead of it redirecting from A to B and then to C, we write a new redirect from A to C.
Is this better? If so, why?
-
Chaining multiple redirects is a really bad idea and should be corrected whenever possible. The point I ask clients to understand is how multiple redirects amplify the loss of link juice. The numbers in the example below are simply how I explain it when asked; I don't have any solid math to back them up. As we all know, the exact process is kept secret.
Redirect #1 = lose 10% link juice
Redirect #2 = 1st link loses 10%, 2nd link loses 10%x2=20%, total 30% loss
Redirect #3 = 1st link loses 10%, 2nd link loses 20%, 3rd link loses 30% = 60% loss
Redirect #4 = 1st link loses 10%, 2nd loses 20%, 3rd loses 30%, 4th loses 40% = 100% loss.
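The toy numbers above can be captured in a short Python function. To be clear, the escalating per-hop loss rate is this answer's illustrative assumption, not anything Google has published:

```python
def chain_loss(hops, step=0.10):
    """Toy model of cumulative link-juice loss across a redirect
    chain: each additional hop loses `step` more than the previous
    one (10%, 20%, 30%, ...), capped at a total loss of 100%.
    The rates are illustrative only, not real search-engine math."""
    loss = sum(step * (i + 1) for i in range(hops))
    return min(loss, 1.0)

for hops in range(1, 5):
    print(f"Redirect #{hops}: {chain_loss(hops):.0%} total loss")
# Redirect #1: 10% ... Redirect #4: 100% total loss
```

The point of the model is the shape of the curve, not the exact figures: each extra hop costs more than the one before it, so chains get expensive fast.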
Again, the numbers are likely not that dramatic, but they help get site owners out of the mindset of "well, a 301 loses just a drop of link juice, so 3 or 4 redirects doesn't lose much." We know the trust factors for a site diminish rapidly, in an amplified manner, a few links away from the source. We know PR on a site evaporates almost completely four links deep. Even top-PR sites like DMOZ and the Yahoo Directory have pages that are not indexed because not enough PR is passed through their links to the deep pages of their sites. It is logical to think the same concept applies to redirects; they are another form of following links.