Do we need to worry about external redirects?
-
Hi all,
We always avoid internal redirects, but what happens if many of our outgoing links are redirecting to new URLs? I presume there is nothing wrong with hosting such links. Any ideas?
Thanks
-
Hi,
For ease of use, I always advise updating any link once it is clear that it has been redirected, but unless the subject has changed, it really shouldn't make any difference.
If you have a link from your site to an article about breeding caterpillars, and the target site then changes this to something about the feeding habits of polar bears, that is a link I would change.
Bottom line: if the link still makes sense, you will be fine.
-Andy
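One practical way to audit this is to request each outgoing URL and see where it finally lands. The sketch below is self-contained: it spins up a throwaway local server that issues a 301 so the behaviour can be demonstrated without touching a real site. `check_final_url()` is the reusable part you would point at your own outbound links.

```python
# Quick check of where an outgoing link actually lands after redirects.
# The throwaway local server below exists only so this example is
# self-contained; in practice, pass your real outbound URLs instead.
import http.server
import threading
import urllib.request


class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Stand-in for an external site where /old-article has moved via 301."""

    def do_GET(self):
        if self.path == "/old-article":
            self.send_response(301)
            self.send_header("Location", "/new-article")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the example's output quiet
        pass


def check_final_url(url):
    """Follow any redirects and return the URL the link finally resolves to."""
    with urllib.request.urlopen(url) as resp:
        return resp.geturl()


server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

base = "http://127.0.0.1:%d" % server.server_port
final = check_final_url(base + "/old-article")
print("link now resolves to:", final)  # ends with /new-article

server.shutdown()
```

If the final URL differs from the one you link to, that is a link worth updating, especially when the destination topic has changed.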
Related Questions
-
Need only tens of pages to be indexed out of hundreds: Robots.txt is Okay for Google to proceed with?
Hi all, We have 2 subdomains with hundreds of pages, of which only 50 important pages need to get indexed. Unfortunately, the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at page level. So we are planning to block both sites entirely in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting to rely mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
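A minimal sketch of the robots.txt approach described in this question (the file paths are hypothetical). In Google's robots.txt handling the most specific matching rule wins, so `Allow` rules can carve the 50 important pages out of a blanket `Disallow`:

```
User-agent: *
Disallow: /
Allow: /important-page-1.html
Allow: /important-page-2.html
```

One caveat: URLs blocked by robots.txt can still appear in the index (without a snippet) if other sites link to them, which is why Google recommends `noindex` where the CMS supports it.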
Algorithm Updates | vtmoz
-
Do I need to track my rankings on the keywords "dog" and "dogs" separately? Or does Google group them together?
I'm creating an SEO content plan for my website; for simplicity's sake, let's say it is about dogs. Keeping SEO in mind, I want to phrase my content strategically and monitor my SERP rankings for each of my strategic keywords. I'm only given 150 keywords to track in Moz, so do I need to treat singular and plural keywords separately? When I tried to find estimated monthly searches in Google's Keyword Planner, it groups "dog" and "dogs" together under "dogs", and similarly "dog company" and "dog companies" under "dog companies". But when I use Moz to track my rankings for these keywords, they are separate, and my rankings vary between the plural and singular versions. Do I need to track and treat these keywords separately, or are they grouped together for SEO purposes?
Algorithm Updates | Fairstone
-
A web audit for web traffic? Need answers please..
Hi, We are a PR agency based in Dubai and we produce a lot of web content. The website is built on Ruby on Rails, and we have implemented keywords and SEO strategies, but sadly the traffic pattern has not changed over the past three years. What surprised us today is that we created a page 2-3 days ago for a client who is participating in Arab Health (a very prestigious healthcare event), and suddenly our page is in the top 3 on google.ae as well as google.com. We are somewhat convinced that there is something wrong with our code. Do you think this could be a possibility, and that the lack of change in the traffic pattern might be a code issue rather than an SEO issue? What could be the possible reasons for this pattern? In such a scenario, what would experts like you recommend we do: an SEO audit, a web audit, a code audit, or hiring an SEO/web/code consultant? Thanks - helpful answers are really appreciated, and just btw, if anyone feels they could professionally help us out of this mess, we are willing to work with them. Thanks in advance
Algorithm Updates | LaythDajani
-
301 redirects
Hi, we have an old site hosted by Company A. We rank in Google for certain brand and product terms. Now we have developed a new website on a new domain, hosted by Company B. If we are 301'ing at brand/product/page level from old to new, who should perform this job - Company A or B, old or new? And does the physical website need to remain hosted for the 301s to work and for the old site's SEO rankings to not fall apart?

Company A think we can do an Excel mapping doc for each link from the old site to the new, hand the file to Company A, and they host this file (not the actual website); then we transfer the old domain to Company A as well, and the 301s will work fine. Yet Company B think we should continue hosting with Company A, keep the old physical site live, and put the 301s in place. They say that if the 301 link has content behind it then it will help, or at least not take the chance of having the SEO affected.

Who is right? Do you need the old website to remain live once the 301s are in place, or can this 301 config file hosted on the domain be all we need? Any other ideas welcomed. Thanks
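For reference, a page-level 301 mapping on an Apache server is usually just a list of rules served from the old domain. The domain and paths below are placeholders, not from the question:

```apache
# Served from the OLD domain's web server. The old hosting must keep
# answering HTTP requests so these redirects can be issued, but the old
# site's pages themselves do not need to remain published.
Redirect 301 /brand-x/product-1 https://www.new-domain.com/brand-x/product-1
Redirect 301 /brand-x/product-2 https://www.new-domain.com/brand-x/product-2
Redirect 301 /about-us          https://www.new-domain.com/about
```

In other words, something must stay live at the old domain to serve the redirects, but a redirect-only configuration is enough; the full old website does not have to remain hosted.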
Algorithm Updates | YNWA
-
Does it impact a website's ranking if the same content is used by other external sources?
Hi Moz & members, I just want to check something about the website www.1st-care.org: does it impact this website's ranking if the same content (from the About Us or home care services pages) is being used by other external sources or local citation sites? Does that duplicated content create a ranking-drop issue for this website and weaken its content? I was in 9th position on Google.com before, and now it has slipped to 29th position. Why? Is there a content issue, or anything else I am not aware of?
See the content used:
Home page content
About us page content
Regards, Teginder Ravi
Algorithm Updates | Futura
-
301 Redirects?
Hello fellow Mozzers, I have just read a post about 301 redirects on the blog. A great read, and it has provided me with a bit more insight and highlights what could be a potential issue for a managed site I look after.

On this website, I have inherited a .htaccess file with literally hundreds of 301s for paths that don't exist as files, e.g. redirect 301 /dealerbrandname http://www.domain.com/ So we have lots of dealers, and they place a link on their site to http://www.domain.com/dealerbrandname We then redirect it to the homepage or a relevant topic page, along with some tracking variables.

Is this likely causing significant issues? Based on the post I read, I imagine it might be, but any more thoughts on this would be hugely helpful. Cheers, Tim
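If the dealer paths follow a recognisable pattern, hundreds of individual lines like the one above can often be collapsed into a single pattern-based rule. A sketch only: the `/dealer-` prefix and the tracking parameter names are assumptions, not taken from the real .htaccess:

```apache
# One rule replacing many: any /dealer-<name> path 301s to the homepage,
# carrying the dealer name along as a tracking parameter.
RedirectMatch 301 ^/dealer-([A-Za-z0-9-]+)/?$ https://www.domain.com/?utm_source=dealer&utm_campaign=$1
```

The behaviour is the same either way; the pattern form is mainly easier to maintain than hundreds of explicit `Redirect` lines.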
Algorithm Updates | TimHolmes
-
Content Caching & Removal of a 301 Redirect for Relieving a Links Penalty
Hi, A client site has a very poor link legacy stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, however after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected.

A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this, and not many people appear to agree on whether this will work.

Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, removing all legacy and all association with the spam-ridden .com. However, my main concern is whether Google will forever cache the content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link-related and NOT content-related, as I imagine people may first suspect.

This could then cause duplicate content, knowing that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex/nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with exactly the same content.

So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience in the removal of a 301 redirect, detaching legacy and its success would be very helpful!
Thank you, Denver
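One detail in the deindexation plan above is worth flagging: a robots.txt block and a noindex tag work against each other, because Google can only see a noindex directive on pages it is allowed to crawl. A minimal sketch of the tag itself, placed in the head of every page on the old .com:

```html
<!-- Ask search engines to drop this page from the index. For this to be
     read, the page must NOT be blocked in robots.txt. -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt `Disallow` can be added afterwards, once the old pages have actually dropped out of the index.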
Algorithm Updates | ProdoDigital
-
Need some Real Insight into our SEO Issue and Content Generation
We have our site www.practo.com, and we have our blog at blog.practo.com. We plan to have our main site at www.ray.practo.com in a month's time from now.

The issues: I will then need to direct all my existing traffic from www.practo.com to www.ray.practo.com. Keeping SEO in mind, and since I will be generating new content via our WordPress instance, what are the best ways to do this so that Google does not have difficulty finding our content?

1. Would it be good if I put the WordPress instance at ray.practo.com/blog (the WordPress instance comes in here in the directory)/article-url?
2. Would it be better with www.practo.com/ray/blog/article-url?

I am using WordPress to roll out all our new SEO-based content on the various keywords and topics we want traffic for - the primary reason being that we needed a content-generation CMS platform so that we don't have to deal with HTML pages and publish every content page via a developer.

Is what I am planning correct, keeping SEO in mind? Any suggestions are welcome. I seriously need to know: is writing SEO-based content on a WordPress instance, with the posts at those URLs, a good idea, or is only HTML a good idea? We need some CMS so that content writers can write content independently. Please guide accordingly. Thanks
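For the traffic-redirect part of the plan above, a blanket host-level 301 is the usual approach on Apache. The rule below is a sketch using the hostnames from the question; it assumes each path on the old site has a matching path on the new one, which should be verified for every important URL:

```apache
# Send every request for www.practo.com to the same path on the new host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.practo\.com$ [NC]
RewriteRule ^(.*)$ https://www.ray.practo.com/$1 [R=301,L]
```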
Algorithm Updates | shanky1