Is it a good idea to 301 redirect one same-niche site to another site for SEO benefit?
-
Hello friends, I have 2 Android niche sites. One runs on a dropped technology domain I caught a year ago; it has almost 400 domains linking to different parts of the site. The other one I established from scratch, and both have been running since January 2015. Now I want to redirect the first site, which already has 400 links pointing to it, to the home page of my second Android site.
Is it a good idea to do so, and does it give any boost in terms of SEO?
-
I agree with Gaston, but usually the SEO benefit you will see is rather small. This was a tactic used a couple of years ago: buy a bunch of domains that already have a ton of links pointing to them and then just redirect them to your main root domain.
For search engines, one of the easiest signals to pick up on is a site that suddenly starts redirecting certain URLs to another page.
-
Hi Pervaiz,
The 301 redirect that you're planning might improve your SEO.
On one hand, you must be certain that those 400 links pointing to the old domain have some quality and are good for your main site. I recommend checking them with a platform like Open Site Explorer (OSE) or Ahrefs.
On the other hand, in my experience, redirects like that can be a pain. Remember that the moment you set up the 301 redirect, all of those links will point to your URL, which creates a ton of new links to the final URL at once. That may be something that harms your link-building history.
Hope it helps.
GR.
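To make the mechanics concrete, here is a minimal sketch of the redirect being discussed, assuming the old site runs on Apache with mod_rewrite enabled; the domain name is a hypothetical placeholder:

```apache
# Hypothetical .htaccess on the old (dropped) domain.
# Sends every request to the new site's home page with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^(.*)$ https://www.new-android-site.example/ [R=301,L]
```

Note that redirecting every URL to the home page, rather than to matching pages, is exactly the pattern the answers above describe as easy for search engines to spot, so treat this as a sketch of the mechanics, not an endorsement of the tactic.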
Related Questions
-
Is "Author Rank," User Comments Driving Losses for YMYL Sites?
Hi, folks! So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.
We were big winners with the August Medic update in 2018 and the subsequent update in September/October. However, the Medic update in March and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at the attached Moz chart, which shows the jumps and falls around the various Medic updates; the pattern is very similar on many of our sites. As per JT Williamson's stellar article on EAT, I feel like we've done a good job of meeting those criteria, which has left us wondering what isn't jiving with the new core updates. I have two theories I wanted to run past you all:
1. Are user comments on YMYL sites problematic for Google now? I was thinking that user comments underneath health news and perspectives articles might be concerning on YMYL sites now. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, how qualified are the people commenting? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?
2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress'ish). Even worse -- our resource pages don't even list the author.
Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks!
Algorithm Updates | | Michael_Nace1 -
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read the Moz blog posts and other helpful articles that are full of information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicates that we have 1000s of duplicate content pages. I understand that it's a common problem among e-commerce sites, and the way we create landing pages and apply dynamic search results pages conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them? I also tried to use self-referencing canonical tags, and canonical tags that point to the higher-authority page on search results pages, and even though it worked for some pages on the site, it didn't work for a lot of the other search result pages. Is there something we can do to these search result pages that will let Google understand that they are original pages? There are a lot of factors that I can't change, and I'm concerned that the three sites won't rank as well and will drive traffic that won't convert. I understand that Google won't penalize your sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we won't ever run out of duplicate content -- is it worth moving on to other SEO priorities like keyword research and on/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
Algorithm Updates | | drewstorys0 -
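One way to audit the canonical tags this question describes is to parse each page and check what its `<link rel="canonical">` actually points to. Below is a minimal sketch using only the Python standard library; the sample page and URL are hypothetical:

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel", "").lower() == "canonical" and "href" in attr_map:
                self.canonicals.append(attr_map["href"])


def find_canonical(html):
    """Return a list of canonical URLs declared in the given HTML."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals


page = '<html><head><link rel="canonical" href="https://example.com/wine/red"></head></html>'
print(find_canonical(page))  # ['https://example.com/wine/red']
```

A page with zero canonicals, more than one, or a canonical pointing somewhere unintended is exactly the kind of inconsistency that would explain why the tags "worked for some pages but not others."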
Canonical redirect?
Can a canonical URL redirect? I'm doing country-specific URLs, with the www. subdomain redirecting to the country version (i.e., if you go to www.domain.com from France, you'll be redirected to fr.domain.com). If the canonical is the www. URL, then all the spiders will go to the correct place, but I don't know if search engines recommend against a canonical that redirects.
Algorithm Updates | | mattdinbrooklyn0 -
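Whether a canonical target itself redirects can be checked mechanically. Here is a minimal, illustrative sketch that resolves a redirect chain from a plain dict (standing in for real HTTP responses); the domains mirror the hypothetical www/fr setup in the question:

```python
def resolve_redirects(url, redirect_map, max_hops=10):
    """Follow redirects recorded in redirect_map; return (final_url, hop_count)."""
    hops = 0
    while url in redirect_map and hops < max_hops:
        url = redirect_map[url]
        hops += 1
    return url, hops


# Hypothetical geo-redirect: the www host resolves to the French subdomain.
redirects = {"https://www.domain.com/": "https://fr.domain.com/"}
final_url, hops = resolve_redirects("https://www.domain.com/", redirects)
print(final_url, hops)  # https://fr.domain.com/ 1
```

If the canonical on fr.domain.com points at the www URL, a hop count above zero tells you crawlers must follow an extra redirect before reaching the content -- the situation the question is asking about.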
What is the impact of HTTP/2 on SEO?
I think it's good for the user experience and speeds up websites, especially if your site has a lot of requests. But I'm not sure if there are other side effects, or if there's an impact on SEO or technical configuration. Most of my websites are built with WordPress, some with Joomla.
Algorithm Updates | | Croco_Web_Solutions1 -
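For reference, enabling HTTP/2 is usually a one-line web server change rather than anything WordPress- or Joomla-specific. A hypothetical nginx example (the domain and certificate paths are placeholders; browsers only negotiate HTTP/2 over TLS, and recent nginx versions also accept a separate `http2 on;` directive):

```nginx
server {
    # "http2" on the listen line enables HTTP/2 for TLS connections.
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    root /var/www/wordpress;
}
```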
Why does my site disappear from the top 50?
Hello, I am having some problems with my site www.kondomanija.si. It was ranked on the first page for my main keyword, kondomi (in www.google.si, Slovenia), but now it is not in the top 10 pages. This has happened before: it drops out of the top 10 pages, and in a couple of months it is back for a short time (till it drops out again). I think the site has a weak link profile... Could this be the reason? Does anybody know what is going on?
Algorithm Updates | | Spletnafuzija0 -
If the homepage is sandboxed for a keyword is the whole site sandboxed for that keyword?
If the homepage of a website has been sandboxed for certain keywords, does this mean that the whole site is sandboxed for those keywords, or just the homepage? If a new sub-page was created with quality, unique content, would it be possible to get that sub-page ranked for the same keywords that have been sandboxed on the homepage? I have asked many other SEO professionals this same question and nobody really knows for sure. Do you?
Algorithm Updates | | Mark A Preston0 -
How to write a good, resourceful, SEO-enabled article
We have our SaaS-based website -- most of our online customers are those who keep coming back to us, and my GA is full of their footprints. I want to concentrate on getting hold of those who might really need our software and, as of now, are not able to find us. Including keywords through which people might want to find us is one of the ways. Next, how do I get the majority of users to find an article or post and get better traction on it? Would posting links to Facebook, Twitter, etc., getting people to find those articles there, and having them link back and come to our main website to read them -- would this help? We sell cloud-based software but have various domains where our customers can make use of it; there are at least 5-10 of them. We don't have content at all on our website. In a few simple steps, how can I get started with this?
- Content generation
- Linking back to the content
- Generating good footfall from users to that content
- Notching up on Google for those content pages
A detailed insight would prove much helpful. Thanks
Algorithm Updates | | shanky11 -
Benefits of Breadcrumbs Statistics
Hi all, Whilst I know why breadcrumbs are recommended for a site in terms of usability and helping define the structure of a site to search engines, are there any statistics that demonstrate the benefits of implementing them? What I'm after are statistics showing an increase in traffic or rankings from implementing breadcrumbs, and, in terms of usability, an increase in conversions (sales, emails, subscriptions, etc.) or metrics such as average pages viewed or time spent on site. Any assistance would be much appreciated. Thanks, James
Algorithm Updates | | bigpond0