302 redirects - redirecting numerous domains into one primary domain
-
302 Redirects - We are a digital agency carrying out SEO analysis for a potential client. The client has bought over 150 different domains and redirected (302) them into his main domain. The domains were bought based on relevant industry keywords and for brand protection.
At first glance this seems like a black hat technique that Google would most definitely punish: buying up domains and redirecting them to the main website.
Does anyone have any thoughts on this?
Thanks...
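Before judging the setup, it can help to audit what each of those domains actually returns on the wire. Here is a minimal Python sketch (the domain names in the usage comment are hypothetical) that reports the raw 301/302 status of a URL without following the redirect:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the raw 301/302 status stays visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

_opener = urllib.request.build_opener(NoRedirect)

def redirect_status(url, timeout=10):
    """Return (status_code, Location header or None) for a single URL."""
    try:
        resp = _opener.open(url, timeout=timeout)
        return resp.status, None  # no redirect at all
    except urllib.error.HTTPError as e:
        # Because we refused to follow it, the 3xx surfaces here as an HTTPError.
        return e.code, e.headers.get("Location")

# Hypothetical usage over the client's portfolio:
# for domain in ["keyword-domain1.example", "keyword-domain2.example"]:
#     print(domain, redirect_status(f"http://{domain}/"))
```

Running this over the full list quickly confirms whether all 150 domains really use 302s, or whether some quietly 301, chain through intermediaries, or don't redirect at all.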
-
Hi Sean,
If these are just domains they bought that never had any content, there's nothing to worry about here. Lots of brands buy their .net/.co/etc. versions, spelling variations, and any branded or product terms to prevent squatters from moving in. Redirects of any sort are fine in this case.
If they're buying companies along with their domains, also don't worry about it. In fact, I'd say use a 301 redirect in those cases. Google does a decent job of understanding formal purchases and looking for official proof, and they're not going to penalize someone who redirects the site of a company they've purchased to their own.
Finally, if they're buying domains based on the link profile, i.e. just for the sake of the links, then you need to start worrying. It's less problematic with 302 redirects, but I would recommend against this type of domain buying altogether. Some people use expired domains as a method of indirect link buying.
Here's a still-relevant piece from Danny:
http://searchengineland.com/do-links-from-expired-domains-count-with-google-17811
-
Thanks for your feedback, guys! It's greatly appreciated.
As I mentioned, at first glance this strategy screamed black hat at us. We decided to carry out some research into 302 redirects to see if anything cropped up. The general consensus was that 302 redirects were not harmful for SEO and did not pass any link juice. That planted a tiny seed of doubt in our minds about categorizing this outright as black hat, so I came to the SEOmoz community for some concrete answers, and you guys confirmed our initial thoughts.
At the moment, based on your feedback, I think we will recommend culling the irrelevant domains and redirecting a couple of the relevant ones to an affiliate site, then redirecting that site to the main site (just as you suggested, Sheldon). I completely agree with you that relevancy is key.
Any more thoughts on the issue are more than welcome..
Thanks again
-
I'd say you covered all the bases, Brad. I don't know what you could have done to protect yourself any more.
Sean, I think your client is playing a dangerous game. My advice would be to first cull any of those additional sites that aren't highly relevant to their own, and write them off as a bad investment. One thought that occurs to me: rather than setting a 302 from all the remaining sites to the client's site, maybe you could redirect them to a selected site from their recent acquisitions, then redirect THAT site to the client's site. Using 302s is still safer, IMO, while you go through the process of requesting changes on link destinations. Relevance is obviously the key; stretching that is treading on thin ice. My approach to that is, if it needs any statement of justification, it's not justifiable.
-
This is pretty dangerous business. Not sure what they spent on all those domains but I could quickly see Google stripping out all the value if this isn't handled properly.
I recently acquired a small competitor. There were good business reasons for the acquisition but we still wanted to tread carefully with the new domain. Here is what we are doing.
The domain was actually a website that we did not want to maintain, so we 301-redirected all the URLs up to the homepage and then placed an announcement of our acquisition on that page. The announcement is an image that clicks through to our website; we intentionally did not include any anchor text. Next we issued a press release about the acquisition. The press release is a good line in the sand in case Google did anything crazy to us: we would be able to point back to the date and show this was a business move. Next we started reaching out to all the backlinks with a friendly request to move their links from the previous name to ours. In our eyes, any site that moves its link to us is a long-term win, because the risk of the value being stripped out goes way down. We also sent an email to the customer base informing them of the acquisition, with a discount code for trying out our services. Finally, once the outreach to change links is done and the smoke clears (3, 6, or 12 months down the road), we will place the 301 redirect on the domain to our site.
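The first step above (301-redirecting every deep URL up to the homepage while the homepage itself keeps serving the announcement) can be sketched with a tiny handler. This is just an illustration of the status codes involved; the announcement markup and target hostname are hypothetical, and in production this would normally live in the web server config rather than application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical announcement: an image link with no anchor text, per the plan above.
ANNOUNCEMENT = (b"<h1>We've joined NewCo</h1>"
                b"<a href='https://newco.example/'><img src='/announce.png'></a>")

class HomepageCollapse(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            # The homepage stays a 200 and shows the acquisition announcement.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(ANNOUNCEMENT)
        else:
            # Every other URL is permanently (301) redirected up to the homepage.
            self.send_response(301)
            self.send_header("Location", "/")
            self.end_headers()

    def log_message(self, *args):  # silence request logging for the demo
        pass
```

The equivalent server-config rule (an Apache RewriteRule or an nginx `return 301 /;` on non-root paths) does the same thing; the sketch just makes the two status codes explicit.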
This is the only way I would suggest buying domains and redirecting. Buying domains for search purposes is blackhat, period. Buying competitors or other sites that help your business but also could help you in search is not. We have decided to take a safer approach to maximize value and mitigate risk.
-
I'd agree that it's a bad idea, particularly at that scale. If relevance of the redirected domains is high, and it's just a couple of domains, I imagine it wouldn't be a problem. The fact that they're using a 302 will afford them some protection, but how long are they planning to leave that "temporary" redirect in place?
-
I'm in favor of buying domains that are close to your brand and redirecting them, but buying 100+ domains to redirect isn't smart. Unless they have a strategy built around populating them with unique content, it's not a good move.