How to make a second site in the same niche and do white hat SEO
-
Hello,
As much as we would like it to, there's a possibility that our site will never recover from its Google penalties.
Our team has decided to launch a new site in the same niche.
What do we need to do so that Google will not mind us having 2 sites in the same niche? (Menu differences, coding differences, content differences, etc.)
We won't have duplicate content, but it's hard to make the sites not similar.
Thanks
-
I'm sorry to hear that. I would recommend asking the people linking to your existing site with high-quality, powerful links to update their backlinks to point to your new site.
The advantage of dealing with people who run legitimate sites is that they are much easier to find and will actually help you with these types of requests. It's not the nightmare of trying to get hold of a black hat webmaster.
Beyond that, focus on creating a 100% legitimate website in a slightly different niche, with content marketing, inbound marketing, or whatever buzzword you want to use, for the hopefully short time it takes you to get your most powerful white hat links pointing to your new website.
Removeem.com is a wonderful tool for finding the names and contact info of webmasters. You can use it to make a polite request explaining that you have a new domain and would appreciate it if they would update the link pointing at your site.
After you have moved the best backlinks away from your existing site, I would make the switch to your new site.
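If you export your link profile from a backlink tool, a short script can help you decide which webmasters to contact first. This is only a sketch: the CSV column names (`source_url`, `domain_authority`) are assumptions and will differ depending on which tool you export from.

```python
import csv

def top_backlinks(csv_path, limit=20):
    """Return the strongest backlinks from an exported link-profile CSV,
    sorted by domain authority (highest first), so outreach can start
    with the links that matter most."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Column names are assumptions; adjust them to match your export.
    rows.sort(key=lambda r: float(r.get("domain_authority", 0)), reverse=True)
    return [(r["source_url"], r["domain_authority"]) for r in rows[:limit]]
```

Run it over the export and work down the list from the top; contacting the twenty strongest domains first usually recovers most of the value.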
I would also be upfront about moving: place text saying you are changing domain names in a conspicuous location on your site.
If you feel that your livelihood is being jeopardized by this, I can definitely understand. I would then really put 110% into creating top-notch content and a user-friendly, mobile-ready design for your new brand. When you go live, you want to have something better than what you had before.
I'm sorry I don't know any methods that would be instant, but I would consider using pay-per-click to soften the blow.
I hope this is of help,
Thomas
-
Tom,
I appreciate the responses and they make sense, but I don't see a solution. I don't see our current site ever pulling out of the penalty no matter what I do, and we've got an income off of it.
Any ideas?
-
This is older, but:
http://googlewebmastercentral.blogspot.com/2010/11/best-practices-for-running-multiple.html
https://www.webmasterworld.com/google/4557285.htm
And here is a discussion of tactics that are now considered black hat:
http://www.nichepursuits.com/should-you-host-all-your-niche-sites-on-the-same-hosting-account/
It is not OK in Google AdWords either.
Sorry for all the posts,
Tom
-
With all that said, I think if you go after a slightly different niche or offer things from a different angle, you're obviously doing twice the work.
Are you concerned that if you 301 redirect, you will bring the penalty over?
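For what it's worth, if you do end up putting redirects in place, it's worth verifying what the server actually returns, since a temporary 302 won't pass signals the way a permanent 301 does. Here is a minimal standard-library sketch; the URL you check is whatever your old domain serves.

```python
import http.client
from urllib.parse import urlparse

def check_redirect(url):
    """Request a URL without following redirects and return the HTTP
    status code and Location header, so you can confirm the server
    sends a 301 (permanent) rather than a 302 (temporary)."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location
```

Calling `check_redirect("http://old-domain.example/")` on each important old URL lets you confirm both the status code and that the Location header points at the right page on the new domain, not just the homepage.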
sincerely,
Tom
-
To be clear, I'm talking about building the new site using the white hat tactics you implemented after the penalty, the one the original site has yet to recover from. I know that creating sites that are essentially the same but contain unique content just to get better rankings is against the rules.
If you build the second site (I should say domain, because that's what it comes down to, right?) using the white hat methods currently employed on the existing site, it would be in your best interest to remove the first site when the second website goes live.
I know this is not the ideal situation, because you probably have some good backlinks pointing at the original, but having two sites owned by the same person/company competing for the same place in the SERPs in the same niche would, I believe, be considered a method of rigging the system.
Having one site is completely fine, and having a second one that goes after a different niche is also completely fine.
I am basing this on an e-commerce client of mine whose competitor was selling the exact same product with unique content across three domains.
The client reported this to Google, and either the spam team acted or there was an incredible coincidence, because two months later the reported sites could not be found in Google's index.
I hope that is of help,
Tom