Creating multiple domains with key phrases and linking back and forth to them
-
Several of my competitors have built multiple sites with keywords in their domain names, such as localaustinplumber.com, houstonplumbers.com, dallasplumbers.com, localdallasplumbingservices.com... you get the picture. (These are made-up examples to illustrate what they are doing.) They put unique content on each page, register each domain under a different WHOIS alias with a different credit card to hide from Google that they are the same entity, and then link back and forth between the domains with appropriate keywords in the anchor text. They are outranking me on a lot of key search phrases because they have the keywords in the domain name. They have no outside links other than the links from the domains that they own. Is this a good idea? Is it black hat? Are they going to get slapped if someone reports them as a link farm? It's frustrating to stay white hat and earn legitimate links, only to have these competitors come in and outrank me after only a few months with this scheme. Is this a common practice for ranking highly on certain key phrases?
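For what it's worth, networks like this often leave technical footprints regardless of WHOIS aliasing, e.g. many keyword domains resolving to the same server. A minimal sketch of checking for that; the domains and IPs below are made up, and the resolver is stubbed so the example runs without DNS access (in real use you might pass `socket.gethostbyname`):

```python
from collections import defaultdict

def group_by_ip(domains, resolve):
    """Group domains by the IP address they resolve to.

    `resolve` is injected (e.g. socket.gethostbyname in real use)
    so the grouping logic can be tested without network access.
    """
    groups = defaultdict(list)
    for domain in domains:
        groups[resolve(domain)].append(domain)
    # Networks often betray themselves by sharing an IP (or IP C-block).
    return {ip: names for ip, names in groups.items() if len(names) > 1}

# Stub resolver standing in for real DNS lookups (all values hypothetical).
fake_dns = {
    "localaustinplumber.com": "203.0.113.10",
    "houstonplumbers.com": "203.0.113.10",
    "dallasplumbers.com": "203.0.113.10",
    "unrelatedsite.com": "198.51.100.7",
}
shared = group_by_ip(fake_dns, fake_dns.get)
print(shared)
```

Domains that cluster on one IP aren't proof of common ownership, but it's the kind of signal a reviewer (or an algorithm) can use.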
Thanks in advance for your opinions!
Ron10
-
Thank you!
-
My Linkscape knowledge comes mostly from a great Help desk response I received from Aaron regarding a question I submitted.
I do not otherwise know the answer to your question.
Aaron Wheeler said:
Both Open Site Explorer and the Web App Link Analysis are based on our Linkscape index of the web. I'm sorry that you haven't been able to see your links in Linkscape. Most new sites and links will be indexed by our spiders and available in Linkscape and Open Site Explorer within 60 days, but some take even longer for a plethora of reasons, including the crawlability of sites, the amount of inbound links to them, and the depth of pages in subdirectories. Just so you know, here's how we build our index: we take the last index, take the 10 billion URLs with the highest mozRank (with a fixed limit on some of the larger domains), and start crawling from the top down until we've crawled 40,000,000,000 pages (which is about 1/4 of the amount in Google's index). Therefore, if the site is not linked to by one of these seed URLs (or one of the URLs linked to by them in the next update), then it won't show up in our index.
We update our Linkscape Index every 3 to 5 weeks. Crawling the whole internet to look for links takes 2-3 weeks. And then we've got 1-2 weeks of processing to do on those links to determine which are the most important links etc. You can see a schedule of how often we update, and planned updates here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
Linkscape focuses on a breadth-first approach, and thus we nearly always have content from the homepage of websites, externally linked-to pages and pages higher up in a site's information hierarchy. However, deep pages that are buried beneath many layers of navigation are sometimes missed and it may be several index updates before we catch all of these.
If our crawlers or data sources are blocked from reaching those URLs, they may not be included in our index (though links that point to those pages will still be available). Finally, the URLs seen by Linkscape must be linked to by other documents on the web or our index will not include them.
For now, the best thing you can do to help your domain become indexed is to work on link building for links from sites with high mozrank.
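The top-down crawl Aaron describes (rank the known URLs, then crawl the highest-mozRank pages first until the budget is spent) is essentially a priority-queue traversal. A minimal sketch over a toy link graph; every domain name and score below is made up:

```python
import heapq

def priority_crawl(seeds, links, score, budget):
    """Crawl the highest-scored known URL first until the page budget runs out."""
    # Max-heap via negated scores.
    frontier = [(-score[u], u) for u in seeds]
    heapq.heapify(frontier)
    crawled = []
    seen = set(seeds)
    while frontier and len(crawled) < budget:
        _, url = heapq.heappop(frontier)
        crawled.append(url)
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (-score.get(nxt, 0.0), nxt))
    return crawled

# Toy index: only a.com is seeded here; orphan.com has a high score but
# is never linked to, so it is never reached -- exactly the failure mode
# described in the quote above.
links = {"a.com": ["b.com"], "b.com": ["c.com"]}
score = {"a.com": 9.0, "b.com": 5.0, "c.com": 1.0, "orphan.com": 8.0}
crawl_order = priority_crawl(["a.com"], links, score, budget=3)
print(crawl_order)
```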
-
Ryan, do you know if redirected links are visible in Linkscape? I am guessing that they are not.
-
They can effectively rank better than other sites which do not use proper SEO practices.
This is a very valid point... and at the same time it is good news if you know more effective methods.
-
Linkscape (SEOmoz's tool for crawling links) only includes roughly the top 30% of web pages and their links. You are not likely to see links to local-area plumber sites.
-
Ron,
The tactics your competitors use are effective. They can effectively rank better than other sites which do not use proper SEO practices. If your site offers fantastic content combined with solid SEO, then you can blow your competitors' sites off the front page of Google.
Your choices are:
- hire an SEO
- start with the Beginner's Guide to SEO and incorporate everything you learn into your site
- join your competitors in trying to work around the system until one of them steps up and leaves everyone in the dust
-
They have no other outside links other than the links from the domains that they own. Are you sure about this?
Yes, or at least there are no other links that the SEOmoz mozbar link analysis shows, but it doesn't show all links. Many times it will show no links to a page, and then you look up links through the Yahoo links report and find 20 or 30 links that the mozbar didn't show. Does the mozbar not crawl as deeply as the Yahoo spider does?
-
It's frustrating for me staying white hat and getting legitimate links and then these competitors come in and out rank me after only a few months with this scheme.
You should be praising God that they are wasting their time with all of this domain-buying, credit-card-charging, alias-creating, linking-back-and-forth bullcrap. When they start doing something really effective, you are in big trouble.
They have no other outside links other than the links from the domains that they own.
Are you sure about this? If they have no links from outside of their own network of sites, then links from these domains should be close to zero value. And if they do have links from outside of their own network, those links would be more effectively directed at a smaller number of sites - perhaps only their main site.
Is this a common practice to rank highly for certain key phrases?
It's a common practice... but there are better methods of ranking highly.
Is this a good idea?
I think it's a good idea to sell to your competitor.
(I know that a lot of people are going to disagree with me on this... that's OK.)
Related Questions
-
Multiple redirects for GA tracking
We recently replaced a high traffic online service with a new one that now resides at a new URL. We redirect the old site (https://subdomain.mysite.org) to a static page announcing the change (http://www.mysite.org/announcement.html) that links out to the new online service. The SSL cert on the old site is valid for two more months and then would cost $1K to renew. We'd like to measure traffic from the old link over the next two months to see if it's worth renewing the SSL cert to keep a redirect going. If I go into GA, filter the "announcement.html" page and set the secondary dimension to "referral path" I'm not seeing any traffic from https://subdomain.mysite.org. Guessing this is part of the "(not set)" group. First thought was to have that go to a unique intermediary page to log the referral, which then redirects out to the announcement page. Is this considered spammy or is there another way to track referrals from the https site that I'm not considering? Thanks.
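One alternative to an intermediary logging page is to tag the redirect target with `utm_*` parameters, so GA attributes the visit even though the HTTPS-to-HTTP hop strips the referrer (which is why it lands in "(not set)"). A minimal sketch; the medium and campaign values are hypothetical placeholders:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def tag_redirect_target(url, source, medium="redirect", campaign="ssl-sunset"):
    """Append utm_* parameters so GA can attribute traffic that loses its referrer."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)
    query += [("utm_source", source), ("utm_medium", medium), ("utm_campaign", campaign)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), parts.fragment))

# Redirect the old HTTPS subdomain straight to a tagged announcement URL.
target = tag_redirect_target("http://www.mysite.org/announcement.html",
                             source="subdomain.mysite.org")
print(target)
```

With the redirect pointed at the tagged URL, the traffic shows up under that campaign in GA with no extra intermediary page to maintain.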
White Hat / Black Hat SEO | | c2g0 -
What to do with these toxic links?
Back in July I posted here that I thought someone was doing negative SEO against us. We monitor our links on a daily basis, and a lot of toxic links came in quickly within a few days. So we were proactive and disavowed those links soon after we saw them. Shortly after that our rankings started to drop and we lost a good amount of traffic, though I do not know if it's really connected, since we only disavowed those toxic links, and we weren't ranking FROM those links because they were disavowed so quickly. Now it's happening again: 20 new inbound domains linking to us from complete crap websites with crap content, and not done by us. I want to disavow them, but I am thinking that maybe the first time we disavowed the links it hurt us, and maybe disavowing now will hurt us further? I think Google should be able to filter out this crap, but who knows; too much depends on this being handled correctly. Here are some of the crappy links: http://optibike.com/?home.php=page/loans/student-loan-without-a-cosigner-2.html
White Hat / Black Hat SEO | | DemiGR
http://designsbynickthegeek.com/?index.php=finance/loans/loan-for-you-3.html
http://www.nuvivaweightloss.com/?index.php=article/loans/300-loan-today.html
http://ecommercesalesmultipliersystem.com/?home.php=board/loans/fast-loan-with-monthly-payments-2.html
They are mostly duplicate content across a network of sites. How would you guys handle this?
Can a domain name alone be considered SPAM?
If someone has a domain that is spammy, such as "http://seattlesbestinsurancerates.com" can this cause Google to not index the website? This is not our domain, but a customer of ours has a similar one and it appears to be causing issues! Any thoughts? Thanks for any input!
White Hat / Black Hat SEO | | Tosten0 -
Linking my pages
Hello everybody, I have a small dilemma and I am not sure what to do. I (my company) am the owner of 10 e-commerce websites. On every site I have a link to the other 9 sites, and I am using an exact keyword as the anchor text (not the shop name). Since the web stores are big and have over 1,000 pages, this means that all my sites have a lot of inbound links (compared with my competition). I am worried that linking them all together could be bad from Google's point of view. Can this cause a problem for me? Should I change it? Regards, Marko
White Hat / Black Hat SEO | | Spletnafuzija0 -
Schema.org tricking and duplicate content across domains
I've found the following abuse, and I'm curious what I can do about it. Basically the scheme is: own some content only once (pictures, descriptions, reviews, etc.); use different domain names (no problem if you use the same IP or IP C-block); have a different layout (this is basically the key); use schema.org tricking, meaning show the very same reviews on different scales, and show slightly fewer reviews on one site than on another. Quick example: http://bit.ly/18rKd2Q
White Hat / Black Hat SEO | | Sved
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown. It turns out that this tactic, even without the 4th step, can be quite beneficial for ranking with several domains. Here is a little investigation I've done (not really extensive, took around an hour and a half, but quite shocking nonetheless):
https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0 Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether I can believe it or not. What do you suggest? Should I leave it, and try to copy this tactic to rank with the very same content multiple times? Should I deliberately cheat with markups? Should I play nice and hope that these guys will sooner or later be dealt with? (Honestly, I can't see this one working out.) Should I write a case study on this, so that if the tactic gets bigger attention, Google will deal with it? Could anybody push this towards Matt Cutts, or anybody else who is responsible for these things?
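One quick way to see that the differently-scaled review markup quoted above likely comes from the same underlying data is to normalize each rating onto a 0-1 scale, as schema.org's `bestRating`/`worstRating` properties permit. The numbers below are the ones shown in the listings above:

```python
def normalized(rating, best, worst=0.0):
    """Map a rating onto a 0-1 scale (mirrors schema.org bestRating/worstRating)."""
    return (rating - worst) / (best - worst)

# Ratings shown in the SERP listings above: 8.6 / 10 vs. 4.29 / 5.
a = normalized(8.6, best=10)
b = normalized(4.29, best=5)
print(round(a, 3), round(b, 3), abs(a - b) < 0.01)
```

The two scores land within a fraction of a percent of each other, which, combined with the near-identical review counts (328 vs. 323), strongly suggests the same review set rendered on two scales.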
Domain Structure For A Network of Websites
To achieve this we need to set up a new architecture of domains and sub-websites to effectively build this network. We want to make sure we follow the right protocols for setting up the domain structures to achieve good SEO for the primary domain and the local websites. Today we have our core website at www.doctorsvisioncenter.com, which will ultimately become dvceyecarenetwork.com. That website will serve as the core web presence that can be custom branded for hundreds. For example, today you can go to www.doctorsvisioncenter.com/pinehurst; note that when you start there, you can click around and it is still branded for Pinehurst, or Spectrum Eye Care. So the burning question(s): if I am an independent doc at www.newyorkeye.com, I could do domain forwarding, but Google does not index forwarded domains, so that is out. I could do a 301 permanent redirect to my page www.doctorsvisioncenter.com/newyorkeye. I could then put a rule in the .htaccess file that says if the host is newyorkeye.com, redirect to www.doctorsvisioncenter.com/newyorkeye, and then have the domain show up as www.newyorkeye.com. Another way to do that is to point the newyorkeye DNS to doctorsvisioncenter.com rather than use a 301 redirect, with the same basic rule in the .htaccess file. That means that, theoretically, every sub-page would show up as, for example, www.newyorkeye.com/contact-lens-center, which is actually www.doctorsvisioncenter.com/contact-lens-center. It also means, theoretically, that it will be seen as an individual domain but pointing to all the same content, just like potentially hundreds of others. The goal is: build once, manage once, benefit many. If we do something like the above, each domain will essentially be a separate domain, but will Google see it that way, or as duplicative content? While it is easy to answer "yes, it would be duplicative," that is not necessarily the case if the content is on separate domains.
Is this a good way to proceed, or does anyone have another recommendation for us?
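For reference, the two .htaccess approaches described in the question might look roughly like this (domain names are from the question; the exact flags depend on the hosting setup, so treat this as an illustrative sketch, not a tested configuration):

```apacheconf
RewriteEngine On

# Option A (proxy-style mask; requires mod_proxy): keeps newyorkeye.com in the
# address bar while serving doctorsvisioncenter.com content. This is the variant
# that risks being seen as duplicate content across many domains.
RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
RewriteRule ^(.*)$ http://www.doctorsvisioncenter.com/newyorkeye/$1 [P,L]

# Option B (301 permanent redirect): passes visitors and link signals to the
# one canonical URL, so search engines see a single copy of the content.
# RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
# RewriteRule ^(.*)$ http://www.doctorsvisioncenter.com/newyorkeye/$1 [R=301,L]
```

The 301 variant answers the duplicate-content worry directly: signals consolidate on one domain instead of being split (and possibly filtered) across hundreds.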
White Hat / Black Hat SEO | | JessTopps0 -
Exact Match Domains - Why are they still dominating?
Fantastic day! I am seeing exact-match domains still dominating. SEOmoz has some insight: http://www.seomoz.org/blog/exact-match-domains-are-far-too-powerful-is-their-time-limited But that's from two years ago. Is Google ever going to target the manipulators that buy up all the exact-match domains? One of our partners is getting the itch, and I am running out of explanations for why we don't manipulate. But if these practices are dominating their industry, what to do? I have to get paid to feed the family, so just telling the client buh-bye isn't going to work. At least not in this stage of agency building. Their root domain doesn't do much for them; however, we optimize those subdomains well and rank them fine. But if my client can just buy an exact-match domain, and it will take less SEO work to get it ranked, then why not? He has an SEO expert in his back pocket to clean up the mess IF they even get a penalty or drop in rank. Is all SEO really just: find algo hole, manipulate, penalty, fix; find algo hole, manipulate, penalty, fix? Wash. Rinse. Repeat. Please share your experiences and insight! Thanks, Ben
White Hat / Black Hat SEO | | cyberlicious0 -
Best Link Building Practices to Avoid Over Optimizing
With all the new over-optimization talk, one of the things mentioned is having the same anchor text linking to a page over and over without variation. Is there a good estimate of how many inbound keyword links should be exact match versus how many should be variations? Also, keeping the value of the linking pages in mind: would it be best to use the [exact] phrase for the higher-PR sites, or for the more relevant, higher-traffic sites, and save the long-tail or keyword-variation text for the lesser-valued sites? When to use the exact phrase and when to use the long tail is my question/discussion. I always stay relevant in my link building, and all my links are within context, because I know that relevancy has been an important factor. After watching this video from Matt Cutts http://youtu.be/KyCYyoGusqs I assume relevancy is becoming even more of an important factor.
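No published exact-versus-varied ratio exists, so any threshold is a judgment call, but auditing your own anchor-text distribution is straightforward. A minimal sketch; the anchor profile below and any threshold you'd compare against are hypothetical:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total inbound-link profile."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

# Hypothetical inbound anchor texts for one page.
anchors = (["austin plumber"] * 6 +          # exact match
           ["best plumber in austin"] * 5 +  # long-tail variation
           ["Acme Plumbing"] * 7 +           # brand
           ["click here"] * 2)               # generic
dist = anchor_distribution(anchors)
exact_share = dist["austin plumber"]
print(f"exact-match share: {exact_share:.0%}")
```

A profile where one exact phrase dwarfs brand and generic anchors is the pattern the over-optimization discussion warns about; a mixed distribution like the one above looks more natural.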
White Hat / Black Hat SEO | | SEODinosaur0