Cross-linking domains dominate SERP?
-
Hi,
I have been doing some keyword research and noticed two domains properly linking back to each other for almost every piece of content. I thought this was not working any longer but it looks like it works for them. For many competitive keywords, they rank in top 10, and even for some keywords, they rank #1 and #2. PA and DA not more than 36-38. With 3-4 linking root domains, these pages manage to rank in top 10.
And their second strategy is to create alternative text to rank for a number of different long-tail keywords: separate pages targeting separate keywords, where the only difference between them is slightly modified text and images.
The third is possibly the best: their second domain is an exact-match domain for most keywords in this industry. On some SERPs they have 8-10 results in the top 30.
SEMrush shows 500% growth for both of these domains.
So, I guess I should just sit and admire them.
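As a side note, one quick way to verify an "8-10 results in the top 30" observation like this is to script the count yourself. A minimal sketch (the URLs and domain names below are invented for illustration, and the root-domain extraction is deliberately naive):

```python
# Count how many top-30 SERP results each root domain owns.
# Sample data only; a real check would paste in actual SERP URLs.
from urllib.parse import urlparse
from collections import Counter

def root_domain(url):
    # Naive root-domain extraction: last two labels of the hostname.
    # (A production tool would use the Public Suffix List to handle .co.uk etc.)
    host = urlparse(url).netloc.lower()
    return ".".join(host.split(".")[-2:])

serp_urls = [
    "https://example-emd.com/widget-reviews",
    "https://example-emd.com/best-widgets",
    "https://partner-site.com/widgets",
    "https://other.com/widgets-guide",
]

counts = Counter(root_domain(u) for u in serp_urls)
for domain, n in counts.most_common():
    print(domain, n)
```

Running this against the real top 30 would tell you exactly how many slots each of the two cross-linking domains holds.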
-
It all makes sense, sir. And I have to admit you are spot on about the way they have distracted me. Thanks for taking the time to share your thoughts.
-
I wasn't implying that he should build a network of sites; I meant the links he gets should be better. My bad.
-
....do it even better.
Right... as a gamer you know the power of the superior weapon.
-
Just to add to that: these websites have shown you their strategy. Instead of admiring them, replicate what they have done, but do it even better.
Create original content as EGOL suggests, plus even more relevant and stronger links, and no doubt you'll be a strong competitor.
Greg
-
I have been doing some keyword research and noticed two domains reciprocally linking to each other on almost every piece of content.
I would not worry about this being too effective. It is only one unique linking root domain.
I thought this tactic no longer worked, but it seems to work for them.
It seems to be distracting competitors.
And their second strategy is to create alternative text to rank for a number of different long-tail keywords: separate pages targeting separate keywords, where the only difference between them is slightly modified text and images.
If you create unique content for all of those keywords, that should put you at an advantage.
The third is possibly the best: their second domain is an exact-match domain for most keywords in this industry. On some SERPs they have 8-10 results in the top 30.
This tells me that they have a lot of content for that keyword and that there are probably not a lot of strong competitors... but really, what is more important: the number of results you have in the top thirty, or whether you are in the top three?
So, I guess I should just sit and admire them.
You can do that if you want. From what you have shared they don't sound too hard to beat. I would go after them with a single site with lots of unique, high-quality content.
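If you take the unique-content route, a rough shingle-overlap check can flag pages of your own that are still too similar to each other (the kind of "slightly modified text" doorway pages described above). A minimal sketch, with made-up sample text:

```python
# Rough near-duplicate check: Jaccard similarity over word 3-grams (shingles).
# A score near 1.0 suggests two pages differ only by light rewording.

def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

page_a = "our blue widgets are the best widgets for busy teams"
page_b = "our red widgets are the best widgets for busy teams"
print(round(jaccard(page_a, page_b), 2))  # prints 0.6
```

Anything scoring very high across two of your own pages is a candidate for consolidation or a genuine rewrite.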