Ratio of linking C-blocks to linking domains
-
Hi,
Our link building efforts have resulted in a high number of backlinks from domains that share the same C-blocks.
We all know Google issues penalties whenever someone's link profile looks unnatural. A high number of backlinks but a low number of linking C-blocks would seem to be one of the reasons to get penalized.
Example: we have 6,000 links from 200 linking root domains coming in from 100 C-blocks.
At what point should we start to worry about being penalized or giving off an unnatural look to Mr. G?
-
I think you're overthinking the issue. The question is not the C-blocks themselves, but how those links relate to your site. Are the links relevant? Do they use natural link text? Is the geolocation what you'd expect?
Remember that IP addresses are a technical detail; they count for some things, but they're only one signal among the many factors in your link profile.
Related Questions
-
Robots.txt blocking Addon Domains
I have this site as my primary domain: http://www.libertyresourcedirectory.com/ I don't want to give spiders access to the site at all, so I tried a simple Disallow: / in the robots.txt. As a test I crawled it with Screaming Frog afterwards and it didn't do anything. (Excellent.)

However, there's a problem. In GWT, I got an alert that Google couldn't crawl ANY of my sites because of robots.txt issues. Changing the robots.txt on my primary domain changed it for ALL my addon domains. (Ex. http://ethanglover.biz/ ) From a directory point of view this makes sense; from a spider point of view, it doesn't.

As a solution, I changed the robots.txt file back and added a robots meta tag (noindex, nofollow) to the primary domain. But this doesn't seem to be having any effect. As I understand it, the robots.txt takes priority.

How can I separate all this out so each domain can have different rules? I've tried uploading a separate robots.txt to the addon domain folders, but it's completely ignored. Even going to ethanglover.biz/robots.txt gave me the primary domain's version of the file. (SERIOUSLY! I've tested this 100 times in many ways.) Has anyone experienced this? Am I in the twilight zone? Any known fixes? Thanks. Proof I'm not crazy in attached video: robotstxt_addon_domain.mp4
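One workaround often suggested for shared-document-root addon domains is to serve a different robots.txt per hostname with mod_rewrite, since every addon domain otherwise reads the primary domain's file. A minimal sketch, assuming Apache with mod_rewrite enabled and a hypothetical robots_ethanglover.txt file placed alongside the main robots.txt:

```apache
# .htaccess in the shared document root (assumes mod_rewrite is enabled)
RewriteEngine On

# When the addon domain is requested, serve its own robots file instead
RewriteCond %{HTTP_HOST} ^(www\.)?ethanglover\.biz$ [NC]
RewriteRule ^robots\.txt$ robots_ethanglover.txt [L]

# Requests for robots.txt on any other host fall through to the default file
```

The addon domain then gets whatever rules you put in robots_ethanglover.txt, while the primary domain keeps its own robots.txt untouched.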
Technical SEO | eglove
-
I have a sub domain that has live content on it but the root domain redirects to another URL. I know this is not great but what are the implications?
I have a subdomain that is populated and has content. The root domain that the sub lives on redirects to an entirely different URL. I am trying to make a case for why this isn't great, beyond the fact that it is just a weird user experience. What are the SEO implications? Would any equity that gets built up on the subdomain get passed along in the redirect? Or will there be indexation issues with Google? Cheers, Mark
Technical SEO | mjsikorsky
-
Blocking Test Pages En Masse on Sub-domain
Hello, We have thousands of test pages on a sub-domain of our site. Unfortunately, at some point these pages were visible to search engines and got indexed. Subsequently, we made a change to the robots.txt file for the test sub-domain. Gradually, over a period of a few weeks, the impressions and clicks as reported by Google Webmaster Tools fell off for the test sub-domain. We are not able to implement the noindex tag in the head section of the pages, given the limitations of our CMS. Would blocking off Googlebot via the firewall en masse for all the test pages have any negative consequences for the main domain that houses the real live content for our sites (which we would, of course, like to remain in the Google index)? Many thanks
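Since the CMS can't emit a meta robots tag, one commonly suggested alternative is the equivalent X-Robots-Tag HTTP header, set at the server level for the test sub-domain only. Note that Google must be able to crawl the pages to see the header, so the robots.txt block would need to be lifted for this to work. A minimal sketch, assuming Apache with mod_headers available on the sub-domain:

```apache
# .htaccess for the test sub-domain (assumes mod_headers is available)
# Ask search engines to drop these pages from the index without
# needing a meta robots tag in the page <head>.
Header set X-Robots-Tag "noindex, nofollow"
```

This applies the header to every response from the sub-domain, which is the "en masse" behavior the question asks about, without touching the main domain.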
Technical SEO | CeeC-Blogger
-
301 redirect domain to page on another domain
Hi, If I wanted to do a 301 permanent redirect from a domain to a page on another domain, would this cause any problems? Let's say I have 4 domains (all indexed, with content), and I decide to create a new domain with 4 pages, one for each old domain. I copy the content from the old domains to the relevant pages on the new domain and set it live. At the same time as setting the new site live, I do a 301 permanent redirect from each of the 4 domains to the relevant page on the new domain. What happens if Google indexes the new site before visiting the redirected domains? Could this cause a duplicate content penalty? Cheers
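As a sketch of the mechanics (not a verdict on the duplicate-content question), the domain-to-page mapping can be done with host-based 301 rules. This assumes Apache with mod_rewrite where the old domains are hosted, and the domain names below are placeholders, one rule pair per old domain:

```apache
# .htaccess on an old domain's host (assumes mod_rewrite is enabled)
RewriteEngine On

# Send every request for old-domain-a.com to its dedicated page
# on the new domain with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain-a\.com$ [NC]
RewriteRule ^(.*)$ https://new-domain.com/page-a/ [R=301,L]
```

Repeat the RewriteCond/RewriteRule pair for each of the 4 old domains, pointing at its corresponding page.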
Technical SEO | activitysuper
-
IPs and Domains
If a site loads on both its domain and its IP address, is that a problem? So it loads on domain.com and also on 69.16.....com Thanks!
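The fix usually suggested for this situation is a 301 redirect that forces a single canonical hostname, so any IP-based (or other stray) URLs consolidate onto the domain. A minimal sketch, assuming Apache with mod_rewrite and using domain.com as a placeholder:

```apache
# .htaccess sketch: force the canonical hostname (domain.com is a placeholder)
RewriteEngine On

# Any request whose Host header is not the canonical hostname
# (including requests made directly to the IP) gets a 301 redirect
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

With this in place, a request to the bare IP returns a 301 to the same path on www.domain.com, so search engines see only one version of each URL.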
Technical SEO | tylerfraser
-
Can I do a redirect to a new domain name only a couple of weeks after having redirected to another domain?
I have a client with two websites with very similar content. Both had a lot of inbound links and performed fairly well in SERPs. We recently combined both sites and redirected one of the domains to the other. The traffic dipped slightly at first, but is recovering nicely. Now the client has registered a new domain name he would like to use for the site. Should I wait a few weeks for everything to settle down after the first redirect/consolidation before doing a new redirect to the new domain name, or should I not worry about doing it right away?
Technical SEO | Drewco
-
How much effect does number of outbound links have on link juice?
I am interested in your thoughts on the effect of the number of outbound links (OBLs) on the link juice passed. I.e., if a page linking to you has a high number of OBLs, how do you compute the effect of those OBLs and their relative negative effect on link juice? Say you have been offered a link on three sites:
Site A: PA 30, DA 50, OBLs on page: 10
Site B: PA 40, DA 50, OBLs on page: 15
Site C: PA 50, DA 50, OBLs on page: 20
How would you appraise each of these prospective page links (ignoring anchor text, relevancy, etc., which will be constant)? Is there a rule of thumb for comparing the link juice passed from a page relative to its PA and its number of OBLs? Is it as simple as a page with 10 OBLs passing 10x the juice of a page with 100 OBLs?
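For rough intuition, the original PageRank model (a large simplification of anything Google does today) splits a page's score evenly across its outbound links, scaled by a damping factor d (classically 0.85). The per-link contribution of page A to a page B it links to is:

```latex
\mathrm{PR}_{A \to B} \;=\; d \cdot \frac{\mathrm{PR}(A)}{C(A)}
```

where C(A) is the number of outbound links on page A. In that simplified model, yes, a page with 10 OBLs passes 10x the per-link value of an otherwise identical page with 100 OBLs. But PA and DA are not linear in PageRank, and modern ranking weighs many other signals, so treat any such ratio as a rough heuristic rather than arithmetic.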
Technical SEO | seanmccauley
-
Is this seen as a Link Exchange
If I give someone a self-serve banner ad on my blog, or an image with a link, and they give me a text link ad in return, is that in Google's eyes a link exchange or a one-way link?
Technical SEO | DavidKonigsberg