How to find artificial or unnatural links in OSE?
-
Hi,
I just got a message from Google Webmaster Tools telling me that there are "artificial or unnatural links" pointing to one of my subdomains, and that I should investigate and submit my site for reconsideration.
The subdomain in question has inbound links from 4K linking root domains. We are a certificate authority (we provide SSL certificates) so the majority of those links come from the site seal that customers place on their secure pages.
We sell certificates to a full spectrum of site types, from ecommerce sites of all sizes to .edu, .gov, and even adult sites. As a result, our linking root domains have always been a mixed bunch, which makes me think the offending links must have been added recently.
Here are my questions:
- Is it possible to slice my link reports with some sort of time element, so that I can narrow the search to only the newest inbound links?
- How else might I use OSE to find these "artificial or unnatural links"? Are there any particular attributes I should be looking for in a linking root domain that might suggest it's seen by Google as "artificial or unnatural"?
Any help with any aspect of this issue would be greatly appreciated.
Thanks,
Dennis
p.s. I should probably state that I've never bought links or participated in link schemes.
-
Are there any particular attributes I should be looking for in a linking root domain that might suggest it's seen by Google as "artificial or unnatural"?
The first item to check is your site seal. How exactly is the backlink created for the seal? I would presume it is an image wrapped in an anchor that points back to your subdomain. Make sure there is nothing in the seal's markup that could trigger a penalty, such as keyword-stuffed anchor or title text that every customer site repeats verbatim. If you genuinely have not purchased links or used any shady tactics, a coding issue with the seal is the most likely cause of the warning.
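If you want to spot-check how the seal backlink is actually coded on customer pages, a small script can help. The sketch below (Python, standard library only) is just an illustration; the customer URL and the "seal.example.com" target domain are placeholders, since I obviously don't know your real markup. It pulls a page, finds every anchor pointing back at the target domain, and reports the href, rel, and title attributes, plus the alt text of any image inside the anchor, which for an image link effectively acts as the anchor text.

```python
# Minimal sketch (Python standard library only) for spot-checking how the
# seal backlink is coded on a customer's page. "seal.example.com" and the
# customer URL below are placeholders -- substitute your own.
from html.parser import HTMLParser
from urllib.request import urlopen


class SealLinkAuditor(HTMLParser):
    """Collects every <a> tag pointing at the seal's target domain,
    along with the attributes that matter for a link audit."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []
        self._in_target_link = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            href = attrs.get("href", "")
            if self.target_domain in href:
                self._in_target_link = True
                self.links.append({
                    "href": href,
                    "rel": attrs.get("rel", ""),      # empty means the link is followed
                    "title": attrs.get("title", ""),
                    "img_alt": None,
                })
        elif tag == "img" and self._in_target_link:
            # For an image link, the img alt text effectively acts as the anchor text.
            self.links[-1]["img_alt"] = attrs.get("alt", "")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_target_link = False


def audit_page(url, target_domain="seal.example.com"):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = SealLinkAuditor(target_domain)
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    # Hypothetical customer page carrying the site seal.
    for link in audit_page("https://www.example-customer.com/checkout"):
        print(link)
```

Run it against a handful of customer pages to see whether the rel and anchor/alt values are consistent with what you intend the seal to emit.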
A prime commonality to look for in OSE is anchor text. Adjust the four filters at the top of OSE as follows: followed + 301, only external links, pages on this subdomain, and group by domain. Run the search on the particular subdomain that received the warning. These settings reduce the list to the links that matter most to you, and in this case to the ones that could reasonably cause an issue with Google.
Next, export the CSV from OSE and sort it by the "anchor text" field. If any anchor text is used repeatedly, investigate those links. There should be natural variation in the anchor text, such as "SSL", "SSL cert", "SSL certs", "SSL certificates", "purchase an SSL certificate", etc. If a high percentage of links all use the exact same phrase, that can trigger a flag.
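With roughly 4K linking root domains the export gets big, so a quick tally can save time over scrolling a spreadsheet. This is only a rough sketch: the "Anchor Text" column name, the filename, and the 10% threshold are assumptions on my part, not anything OSE or Google prescribes, so adjust them to match your export.

```python
# Rough sketch: tally anchor text frequencies in an OSE CSV export and
# flag phrases that dominate the profile. Column name, filename, and the
# 10% threshold are assumptions -- adjust to match your file.
import csv
from collections import Counter


def anchor_text_report(csv_path, column="Anchor Text", top_n=20):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = (row.get(column) or "").strip().lower()
            if anchor:
                counts[anchor] += 1

    if not counts:
        print("No anchor text found; check the column name.")
        return

    total = sum(counts.values())
    print(f"{total} links with anchor text")
    for anchor, n in counts.most_common(top_n):
        share = 100.0 * n / total
        flag = "  <-- investigate" if share > 10 else ""  # arbitrary threshold
        print(f"{n:6d}  ({share:5.1f}%)  {anchor}{flag}")


if __name__ == "__main__":
    anchor_text_report("ose_inbound_links.csv")  # hypothetical filename
```

Anything that dominates the list, especially a commercial phrase stamped out identically by a seal or partner template, is the first place to look.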
Once you complete your research, take any necessary corrective actions, then report back to Google with the results.
Related Questions
-
"One Page With Two Links To Same Page; We Counted The First Link" Is this true?
I read this today: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718 I thought to myself, yep, that's what I've been reading on Moz for years (pity Matt could not confirm that's still the case for 2014). But reading through the comments, Michael Martinez of http://www.seo-theory.com/ pointed out that Matt says "...the last time I checked, was 2009, and back then -- uh, we might, for example, only have selected one of the links from a given page."
Which would imply it does not always mean the first link. Michael goes on to say "Back in 2008 when Rand WRONGLY claimed that Google was only counting the first link (I shared results of a test where it passed anchor text from TWO links on the same page)" and then "In practice the search engine sometimes skipped over links and took anchor text from a second or third link down the page." For me this is significant. I know people who have had "SEO experts" recommend that they attach a blog to their e-commerce site and publish posts (with no real interest for readers) containing anchor text links to their landing pages. I thought that posting blog posts just for the anchor text links was a waste of time if you are already linking to the landing page from the main navigation, since Google would see that link first. But if Michael is correct, then those kinds of anchor-text blog posts would have value. So who is right, Rand or Michael?
Technical SEO | PaddyDisplays
-
Best practice for multiple domain links
A site I'm working on has about 12 language domains: .es, .it, .de, etc. On each page of every domain, the header has links to every homepage. At the moment these links are all set to nofollow as an initial step to stop potential link profile issues spreading around. Moving forward, I'm not totally sure how to handle them. On one hand I see and agree that nofollow is not necessary; on the other, leaving them followed just filters out and weakens the link juice. What is the best way to handle this scenario?
Technical SEO | MickEdwards
-
I cannot find a way to implement the 2-link method as shown in this post: http://searchengineland.com/the-definitive-guide-to-google-authorship-markup-123218
Did Google stop offering the 2-link method of verification for Authorship? See this post: http://searchengineland.com/the-definitive-guide-to-google-authorship-markup-123218 And see this: http://www.seomoz.org/blog/using-passive-link-building-to-build-links-with-no-budget In both articles the authors talk about how to set up Authorship snippets for posts on blogs with no bio page and no email verification, just by linking directly from the content to their Google+ profile and then linking from the Google+ profile page (in the "Contributor to" section) back to the blog home page. But this does not work no matter how many ways I try it. Did Google stop offering this method?
Technical SEO | jeff.interactive
-
No crawl code for pages of helpful links vs. no follow code on each link?
Our college website has many "owners" who want pages of "helpful links" resulting in a large number of outbound links. If we add code to the pages to prevent them from being crawled, will that be just as effective as making every individual link no follow?
Technical SEO | LAJN
-
Added data to links
Hello, I am in the process of cleaning up a site and getting fewer pages cached. It is a Magento site, and I was wondering what your advice is for pages that get this appended to the URL: ?material=139&price=10%2C12 (apart from the obvious canonical)? Thanks.
Technical SEO | ciznerguy
-
Too many on-page links
Hi all, As we all know, having too many links on a page is an obstacle for search engine crawlers in terms of the crawl allowance. My category pages are labeled as pages with too many on-page links by the SEOmoz crawler. This probably comes from the fact that each product on the category page has multiple links (on the image and the model number). Now my question is: would it help to set up a single text link with a clickable area as big as the product area, so that every product gets just one link? Would this help get the crawlers deeper into these pages and distribute the link juice better? Or is Google already smart enough to figure out that two links to the same product page shouldn't be counted as two? Thanks for your replies, guys. Rich
Technical SEO | Horlogeboetiek
-
Metrics to determine the quality of a link?
I found this very useful post on SEOmoz: http://www.seomoz.org/blog/525600-metrics-how-do-you-measure-measure-a-link, but it's a bit dated. Also, it doesn't really help in terms of applying exact metrics to measure the quality of a link. Does anyone have any other suggestions to help automate or determine the quality of a link?
Technical SEO | nicole.healthline
-
External link optimization
The company I work for sells software online. We have deals with learning institutes that allow their students to use our software for next to nothing. These learning institutes, which usually have quite strong domains, link to our sign-in area. Nice way to get powerful links, hey… or is it? There are a couple of problems with these links: they all point to a subdomain (signin.domain.com), and the URLs also contain unique identifiers (so that we know which institute they come from), meaning they all link to different sign-in URLs (e.g. signin.domain.com/qwerty, signin.domain.com/qwerta, signin.domain.com/qwerts, etc.). So all these links aren't as effective as they could be (or at all?). In a perfect SEO world these links would all point to the start page; however, because our start page is commercial, that would run the risk of communicating the wrong idea to the institutes and their students. So… are there any extremely brilliant pro mozzers who have a savvy idea how to set this up in a more SEO-friendly way? Thanks in advance!
Technical SEO | henners