"Unnatural links to your site" manual action by Google
-
Hi,
My site has been hit by an "Unnatural links to your site" manual action penalty, and I've just received a decline on my second reconsideration request, after disavowing even more links than I did in the first request. I went over all the links to my site in WMT with an SEO specialist, and we both thought things had been resolved, but apparently they hadn't.
I'd appreciate any help with lifting the penalty and getting my site back to its former rankings; it ranked well before, and the timing couldn't have been worse.
Thanks,
Yael -
Yes. It will often take me 3-6 weeks to do a thorough job on a manual penalty. I can do it faster if I dedicate all my time to it, but yeah...it's time-consuming.
If you don't get example links, it usually means that you still have a large number of unnatural links that haven't been addressed.
-
Thanks, Marie, for your input and advice. I didn't get any examples from Google despite asking for them twice. As you've suggested, I'll create a spreadsheet with the list of domains, contacts, etc. It's tricky to work out which domains need to be taken down and which are valid; I don't want to make mistakes and dig a deeper hole for my site if and when it comes out of the penalty.
I did get a sitewide manual action, so I just hope to get it resolved as quickly as possible. Obviously, contacting dozens or hundreds of sites will take some time to complete.
-
I'm working now to gather as much information as I can to understand and address the issues that caused the penalty. I'm sure I can get the best advice here on Moz. Which link auditing services would you recommend?
-
When you failed on your first two requests, did Google give you any example links? Those usually hold the key to why you are not passing.
Also, when you get a manual action it is vitally important to make genuine attempts to remove links, not just disavow them. If you have links that can't be removed, then you need to show some sort of effort. I usually include a Google Docs spreadsheet with the domains, the contact info, and notes on how many contact attempts I have made. Sometimes, if I have a site where I can't get any links removed, I'll add a comment as to why. But usually there are some that can still be removed. For example, you can report spam domains to Blogger or Weebly and they'll probably remove them.
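To make that concrete, here is a minimal sketch of how such an outreach log might be laid out. The column names and entries are purely illustrative suggestions (the domains are placeholders), not a format Google requires:

```csv
domain,contact,removal_requests,link_removed,notes
spammy-directory.example,webmaster@spammy-directory.example,3,no,No reply to any email; disavowed at domain level
article-farm.example,contact form only,2,no,Form submissions bounced; reported to hosting provider
old-blog-network.example,editor@old-blog-network.example,1,yes,Links removed after first request
```

Shared as a viewable Google Docs link alongside the reconsideration request, a log like this is the kind of evidence of effort described above.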
It may be a good idea to have someone else review your links as well to see if there are more that could be removed/disavowed. Sometimes it is obvious which links are unnatural, and sometimes it is not.
"I'd appreciate any help with lifting the penalty and getting my site back to its former rankings; it ranked well before, and the timing couldn't have been worse."
If you have a sitewide manual action, then yes, when your penalty is removed you should see a good return in rankings for brand terms. But if it is a partial match, then you may find that not a lot changes, unfortunately. I wrote an article on Moz about this, which you can read here: https://moz.com/blog/after-penalty-removed-will-traffic-increase. Sometimes with a partial action I'll see some improvement, but sadly it is usually not dramatic. With that said, if your site has a really good base of truly naturally earned links, then you have a good chance of seeing good improvement.
Hope that helps!
Marie
-
"Your seo specialist" may have got you into the pickle... have you also obtained independent advice and run a deep site link audit?
-
Hi Ishai,
There are a few steps I typically run through in this instance to get the issue resolved.
Firstly, rather than just submitting a disavow file, spend some time actively trying to remove as many links as you can without paying for them. Fixing a penalty isn't as simple as submitting a text file, and Google wants to see that you're actively trying to fix the problem before they will lift the penalty.
It's often said that Google doesn't read the comments in your disavow file, but I always add them anyway. I note what I've done to resolve the issue (contacted all the low-quality sites I could, requesting that the links be removed) and even include a separate section for the particularly dodgy sites that want me to pay for removal.
Being able to demonstrate that you're legitimately trying to fix the mistake, rather than just waving the magic disavow wand, goes a long way toward getting your penalty removed.
Another tip that you may or may not be aware of - always disavow at the domain level rather than at the individual URL level. This way, if some of the dodgy directories shuffle their site structure and link to you from a different page, the links are still disavowed.
The syntax for this is simple: domain:badwebsite.com
This info is all covered in Google's Search Console Help section.
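For reference, the disavow file itself is just a plain text file with one entry per line; lines starting with # are comments that Google ignores. A minimal sketch, with placeholder domains, might look like this:

```text
# Links from these domains were identified as unnatural.
# Removal was requested twice (see outreach log); no response received.
domain:badwebsite.com
domain:spammy-directory.example

# Individual URLs can also be listed, though domain-level entries are safer:
http://another-dodgy-site.example/page-that-links-to-us.html
```

The file is then uploaded through the Disavow Links tool in Search Console.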
EDIT: I should also mention that just pulling "Links to Your Site" from Search Console isn't going to give you a very comprehensive list. Consider combining that list with an export from Ahrefs or Moz's Open Site Explorer, as this will give you a better idea of exactly which sites are linking to you.
Frustratingly, Search Console only seems to show a selection of referring domains.
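If you end up with several CSV exports (Search Console, Ahrefs, Open Site Explorer), a short script can merge them into a single de-duplicated list of referring domains to review. This is only a rough sketch under assumed file names and column headers - each tool names its URL column differently, so check the header rows and adjust before running:

```python
import csv
from urllib.parse import urlparse

def referring_domains(csv_path, url_column):
    """Collect the set of linking domains from one backlink export."""
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = (row.get(url_column) or "").strip()
            if not url:
                continue
            host = urlparse(url).hostname or ""
            # Strip a leading "www." so www/non-www duplicates collapse.
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return domains

# File names and column headers below are assumptions -- check each export's
# header row and adjust to match what your tools actually produce.
all_domains = (
    referring_domains("search_console_links.csv", "Linking page")
    | referring_domains("ahrefs_backlinks.csv", "Referring Page URL")
    | referring_domains("ose_links.csv", "URL")
)

with open("domains_to_review.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(all_domains)))
```

From there, the combined list can be worked through manually to decide which domains to contact and which to disavow.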
Related Questions
-
Google Mobile site crawl returns poorer results on 100% responsive site
Has anyone experienced an issue where Google Mobile site crawl returns poorer results than their Desktop site crawl on a 100% responsive website that passes all Google Mobile tests?
Intermediate & Advanced SEO | MFCommunications
-
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000-odd companies from their own B2B website by producing a new dedicated page per target company - each page including unique copy and a sales proposition (20,000-odd new pages to optimize! Yikes!). I've never come across such an approach before... what might be the SEO pitfalls (other than that it's a helluva number of pages to optimize)? Any thoughts would be very welcome.
Intermediate & Advanced SEO | McTaggart
-
Does including your site in Google News (and Google Alerts) help with SEO?
Based on the following article, http://homebusiness.about.com/od/yourbusinesswebsite/a/google-alerts.htm, in order to check whether you are included you need to run site:domain.com and click the News search tab. If you are not there, then... I ran the test on Moz and got no results, which surprised me. The next step, according to https://support.google.com/news/publisher/answer/40787?hl=en#ts=3179198, is to submit your site for inclusion. Should I? Will it help? P.S. This is a follow-up question to the following: http://moz.com/community/q/what-makes-a-site-appear-in-google-alerts-and-does-it-mean-anything
Intermediate & Advanced SEO | BeytzNet
-
Link from Google.com
Hi guys, I've just seen a website get a link from Google's Webmaster Snippet testing tool. Basically, they've linked to a results page for their own website test. Here's an example of what this would look like for a result on my website: http://www.google.com/webmasters/tools/richsnippets?q=https%3A%2F%2Fwww.impression.co.uk There's a meta nofollow, but I just wondered what everyone's take is on trust, etc., passing down? (Don't worry, I'm not encouraging people to go out spamming links to results pages!) Looking forward to some interesting responses!
Intermediate & Advanced SEO | tomcraig86
-
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for some sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural search engine profile, or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | fablau
-
Will Google penalize a site that had many links pointing to it with utm codes?
I want to track conversions using utm parameters from guest blog posts on sites other than my own site. Will Google penalize my site for having a bunch of external articles pointing to one page with unique anchor text but utm code? e.g. mysite.com/seo-text?utm_campaign=guest-blogs
Intermediate & Advanced SEO | wepayinc
-
How does Google treat internal links with rel="nofollow"?
Today I was reading about nofollow on Wikipedia. The following statement is over my head and I'm not able to understand it properly: "Google states that their engine takes 'nofollow' literally and does not 'follow' the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page)." That is all about indexing and ranking for specific keywords in the hyperlink text of external links; I'm aware of that part, and I know such a link may not produce a relevant result for a given keyword in Google web search. But what about internal links? I have set the rel="nofollow" attribute on a lot of internal links. I have an archived blog post by Randfish on the same subject, and I read the following question there. Q. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love? [In 2007] A: Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages. (Matt's precise words were: The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robot.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.) Matt also gave an excellent answer to the following question. [In 2011] Q: Should internal links use rel="nofollow"? A: Matt said: "I don't know how to make it more concrete than that." I use nofollow for each internal link that points to an internal page that has the meta name="robots" content="noindex" tag. Why should I waste Googlebot's resources and those of my server if, in the end, the target must not be indexed? As far as I can tell, and for years now, this has not caused any problems at all. For internal page anchors (links with the hash mark in front, like "#top"), the answer is "no", of course. I am still using nofollow attributes on my website. So, what is the current trend? Is it still necessary to use the nofollow attribute for internal pages?
Intermediate & Advanced SEO | CommercePundit
-
Link anchor text: only useful for pages linked to directly or distributed across site?
As an SEO I understand that link anchor text for the focus keyword on the page linked to is very important, but I have a question which I cannot find the answer to in any books or blogs, namely: does inbound anchor text 'carry over' to other pages in your site, like link juice? For instance, if I have a homepage focusing on keyword X and a subpage (with internal links to it) focusing on keyword Y, does it then help to link to the homepage with keyword Y anchor texts? Will this keyword thematically 'flow through' the internal link structure and help the subpage's ranking? In a broader sense: will a diverse link anchor text profile for your homepage help all other pages in your domain rank thematically? Or is link anchor text just useful for the direct page that is linked to? All views and experiences are welcome! Kind regards, Joost van Vught
Intermediate & Advanced SEO | JoostvanVught