"Unnatural links to your site" manual action by Google
-
Hi,
My site has been hit with an "Unnatural links to your site" manual action, and I've just received a decline on my second reconsideration request, after disavowing even more links than I did in the first request. I went over all the links to my site in WMT with an SEO specialist and we both thought things had been resolved, but apparently they weren't.
I'd appreciate any help on this so I can get the penalty lifted and my site back to its former rankings; it ranked well before, and the timing couldn't have been worse.
Thanks,
Yael -
Yes. It will often take me 3-6 weeks to do a thorough job on a manual penalty. I can do it faster if I dedicate all my time to it, but yeah... it's time-consuming.
If you don't get example links it usually means that you have a large number of unnatural links still not addressed.
-
Thanks Marie for your input and advice. I didn't get any examples from Google despite asking for them twice. As you've suggested, I'll create a spreadsheet with the list of domains, contacts, etc. It's tricky to know which domains need to be taken down and which are valid; I don't want to make mistakes and dig a deeper hole for my site if and when it comes out of the penalty.
I did get a sitewide manual action, so I just hope to get it resolved as quickly as possible. Obviously, contacting dozens or hundreds of sites will take some time.
-
I'm working now to gather as much information as I can to understand and address the issues that caused the penalty. I'm sure I can get the best advice here on Moz. Which link auditing services would you recommend?
-
When you failed on your first two requests, did Google give you any example links? Those usually hold the key to why you are not passing.
Also, when you get a manual action it is vitally important to make attempts to remove links, not just disavow them. If you have links that can't be removed, then you need to show some sort of effort. I usually include a Google Docs spreadsheet with the domains, the contact info, and notes on how many contact attempts I have made. Sometimes, if I have a site where I can't get any links removed, I'll add a comment as to why. But usually there are some that can still be removed. For example, you can report spam domains to Blogger or Weebly and they'll probably remove them.
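If it helps, a tracking sheet like that can be generated programmatically and then imported into Google Docs. Here's a minimal sketch in Python; every domain, contact, and note below is a made-up placeholder, not a real site:

```python
import csv

# Hypothetical outreach records: domain, contact email, number of
# removal-request attempts, and a note on the outcome. All values
# here are placeholders for illustration only.
outreach = [
    {"domain": "spammy-directory.example",
     "contact": "admin@spammy-directory.example",
     "attempts": 3,
     "notes": "No response; disavowed at domain level."},
    {"domain": "paid-links.example",
     "contact": "editor@paid-links.example",
     "attempts": 2,
     "notes": "Asked for payment to remove; refused and disavowed."},
]

# Write the log as a CSV that can be uploaded to Google Docs / Sheets.
with open("link_removal_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["domain", "contact", "attempts", "notes"])
    writer.writeheader()
    writer.writerows(outreach)
```

The point is just to keep one row per domain with a record of every contact attempt, so the reviewer can see the effort at a glance.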
It may be a good idea to have someone else review your links as well to see if there are more that could be removed/disavowed. Sometimes it is obvious which links are unnatural, and sometimes it is not.
"I'd appreciate any help on this so as to lift the penalty and get my site back to its former rankings, it has ranked well before and the timing couldn't have been worse."
If you have a sitewide manual action, then yes, when your penalty is removed you should see a good return in rankings for brand terms. But if it is a partial match, then unfortunately you may find that not a lot changes. I wrote an article on Moz about this, which you can read here: https://moz.com/blog/after-penalty-removed-will-traffic-increase. Sometimes with a partial action I'll see some improvement, but sadly it is usually not dramatic. That said, if your site has a really good base of truly naturally earned links, then you have a good chance of seeing good improvement.
Hope that helps!
Marie
-
"Your SEO specialist" may have gotten you into this pickle... Have you also obtained independent advice and run a deep link audit on your site?
-
Hi Ishai,
There are a few steps I typically run through in this instance to get the issue resolved.
Firstly, rather than just submitting a disavow file, spend some time actively trying to remove as many links as you can without paying for them. Fixing a penalty isn't as simple as submitting a text file, and Google wants to see that you're actively trying to fix the problem before they will lift the penalty.
It's often said that Google doesn't read the comments in your disavow file, but I always add them anyway. I mention what I've done to resolve the issue (contacted all possible low-quality sites requesting the links be removed) and even include a separate section for the particularly dodgy sites that want me to pay for removal.
Being able to demonstrate that you're legitimately trying to fix the mistake, rather than just waving the magic disavow wand, goes a long way toward getting your penalty removed.
Another tip that you may or may not be aware of: always disavow at the domain level rather than disavowing individual links. This way, if some of the dodgy directories shuffle their site structure and link to you from a different page, the links are still disavowed.
The syntax for this is simple: domain:badwebsite.com
This info is all covered in Google's Search Console Help section.
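If you've already collected your bad links as full URLs, a small script can collapse them into domain-level entries using that syntax. A quick sketch in Python (the URLs below are placeholders, and you should double-check the output against Google's documented disavow file format before uploading):

```python
from urllib.parse import urlparse

# Hypothetical list of unnatural link URLs collected during the audit.
bad_links = [
    "http://badwebsite.com/some-directory-page",
    "http://badwebsite.com/another-page",
    "https://dodgy-directory.net/listing?id=42",
]

# Disavow at the domain level, so the entry still applies even if the
# site shuffles its URL structure and links from a different page.
domains = sorted({urlparse(url).netloc for url in bad_links})

disavow_lines = ["# Contacted these sites; links not removed."]
disavow_lines += [f"domain:{d}" for d in domains]
print("\n".join(disavow_lines))
```

Note that `urlparse` keeps any `www.` prefix as part of the host, so you may want to strip that down to the bare domain depending on how the links appear in your exports.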
EDIT: I should also mention that just pulling the "Links to Your Site" report from Search Console isn't going to give you a very comprehensive list. Consider combining that list with an export from Ahrefs or Moz's Open Site Explorer, as this will give you a better idea of exactly which sites are linking to you.
Frustratingly, Search Console only seems to show a selection of referring domains.
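If you want to combine those exports programmatically, something like the following works. This is just a sketch: the file names and column headers below are assumptions, since each tool names its export columns differently, so check your actual CSV headers first. The placeholder exports are written inline to keep the example self-contained.

```python
import csv
from urllib.parse import urlparse

# Write two tiny placeholder exports so the sketch is self-contained.
# The column headers here are assumptions; check the headers in your
# real Search Console / Ahrefs exports before reusing this.
with open("search_console_links.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Linking page"])
    w.writerow(["http://badwebsite.com/page-1"])
    w.writerow(["http://www.example-blog.net/post"])

with open("ahrefs_backlinks.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Referring Page URL"])
    w.writerow(["http://badwebsite.com/page-2"])
    w.writerow(["https://another-directory.org/listing"])

def referring_domains(path, url_column):
    """Collect the unique linking domains from one backlink export CSV."""
    domains = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            url = row.get(url_column, "").strip()
            if url:
                # Normalize: lowercase and drop a leading "www.".
                domains.add(urlparse(url).netloc.lower().removeprefix("www."))
    return domains

# Union the two exports and dedupe by domain.
combined = (referring_domains("search_console_links.csv", "Linking page")
            | referring_domains("ahrefs_backlinks.csv", "Referring Page URL"))
print(sorted(combined))
```

Deduping by domain rather than by URL is what you want here anyway, since both the outreach and the disavow file work at the domain level.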