Dealing with a manual penalty...
-
I'm in a back-and-forth with Google's Search Quality team at the moment. We discovered a manual penalty on our website and have been trying to get it removed. The problem: tons of spammy incoming links.
We did not ask for or purchase any of these links; it just so happens that spammy websites are linking to our site. Regardless, I've done my best to remove quite a few links in the past week or so, responding to the Search Quality team with a spreadsheet of the links in question and the action taken on each one.
No luck so far.
I've heard that if I send an email to a website asking for a link removal, I should share that with Google as well. I may try that.
Some of the links are posted on websites with no contact info, and a WHOIS search brings up a hidden registrant. Removing these links is far from easy.
My question is: what techniques are proven to be effective when working your way through the removal of a manual penalty? I know Google isn't going to tell me all of the offending links (they've offered a few examples; we've had those removed, and we're still penalized), so what's the best way for me to find the rest myself? And when I have a link removed, it may stay in Webmaster Tools as an active link for a while even though it no longer exists. Does the Search Quality team use Webmaster Tools to check, or do they use something else?
It's an open-ended question, really. Any help dealing with a manual penalty and what you have done to get that penalty removed is of great help to me. Thanks!
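One way to keep the removal spreadsheet honest is to re-check each reported page automatically. Here's a minimal sketch, assuming you have the spammy page URLs exported from Webmaster Tools or a backlink tool (the URLs and domain below are placeholders): parse each page's HTML and check whether it still contains an anchor pointing at your domain.

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect href values from every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links_to(html, target_domain):
    """True if any anchor on the page still points at target_domain."""
    parser = LinkFinder()
    parser.feed(html)
    return any(target_domain in href for href in parser.hrefs)

# Example with inline HTML; in practice you'd fetch each spammy URL
# (e.g. with urllib.request.urlopen) and feed its HTML in.
sample = '<p>Great site: <a href="http://yoursite.com/widgets">widgets</a></p>'
print(page_links_to(sample, "yoursite.com"))  # True: the link is still live
```

Pages that no longer link to you (or that 404) can be marked "removed" in the spreadsheet; anything still returning True needs another outreach attempt or a disavow entry.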
-
Ryan Kent has some experience with this, and shared it in this Q&A at http://www.seomoz.org/q/does-anyone-have-any-suggestions-on-removing-spammy-links
Related Questions
-
Is putting a manufacturer's product manual on my site in PDF duplicate content
I add the product manuals to our product pages to provide additional product information to our customers. Is this considered duplicate content? Is there a best way to do this so that I can offer the information to my customers without getting penalized for it? Should they be indexable? If not, how do I control that?
Technical SEO | merch_zzounds0
-
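Manufacturer manuals that also appear verbatim on many other retailers' sites are a classic duplicate-content case. One option, sketched below under the assumption the site runs on Apache with mod_headers enabled, is to keep the PDFs downloadable for customers but out of the index with an `X-Robots-Tag` header:

```apache
# .htaccess sketch: serve manufacturer PDF manuals to customers,
# but tell crawlers not to index them (requires mod_headers).
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Your own product-page copy stays indexable; only the duplicated PDFs are excluded.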
Building a new website post penalty and redirects
A website I'm working on is clearly algorithmically penalised. I've spent a lot of time mass-disavowing spammy links, but it doesn't seem to make a difference. We have been planning to build a new website anyway, since we are rebranding.
1. Is it possible to tell which pages are most likely to have a penalty applied?
2. If the website as a whole has a penalty, will redirecting certain pages to the new website carry the penalty over?
3. Our website is structured as sales pages and blog content. It is the sales pages that have the spammy links, yet most of the blog content does not rank either. Would it be a good strategy to redirect only the blog posts (which have natural links pointing to them) to the new website, and not the sales pages?
4. The homepage has a mix of spam and very good editorial links. If I have disavowed the spammy links and domains, can I safely redirect this page?
Technical SEO | designquotes0
-
Webmaster Tools Manual Actions - Should I Disavow Spammy Links??
My website has a manual action against it in Webmaster Tools stating:

"Unnatural links to your site — impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole."

I have checked the link profile of my site, and there are over 4,000 spammy links from one particular website, which I am guessing this manual action refers to. There is no way I will be able to get these links removed, so should I be using Google's Disavow Tool, or is there no need? Any ideas would be appreciated!!
Technical SEO | Pete40
-
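Since 4,000+ links from one domain can't realistically be removed by hand, the Disavow Tool is the intended escape hatch: a single domain-level entry covers every link from that site. Here's a small sketch that builds a file in Google's documented disavow format (one `domain:` or full-URL entry per line, `#` for comments); the domain names are placeholders:

```python
def build_disavow_file(domains, urls=()):
    """Return the text of a Google disavow file: '#' comment lines,
    'domain:example.com' entries, and optional single-URL entries."""
    lines = ["# Disavow file for reconsideration follow-up"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += list(urls)
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    ["spammy-directory.example"],               # disavow a whole domain
    ["http://another-site.example/page.html"],  # or just a single page
)
print(text)
```

Upload the file via the Disavow links tool in Webmaster Tools, and mention it in any reconsideration request so the Search Quality team can see the action taken.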
Dealing with closely related pages
I have a book with 8 pages which I offer free on my site: http://www.pottytrainingchart4kids.com/free-potty-training-book/ For technical reasons, each of the 8 pages is on a separate page. This might cause thin/duplicate content, since most of the code is the same besides the images and there isn't much text on each page. How would you suggest I deal with this? I remember once reading about rel="prev" or something like that, but I am not sure if it is applicable. I would like all PageRank to go to the main page. Should I add noindex to the other pages? I am not really sure what I should do to prevent a Panda penalty. Thanks in advance!
Technical SEO | JillB20130
-
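The markup being half-remembered here is rel="prev"/rel="next" pagination, which tells Google the 8 pages form one sequence. A sketch, assuming hypothetical page-2/, page-3/ URLs for the book pages:

```html
<!-- In the <head> of page 3 of the book (hypothetical URLs): -->
<link rel="prev" href="http://www.pottytrainingchart4kids.com/free-potty-training-book/page-2/">
<link rel="next" href="http://www.pottytrainingchart4kids.com/free-potty-training-book/page-4/">

<!-- Alternative, if only the main page should be indexed: keep the
     inner pages crawlable but out of the index. -->
<meta name="robots" content="noindex, follow">
```

With prev/next, Google tends to consolidate indexing signals across the sequence; with noindex,follow, only the main page competes in search while links on the inner pages still pass equity.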
How do I deal with Duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like this... https://www.keshercommunications.com/Romaniavoipnumbers.html https://www.keshercommunications.com/Icelandvoipnumbers.html etc. etc. One for every country. The question is, how do I create pages for each one without them showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seem to have done the same, but Google is listing all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
Technical SEO | DanFromUK0
-
How best to deal with www.home.com and www.home.com/index.html
Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via .htaccess) don't seem to work. I'm working on a site which has www.home.com and www.home.com/index.html; both URLs resolve to the same page/content. If I simply drop a rel canonical into the page, will this solve my dupe content woes? The canonical tag would then appear in both the www.home.com and www.home.com/index.html cases. If the above is OK, which version should I be going with? Thanks in advance folks,
James @ Creatomatic
Technical SEO | Creatomatic0
-
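A canonical tag does handle this case. The same tag goes in the <head> served for both URLs and points at the root version, which is the conventional choice since internal and external links usually target the bare URL:

```html
<!-- In the <head> served for both www.home.com and www.home.com/index.html -->
<link rel="canonical" href="http://www.home.com/">
```

One note on the .asp detail: the server is presumably IIS, where .htaccess files are simply ignored. If a true 301 from /index.html to / is also wanted, the IIS URL Rewrite module is the usual equivalent of an .htaccess redirect.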
How can you manually diagnose the canonical problem
Good morning from snow-dusted, minus 3 degrees C Wetherby, UK... Is there a quick way to diagnose whether or not a website has a canonical problem? So far I've been doing this, for example: typing a full web address, then one without the w's, and seeing if a 301 redirect has been set up. But I'm not confident this is the best way to diagnose whether a site has a canonical problem. I would like to add that I want to be able to check any site, even when Webmaster Tools is not available. Any insights welcome 🙂
Technical SEO | Nightwing1
-
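That manual check can be made systematic: enumerate the common duplicate variants of the homepage, then test how each one responds. A sketch (placeholder domain) of the enumeration step:

```python
from urllib.parse import urlsplit

def host_variants(url):
    """Return the usual duplicate variants of a homepage worth testing:
    with and without www, with and without a trailing index page."""
    parts = urlsplit(url)
    host = parts.netloc
    other = host[4:] if host.startswith("www.") else "www." + host
    bases = [f"{parts.scheme}://{h}/" for h in (host, other)]
    return bases + [base + "index.html" for base in bases]

for variant in host_variants("http://www.example.com"):
    print(variant)
```

Fetch each variant (e.g. with urllib.request) without following redirects: if every variant answers 200 with the same content, instead of 301ing to one canonical version, the site has a canonicalization problem.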
Dealing with PDFs?
Hello fellow mozzers! One of our clients does an excellent job of producing content, and we don't even have to nag them about it (imagine that!). This content is usually centered around industry reports, financial analyses, and economic forecasts; however, they always post it in the form of PDFs. How does Google view PDFs, and is there a way to optimize them? Ideally, I am going to try to get this client set up with a blog-like platform that will use HTML text rather than PDFs, but I wanted to see what info was out there for PDFs. Thanks!
Technical SEO | tqinet0
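Google does crawl and index PDFs: it extracts the text layer, typically uses the document title as the result title, and follows links inside them. So they can be optimized in the usual ways: descriptive filenames, a title set in the PDF's document properties, and an HTML landing page per report. If HTML versions are added later, a documented way to consolidate signals onto them is a rel="canonical" HTTP header on the PDF, sketched here for Apache with mod_headers (domain and paths are hypothetical):

```apache
# .htaccess sketch: point a PDF report at its HTML landing page via an
# HTTP Link header; Google honours rel="canonical" sent this way for
# non-HTML documents. Requires mod_headers.
<Files "economic-forecast.pdf">
  Header set Link '<http://www.example.com/reports/economic-forecast/>; rel="canonical"'
</Files>
```

That way the PDF stays available for download, but ranking signals accrue to the HTML page you actually want to rank.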