Will a Google manual action affect all new links, too?
-
I have had a Google manual action (Unnatural links to your site; affects: all) that was spurred on by a PRWeb press release where publishers took it upon themselves to remove the embedded "nofollow" tags on links. I have been spending the past few weeks cleaning things up and have submitted a second pass at a reconsideration request. In the meantime, I have been creating new content, boosting social activity, guest blogging and working with other publishers to generate more natural inbound links.
My question is this: knowing that this manual action affects "all," are the new links that I am building being negatively tainted as well? When the penalty is lifted, will they regain their strength? Is there any hope of my rankings improving while the penalty is in effect?
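For anyone who wants to check their own syndicated copies, here is a minimal sketch of that kind of spot-check (the domain and URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed); it simply reports which copies stripped the "nofollow" attribute from links back to your site:

```python
# Hedged sketch: flag syndicated press-release copies that dropped rel="nofollow".
# MY_DOMAIN and SYNDICATED_COPIES are placeholders, not real data from this thread.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"
SYNDICATED_COPIES = [
    "https://publisher.example/press-release-copy",  # hypothetical republished URL
]

for url in SYNDICATED_COPIES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        if MY_DOMAIN in a["href"]:
            rel = a.get("rel") or []  # bs4 returns rel as a list when present
            status = "nofollow kept" if "nofollow" in rel else "nofollow STRIPPED"
            print(f"{url} -> {a['href']}: {status}")
```

A list of the copies that stripped the attribute also makes useful evidence to attach to a reconsideration request.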
-
Hi Maria
What do you mean by "Low quality directory made just for the purpose of gaining a link" -- Is there an issue with linking back from directories to your site?
Does this apply to submitting my website to social bookmark websites using a specific anchor text that I am optimizing for?
Thanks
James
-
Hi Michael,
"You are correct that it wasn't a single press release but 3-4 that all had the same circumstances."
It's quite unlikely that a few press releases are the sole cause of your penalty, although it is possible. But I think you may have more links to clean up. Here are two examples:
http://healthmad.com/health/when-las-vegas-gets-the-best-of-you-6-ways-to-get-back-on-your-feet-in-sin-city/ - self made article
http://www.cannylink.com/healthhospitaldirectories.htm - Low quality directory made just for the purpose of gaining a link
-
Interesting. I hadn't seen these links before and have never purchased links. I'll download the list from Open Site Explorer, review it, and disavow these and similar links. Thanks for pointing these out!
-
Agreed. Ordinarily it wouldn't matter, but once subject to manual review they would be.
-
Looking in Open Site Explorer, I'm seeing several suspicious links in the report. These are links from sites that have nothing to do with medicine whatsoever, all with targeted keywords in the anchor text. When I click to view the page and look for the actual links, I'm not seeing anything. So, it seems the links are no longer there.
If the report from Open Site Explorer is correct, it looks to me like someone was purchasing links and has now removed them. Did you purchase links?
Some of the suspicious links are:
- afghan-network.net/Bookshop/persian-books.html
- learnscratch.org/resources/why-learn-scratch
- www.tiltshift.com/
If these links are also in Google's link profile, I could see why the site is penalized.
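For what it's worth, here is a rough sketch of how those pages could be re-checked automatically (requests and beautifulsoup4 are assumed installed, and the target domain is a placeholder); it fetches each flagged page and reports whether a link to the domain is still present:

```python
# Hedged sketch: check whether pages from a backlink report still link to the site.
# TARGET_DOMAIN is a placeholder; SUSPICIOUS_PAGES mirrors the URLs listed above.
import requests
from bs4 import BeautifulSoup

TARGET_DOMAIN = "example.com"
SUSPICIOUS_PAGES = [
    "http://afghan-network.net/Bookshop/persian-books.html",
    "http://learnscratch.org/resources/why-learn-scratch",
    "http://www.tiltshift.com/",
]

for page in SUSPICIOUS_PAGES:
    try:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    except requests.RequestException as exc:
        print(f"{page}: could not fetch ({exc})")
        continue
    still_linked = any(
        TARGET_DOMAIN in a["href"] for a in soup.find_all("a", href=True)
    )
    print(f"{page}: {'link still present' if still_linked else 'no link found'}")
```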
-
I suspect you missed some and Google are being, well... Google.
Ahrefs offer a 7-day money-back guarantee, and you can even find a 50% off coupon for the first month. You may need to check Majestic as well.
No single tool will find all the links, unfortunately.
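If it helps, here is a hedged sketch of merging the exports from those tools into one deduplicated list to work from (the file names and column headers are assumptions; adjust them to whatever each tool actually puts in its CSV):

```python
# Hedged sketch: combine backlink exports from several tools into one list.
# The file names and source-URL column names below are assumptions.
import csv

EXPORTS = {
    "ahrefs.csv": "Referring Page URL",
    "majestic.csv": "SourceURL",
    "opensiteexplorer.csv": "URL",
}

all_sources = set()
for filename, column in EXPORTS.items():
    with open(filename, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = (row.get(column) or "").strip()
            if url:
                all_sources.add(url)

with open("combined_backlinks.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(all_sources)) + "\n")

print(f"{len(all_sources)} unique referring URLs across all exports")
```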
-
Agreed. But given that I had those removed in short order, and it has been several weeks since the link counts dropped considerably, is there any reason why Google wouldn't have removed the manual action? I am essentially back to a pre-PRWeb profile.
-
Just looking a bit more, you could have been flagged for a manual check because, from around the beginning of August, you had a huge spike in links. Based on Matt Cutts' previous statements about PRWeb, Google would have seen it as possibly spammy.
From August you went up to nearly 125 referring domains before dropping back down to 36 now. Prior to PRWeb, you were at around 30 referring domains. I suspect this spike is what triggered the manual review.
-
I don't know if that makes me feel better or not, but you basically confirmed my thoughts. I may do what you suggest and disavow everything, but I am going to try one more time and cut much more deeply with actual link removal first.
Meanwhile, of course, I am top 5 for all my major terms in Bing and Yahoo. Joy!
Thanks
-
I have to say, on my first quick look I cannot determine why you would have received a manual penalty. Your link profile does not look spammy, and I wonder if Google is specifically targeting sites that use PRWeb.
With Ahrefs I only see 77 dofollow backlinks, and to be honest you could probably be very brutal when it comes to disavowing these links and starting again.
It is strange that the two link building methods you used (PRWeb and infographics) are both methods that Matt Cutts has recently (in the last few months) said should carry nofollow links.
But I cannot give anything definitive based on what I am seeing.
-
I disavowed on the same day I submitted a reconsideration request, and I did also include it in my documentation. I also included the multiple emails to publishers and contact form submissions, as recommended to me.
-
Sure. http://www.urgentcarelocations.com
I just added the footer links to each state profile this week and can see how those could be considered "spammy." They weren't supposed to be implemented with "urgent care" after every one of them. I doubt that is the issue here, however, given that Google keeps referring to unnatural links.
-
Sometimes it takes a little while for the disavow tool to remove links. So, you may need to give it some time if you just did that. You can always include the disavow request in the documents for your reconsideration request. Beyond that, I'd take a closer look at your other links to see if there are other links causing an issue.
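As a reference, the disavow file itself is just a plain text upload: one URL or one domain: entry per line, with # for comment lines. Here is a minimal sketch that assembles a domain-level file (the domains listed are purely illustrative, not links from this thread):

```python
# Hedged sketch: build a domain-level disavow.txt. The domains are illustrative only.
domains_to_disavow = [
    "spammy-directory.example",  # hypothetical low-quality directory
    "article-farm.example",      # hypothetical self-made article site
]

lines = ["# Disavow file prepared alongside the reconsideration documentation"]
lines += [f"domain:{d}" for d in sorted(set(domains_to_disavow))]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```

Disavowing at the domain level also catches URLs on the same site that the link reports may have missed.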
-
Thinking about it Kurt, I have to agree that it is odd that a manual penalty has arisen from this. Michael, if you would like to share a link to your site, perhaps we can have a look and see if there is something obvious happening.
-
I have disavowed the URLs now. The major offender was streetinsider.com. I was able to remove URLs on two other offending publisher sites. Even with the disavow, however, Google didn't remove the manual action. Going to try out removeem.com to see if their tools/service can assist.
-
Bummer about the rejection. You said that you were having trouble getting the press releases removed (and I assume the links), have you disavowed those links?
-
Thanks Kurt. You are correct that it wasn't a single press release but 3-4 that all had the same circumstances. In fact, it was the same 2-3 publishers that removed the nofollow tags. The real crummy thing is that those publishers refuse to remove the links so I am having to resort to disavowing them.
While I have been working through a couple of reconsideration requests, I have built some pretty strong links, but Google seems to have capped me at page 5.
I actually got a negative response back from Google this morning following my latest reconsideration request. It provided no specifics as it did in the past, only that my "site violates Google's quality guidelines," and it references the manual action of "Unnatural links to your site." I'm on round three now. I only have about 300 total inbound links, nearly all of which are purely natural or nofollow. What a mess...
-
That stinks that those publishers did that. I'm a little suspicious that this would happen from a single press release. Usually, it takes Google a bit more than that to trip a manual action. Are you sure there aren't other links, maybe other press releases, that are suspicious? I only ask because Google usually responds to a pattern of manipulation, not a single action.
In regard to your actual question, natural links typically aren't "tainted" by a previous penalty. In fact, it would probably work the exact opposite way. With a manual action, Google believes you are trying to manipulate them. In order to get the manual action removed, Google is looking for you to clean up the old links, apologize, and demonstrate that you have changed your ways. So, earning new links that are completely natural demonstrates exactly that.
Kurt Steinbrueck
OurChurch.Com
-
Yes. The publishers (streetinsider.com, amongst others) are technically violating PRWeb's copyright terms, as they are altering the content prior to publishing. PRWeb isn't very happy but has been unsuccessful at getting the articles removed (which isn't helping my reconsideration request).
-
Ditto. I saw a competitor use PRWeb and was tempted. However, I felt the potential for spammy links was not the direction I wanted my SEO to go in.
This just reinforces the issue.
-
manual action ... that was spurred on by a PRWeb press release where publishers took it upon themselves to remove the embedded "nofollow" tags on links.
Seriously? I've been thinking about trying PRWeb for product announcements but this makes me rethink that strategy.
Related Questions
-
Google is alternating what link it likes to rank on wordpress site and
Hi there, I'm experiencing a problem where Google is picking and choosing different link structures to rank my WordPress site for my main keywords. The site had pretty good #1 rankings for a long time, but recently I noticed Google is choosing to rank the page in one of two ways. Let me just say that the original way, where it held good rankings, looked like this, for example: flowers.com/the-most-beautiful-wedding-bouquets/ (this is just an example, it is not my site). And when Google decides to switch it up, it uses this link structure: flowers.com > weddings (this still points to flowers.com/the-most-beautiful-wedding-bouquets when I hover my mouse over it); however, this link structure, which never appeared before and now does, usually has much lower rankings. Please note it's not both link structures being ranked at the same time for the keywords. It's one or the other that Google is currently alternating in ranking, and I believe it's hurting the site's position.
I'm not sure if this is a WordPress setting that's gone wrong or what the problem is, but I do know that when Google shows the expanded, descriptive link structure flowers.com/the-most-beautiful-wedding-bouquets the rankings are higher and in 2nd place. I'm hoping that by rectifying this I can regain my position. I'm very grateful for any insight you could offer on why this is happening and how I could fix it. Thank you. PS: The WordPress site has several SEO plugins.
Intermediate & Advanced SEO
-
Links: Links come from bizarre pages
Hi all, my question is related to links that I saw in Google Search Console. While looking at who is linking to my site, I saw that GSC lists some links coming from third-party websites, but these third-party webpages are not indexed and were not even put up by their owners. It looks like the owners never created these pages; they are not indexed (when you do a site: search in Google), but the URLs still load content in the browser. Example: www.samplesite1.com/fakefolder/fakeurl. What exactly is this thing? To add more detail, the third-party website in question is a WordPress website and I guess it has probably been hijacked. But how does someone even get these types of pages/URLs up and running on someone else's website and then link out to other websites? I am concerned, as the content that I am getting a link from is adult content, and I will have to do some link cleansing soon.
Intermediate & Advanced SEO
-
Multiple sitewide (deep)links devalued by Google?
In my experience sitewide links can still be very powerful if used sensibly and in moderation. However, I'm finding that sitewide text blocks with 2 or 3 (deep)links to a single domain appear not to be working that well or not at all in raising the authority of those target pages. Anyone having the same experience? In your experience, is the link value diminished when there are multiple deeplinks to a single domain in a sitewide text area? Is anything more than 1 link per target domain bad? Or could it even be that it's not so much the number of deeplinks to a single domain that matter, but purely the fact that they are sitewide "deeplinks"? Are sitewide deeplinks treated differently than sitewide links linking to an external homepage? Very interested in hearing your personal experience on this matter. Factual experience would be best, but "gut feeling" experience is also appreciated 🙂 Best regards, Joost
Intermediate & Advanced SEO
-
Does Google detect all updated pages with new links as paid links?
Example: A PR 4 page updates the page a year later with new links. Does Google discredit these links as being fishy?
Intermediate & Advanced SEO
-
Will Google Revisit a 403 Page
Hi, We've got some pretty strict anti-scraping logic in our website, and it seems we accidentally snared a Googlebot with it. About 100 URL requests were responded to with a 403 Forbidden error. The logic has since been updated, so this should not happen again. I was just wondering if/when Googlebot will come back and try those URLs again. They are linked from other pages on the site, and they are also in our sitemap. Thanks in advance for any assistance.
Intermediate & Advanced SEO
-
How do I get rid of all the 404 errors in Google Webmaster Tools after building a new website under the same domain
I recently launched my new website under the same domain as the old one. I did all the important 301 redirects, but it seems like every URL that was in Google's index is still there, only now with a 404 error code. How can I get rid of this problem? For example, if you google my company name 'romancing diamonds', half the links under the name are 404 errors. Look at my Webmaster Tools and you'll see the same thing. Is there any way to remove all those previous URLs from Google's index and start anew? Shawn
Intermediate & Advanced SEO
-
Sitewide blog link and Article links
Hi guys, I just wanted to give you all a heads up on something I adjusted recently that worked really well, and wanted to ask for your own experiences on this. 1. We have a blog that adds regular content, and within the blog we link from the keyword we are targeting. Standard stuff, right! We were struggling for movement on a keyword, so I removed the links from the articles and added a link on the sitewide blogroll. The link on the blogroll included the keyword but was a longer, descriptive link. Lo and behold, we got a first-page listing when we changed it; the change in ranking came a few days later. I have always been given the impression that sitewide isn't that great? So explain this one. Of course there are many other factors etc 🙂 What are your experiences and thoughts on what happened here?
Intermediate & Advanced SEO
-
10,000 New Pages of New Content - Should I Block in Robots.txt?
I'm almost ready to launch a redesign of a client's website. The new site has over 10,000 new product pages, which contain unique product descriptions, but do feature some similar text to other products throughout the site. An example of the page similarities would be the following two products: Brown leather 2 seat sofa Brown leather 4 seat corner sofa Obviously, the products are different, but the pages feature very similar terms and phrases. I'm worried that the Panda update will mean that these pages are sand-boxed and/or penalised. Would you block the new pages? Add them gradually? What would you recommend in this situation?
Intermediate & Advanced SEO