Do industry partner links violate Google's policies?
-
We're in the process of The Great _Inquisition_: piecing together a reconsideration request. In doing so, we reached out to an agency to filter our backlinks and flag each one as safe, should be nofollowed, or should be removed. The problem is, they flagged several of our earned industry partner links (like those pointing to us, HireAHelper, from 1-800-Pack-Rat and PODS, for example) as either "should be nofollowed" or "should be removed." I have a hard time believing Google would penalize such a natural source of earned links, but then again, this is our second attempt at a reconsideration request, and I want to cover all my bases. What say you, Moz community? Nofollow? Remove? Leave alone?
-
Hi Daniel,
Whether these links are all okay or should be removed depends on what else those sites link to, and what else they get up to besides linking to you. If they have been flagged for spam tactics (outbound links, inbound links, on-page spam, etc.), you'd want to avoid having them link to you, even if they are otherwise genuine industry partners. Sadly, some legitimate businesses run less-than-clean websites from time to time.
I would ask the agency that provided your link report to explain why they placed some of these industry partners in a "remove" category. They may have some very good reasons, or they may have mistaken the intent of the links. Even if they are mistaken, both you and the agency need to ask yourselves whether there's a chance Google might also mistake these genuine links for manipulative or unnatural ones. Unfortunately, that can happen as well, but if you are filing for reconsideration, you can always explain that links x, y, and z arose from a mutual respect / partnership that carries no commercial benefit to either company in direct relation to the link.
Google has been extremely authoritarian about links over the last few months, and there's a possibility they'd say a partnership link wasn't "natural" because it had commercial intent. Sometimes it's damn hard to figure out exactly what they mean by "natural." It's incredibly frustrating.
However, backing up to where you are right now: you need an explanation and a thorough analysis of why genuine links have been flagged. You never know, the agency might have found something that will actually save your next reconsideration request.
-
Do they bring traffic? Does that traffic convert?
If yes, then making them nofollow won't actually hurt you very much at all!
It seems counter-intuitive, I know, but better safe than sorry.
Good luck,
Amelia
-
Getting links from industry partners makes complete sense to me, and they shouldn't be the links that hurt your rankings unless those partners are themselves going through some kind of penalty. Ideally, Google should only penalize websites that violate its guidelines.
-
I think industry partner links are fair game. They're also probably your heavy hitters, meaning removing them would likely hurt the most.
Related Questions
-
What does Google's Spammy Structured Markup Penalty consist of?
Hey everybody,
I'm confused about the Spammy Structured Markup penalty: "This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines." Does this mean the rich elements are simply removed from the snippets? Or will there be an actual drop in rankings? Can someone here tell from experience? Thanks for your help!
White Hat / Black Hat SEO | klaver
-
Are All Paid Links and Submissions Bad?
My company was recently approached by a website dedicated to delivering information and insights about our industry. They asked us if we wanted to pay for a "company profile" in which they would summarize our company, add a followed link to our site, and promote a giveaway for us. This website is very authoritative and definitely provides value to its audience. How can this website get away with paid submissions like this? Doesn't that go against everything Google preaches? If I were to pay for a profile with them, should I request a "nofollow" link back to my site?
White Hat / Black Hat SEO | jampaper
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others.
The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized above bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS has a solution to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.
Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.
The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
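For what it's worth, the load-based gate described above could be sketched roughly like this (a minimal sketch, assuming a Unix server; `LOAD_LIMIT`, the bot pattern, and the `Retry-After` value are all hypothetical knobs you'd tune, not anything the poster specified):

```python
import os
import re

# Hypothetical tuning values -- adjust to the server's actual capacity.
BOT_PATTERN = re.compile(r"bingbot|ahrefsbot|googlebot", re.IGNORECASE)
LOAD_LIMIT = 4.0  # 1-minute load average above which bots get throttled

def should_throttle(user_agent, load_avg):
    """Return True when a bot request should be answered with a 503."""
    is_bot = bool(BOT_PATTERN.search(user_agent or ""))
    return is_bot and load_avg > LOAD_LIMIT

def handle_request(user_agent):
    """Central gate shared by every site on the server.

    Uses total server load (os.getloadavg), not per-site traffic,
    so it addresses points 2) and 3) from the question.
    """
    one_minute_load, _, _ = os.getloadavg()
    if should_throttle(user_agent, one_minute_load):
        # Retry-After tells well-behaved crawlers when to come back,
        # so the 503 reads as "temporarily busy" rather than "broken".
        return 503, {"Retry-After": "120"}
    return 200, {}
```

Because the decision is made from the shared load average, one hot site automatically throttles bots across all sites on the box, which is the centrally managed behavior the question asks for.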
White Hat / Black Hat SEO | internetwerkNU
-
What's up with Google scrapping keyword metrics?
I've done a bit of reading on Google now "scrapping" the keyword metrics from Analytics, and I am trying to understand why the hell they would do that. To force people to run multiple AdWords campaigns to set up different keyword scenarios? It just doesn't make sense to me. If I am a blogger or I run an ecommerce site, and I get a lot of visits to a particular post through a keyword people clicked on organically, why would Google want to hide this from us? It's great data for us to carry on writing relevant content that appeals to people and therefore serves the needs of those same people. There is the idea of doing white hat SEO and focusing on getting strong links and great content, but how do we know we have great content if we can't see what is appealing to people in terms of keywords and how they found us organically? Is Google trying to squash SEO as a profession? What do you guys think?
White Hat / Black Hat SEO | theseolab
-
Is it bad to nofollow all external links at the same time?
I am working on more than 40 EMDs. They are good-quality brand sites, but they are all interlinked through footer links and sidebar links (and they don't have many linking root domains). Some of those sites have been renovated with new templates, and the new sites have very few external links (links going out to our own sites), but some of the old sites have hundreds of external links (all of which, of course, link to our own sites). Anyway, we are planning to nofollow all those external links slowly to avoid a penalty. The question is: can it be bad to apply nofollow to all those links on all those sites at the same time? Will Google see it as something fishy? (I don't think so.) Also, is it a good strategy to nofollow all of them? (I think it is.) What do you guys think?
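If you do decide to roll the change out gradually, one way to stagger it could look like the sketch below (the `batches` helper and the per-week cadence are hypothetical illustrations, not anything Google prescribes): split the full link list into fixed-size chunks and nofollow one chunk per maintenance window.

```python
from itertools import islice

def batches(links, per_batch):
    """Yield the external links in fixed-size chunks so the nofollow
    rollout happens gradually rather than all at once."""
    it = iter(links)
    while chunk := list(islice(it, per_batch)):
        yield chunk

# Example: 10 links, nofollowed 4 per week -> 3 weekly batches.
weekly = list(batches([f"https://example.com/link{i}" for i in range(10)], 4))
```

Each batch's links would then be updated to `rel="nofollow"` in the templates during that window, leaving the rest untouched until the next one.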
White Hat / Black Hat SEO | Personnel_Concept
-
How do I place the product link on my blog?
I have a shop and also a blog where I explain the products on the site in more detail: how to use them, tips, recipes, and more. How do I place the product link on my blog? Should I use a nofollow link? Should I not add a link at all? Should the link use anchor text, or just the page URL? Or do I not need to worry about it?
White Hat / Black Hat SEO | soulmktpro
-
Linking Profile Gone Bad?!
Recently, I was looking over the linking profile for one of our large clients, and I noticed that a ton of spammy links were appearing. I have never purchased any links or done anything shady that would contribute to this large increase in bad links. It appears as though someone is trying to hijack the SEO of this company, and I don't know how to proceed. Currently, they have not been penalized by Google, but I would not be surprised if a penalty is on its way due to the obvious link spam. Is there any way to report this to Google to ensure that no penalties occur? Any advice on the issue is much welcomed! Thanks
White Hat / Black Hat SEO | tqinet