Is it bad for SEO to put a backlink in the footer of our clients' websites?
-
Hello there!
Everything is in the subject of this post, but here is the context: we are a web agency and, among other things, we build websites for our clients (most of them are shops). Until now, we have put a link in their footer, like "Developed by MyWebShop".
But we don't know whether this is good or bad. With only one client's website we can get a hundred backlinks at once, but is that good for SEO? Will Google penalize us, thinking it is a black-hat practice? Would it be better to put our link in the "legal notices" or "disclaimer" section of the websites?
What is the best practice for lasting SEO?
I hope my question is clear.
Thank you in advance!
-
I asked a similar question here: http://www.seomoz.org/q/site-wide-footer-links-or-single-website-credits-page and it has some possibly useful answers. I'd be interested to hear your views on whether to: A) create site-wide footer links on all pages of my client sites (with varying anchor text), B) just create a "Website Credits" page and include it in the sitemap, or C) create site-wide footer links to the website credits page, and link from that back to my site. I look forward to hearing your views.
-
I'm actually not sure I agree. From a theoretical, PageRank-passing perspective, sitewide links are better. From a penalty/risk perspective, thousands of sitewide links can mean a ton of links coming from very few unique domains, which can start to look suspicious. I actually think you might see less devaluation by limiting the footer links to a couple of strong pages on each client site.
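As a rough illustration of that "couple of strong pages" approach, here is a minimal Python sketch of how a site generator could append the agency credit only to an allowlist of pages. The page names and agency URL are made up for the example:

```python
# Hypothetical allowlist: the client's strongest pages get the credit link;
# everything else gets a plain footer, avoiding thousands of sitewide links.
CREDIT_PAGES = {"index.html", "about.html"}
AGENCY_LINK = '<a href="https://www.mywebshop.example">Developed by MyWebShop</a>'

def footer_html(page_name: str) -> str:
    """Build a page footer, appending the agency credit only on allowlisted pages."""
    footer = "<footer>&copy; Client Shop"
    if page_name in CREDIT_PAGES:
        footer += " | " + AGENCY_LINK
    return footer + "</footer>"
```

The same idea works in any CMS templating system: gate the credit link on a small, deliberate list of pages rather than emitting it sitewide.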
-
Practically, I think Julie is right, but I have seen heavy devaluation of these footer links in the past year or two. They'll still count for something, but not a lot. The only warning I'd add is that I wouldn't create a situation where these are your ONLY links. You could risk looking like a link farm and even a potential penalty at that point. These easy links should be only one part of your link-building strategy.
I'd also highly encourage diversity. Mix up the anchor text, as long as it's relevant, and maybe even put the link different places. If you can get contextual links somehow (not footers or sidebars), that's a huge plus. The more you can mix it up, the better.
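To make the diversity point concrete, here is a small Python sketch that rotates the anchor text of the credit link instead of repeating one keyword-heavy phrase on every client site. All the anchor variants and the URL are hypothetical:

```python
import random

# Hypothetical branded and descriptive anchor variants; mixing these up
# avoids thousands of identical-anchor footer links pointing at one site.
ANCHOR_VARIANTS = [
    "MyWebShop",
    "Website by MyWebShop",
    "Web design: MyWebShop",
    "Built by MyWebShop",
]

def credit_link(url: str = "https://www.mywebshop.example") -> str:
    """Return a footer credit link with a randomly chosen anchor text."""
    anchor = random.choice(ANCHOR_VARIANTS)
    return f'<a href="{url}">{anchor}</a>'
```

In practice an agency would pick one variant per client at build time rather than per request, but the principle is the same: vary the anchor text across the link profile.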
-
From a pure SEO perspective, it's better to have the link in the footer appearing on each page.
-
Speaking personally, I'm not in favor of it, but more from an appearance perspective. I've seen a lot of cases where this is abused by smaller operations that aren't taking their customers' overall outbound link profiles into account. We've inherited projects where the previous designer put about 100 words into the META author tag, spamming his keywords, and on top of that put at least a paragraph of ALT text on his footer link. The client didn't even know it was there, or what it meant.
I also think it detracts from the appearance and professionalism of larger clients' sites. Personally, I'm moving towards either very subtle, small center-footer links, with the full knowledge of the client, or a paragraph and link on the About Us/Partners page. Note this is my opinion on what we're doing and not meant as an indictment of anyone else's practices.
-
Thanks for the enlightenment!
I wonder if it wouldn't be better (from a pure SEO perspective) to only put a link on the credits page, for example?
-
This is standard practice for almost all web design agencies. Giant blog platforms like WordPress and Blogger also put in a credit link by default. The presence of a credit link in the footer, like the one you described, will not hurt you in any way.
There is some debate about whether or not it will help you at all (I think it will -- footer links are greatly discounted, but still seem to count for something) but it won't hurt, and it makes sense from a branding/advertising perspective.
-
I know that with the Panda 3.3 update this past week, there has been some change to the way Google interprets backlinks. So I'll be curious what other people's opinions are. Personally, I wouldn't put a link in the footer of client sites... just my opinion.