Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is it harmful to put a backlink in the footer of our clients' websites?
-
Hello there!
Everything is in the subject of this post, but here is the context: we are a web agency and, among other things, we build websites for our clients (most of them are shops). Until now, we have put a link in their footer, like "developed by MyWebShop".
But we don't know whether this is bad or not. With a single website we can gain a hundred backlinks at once, but is that good for SEO or not? Will Google penalize us, treating it as a black-hat practice? Would it be better to put our link in the "legal notices" or "disclaimer" section of the websites instead?
What is the best practice for lasting SEO?
I hope you understand my question.
Thank you in advance!
-
I asked a similar question here: http://www.seomoz.org/q/site-wide-footer-links-or-single-website-credits-page and it has some possibly useful answers. I'd be interested to hear your views on whether to:
A) create site-wide footer links on all pages of my client sites (with varying anchor text)
B) just create a "Website Credits" page and include it in the sitemap
C) create site-wide footer links to the website credits page, and link from that back to my site
I look forward to hearing your views.
-
I'm actually not sure I agree. From a theoretical, PageRank-passing perspective, sitewide links are better. From a penalty/risk perspective, 1000s of sitewide links can lead to a ton of links coming from very few unique domains, which can start to look suspicious. I actually think you might see less devaluation by limiting the footer links to a couple of strong pages on each client site.
-
Practically, I think Julie is right, but I have seen heavy devaluation of these footer links in the past year or two. They'll still count for something, but not a lot. The only warning I'd add is that I wouldn't create a situation where these are your ONLY links. You could risk looking like a link farm and even a potential penalty at that point. These easy links should be only one part of your link-building strategy.
I'd also highly encourage diversity. Mix up the anchor text, as long as it's relevant, and maybe even put the link different places. If you can get contextual links somehow (not footers or sidebars), that's a huge plus. The more you can mix it up, the better.
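For what it's worth, here is a minimal sketch of the kind of footer credit being discussed. The agency name, URL, and anchor text are hypothetical placeholders; the comment marks the part you would vary from client to client, per the diversity advice above:

```html
<!-- Hypothetical footer credit: agency name and URL are placeholders. -->
<footer>
  <p class="site-credit">
    <!-- Vary the anchor text per client, e.g. "web design by MyWebShop",
         "site built by MyWebShop", rather than repeating one exact phrase. -->
    Website <a href="https://www.mywebshop.example/">developed by MyWebShop</a>
  </p>
</footer>
```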
-
From a pure SEO perspective, it's better to have the link in the footer appearing on each page.
-
Speaking personally, I'm not in favor of it, but more from an appearance perspective. I've seen a lot of cases where this is abused by smaller operations that don't take their customers' overall outbound link profiles into account. We've inherited projects where the previous designer stuffed about 100 words of keywords into the META author tag, and then put at least a paragraph of ALT text on his footer link. The client didn't even know it was there, or what it meant.
I also think it detracts from the appearance and professionalism of larger clients' sites. Personally, I'm moving toward either very subtle, small center-footer links, with the full knowledge of the client, or a paragraph and link on the About Us/Partners page. Note this is my opinion on what we're doing and not meant as an indictment of anyone else's practices.
-
Thanks for the enlightenment!
I wonder if it wouldn't be better (from a pure SEO perspective) to only put a link on the credits page, for example?
-
This is standard practice for almost all web design agencies. Giant blog platforms like WordPress and Blogger also include a credit link by default. The presence of a credit link like the one you described in the footer will not hurt you in any way.
There is some debate about whether it will help you at all (I think it will -- footer links are heavily discounted, but still seem to count for something), but it won't hurt, and it makes sense from a branding/advertising perspective.
-
I know that with the Panda 3.3 update this past week, there has been some change to the way Google interprets backlinks, so I'll be curious what other people's opinions are. Personally, I wouldn't put a link in the footer of client sites, but that's just my opinion.