Keyword links in footer
-
Hi - I am trying to help a site get out from under a Google manual action penalty, listed as "Partial Matches - Unnatural Links to Site".
I am checking through their links. The site that links to them most is a local directory-style site - it has 2,682 links back to one page (the home page). The directory site was built by the same web company that built my client's site, and they put a keyword link in the footer of the directory site - the keyword was "Buy Truffles". All my instincts say that is a bad thing!
But this is what is perplexing me - they are ranking no. 1 for that keyword! Whereas they have lost rankings (i.e. not in the top 50) for all the other keywords they were targeting. So I don't get it! Can anyone explain why this is? I feel I should get that link removed, but I don't want to take out their only ranking keyword!
Webmaster Tools shows that about 55 different pages on the directory site have a link back to my client. Hope you can help.
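For reference, a rough sketch of how a backlink export (e.g. a CSV download from Webmaster Tools or a link tool) could be sliced to see which target pages and anchor texts dominate from a single linking domain. The column names and the directory domain below are assumptions for illustration, not any specific tool's format:

```python
import pandas as pd

# Assumed columns: source_url, target_url, anchor_text (adjust to the real export)
links = pd.read_csv("backlinks.csv")

# Restrict to the one directory-style site, then count links per target page
# and per anchor text to see how concentrated the link profile is.
directory = links[links["source_url"].str.contains("example-directory.co.uk", na=False)]
print(directory.groupby("target_url").size().sort_values(ascending=False).head(10))
print(directory.groupby("anchor_text").size().sort_values(ascending=False).head(10))
```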
Cheers - Steve
-
Ahh I see Steve.
So there is a partial penalty on the site? If this link is not the culprit, I would search out those that are. Detoxing is certainly an eye-opener, and you will most definitely find work to do there.
Andy
-
Hi Andy - thanks for the quick reply and for confirming what I thought about keyword links in the footer.
But I can see that my question wasn't phrased well enough. Where I said "...this is what is perplexing me - they are ranking no. 1 for that keyword!", it is actually my client's site that is ranking no. 1 for that keyword, which is embedded in the footer of the directory site.
This is what doesn't make sense. I'd have thought that Google would have stomped on that as one of the first things they did. Instead it's the only keyword they have that is still ranking.
Just to say that I have run a full test on all the backlinks using Link Detox, and I am now also going through and manually reviewing each link. Time-consuming, but also quite educational.
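For anyone doing a similar manual pass, here is a rough sketch of the kind of per-link check involved - confirming the link is still live, what anchor text it uses, and whether it sits inside a <footer> element. The URLs are placeholders, and the footer check assumes the linking site actually uses a <footer> tag (many use a styled div instead):

```python
import requests
from bs4 import BeautifulSoup

def inspect_link(linking_page, client_domain):
    """Return every link on linking_page that points at client_domain."""
    soup = BeautifulSoup(requests.get(linking_page, timeout=10).text, "html.parser")
    found = []
    for a in soup.find_all("a", href=True):
        if client_domain in a["href"]:
            found.append({
                "href": a["href"],
                "anchor": a.get_text(strip=True),
                # Only detects semantic <footer> elements, not div-based footers
                "in_footer": a.find_parent("footer") is not None,
            })
    return found

# Example (placeholder URLs):
# print(inspect_link("http://example-directory.co.uk/some-page/", "example-client.co.uk"))
```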
Thanks again.
-
I think you will find your gut feeling here is correct - get this removed. Matt Cutts has actually said that widget and footer links are not what they are looking for. But have you actually performed a full test on all the backlinks to see what sort of state everything else coming back to the site is in?
For now, try not to focus on what this other site is ranking for - there is a reason Google has given them that placement. You need to disavow these dodgy links or have them removed. Google is focusing heavily on links right now, so it is important to remain whiter-than-white in this respect.
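As a rough sketch of what a disavow file looks like in the format Google's disavow tool accepts - one full URL or one "domain:" entry per line, with "#" for comments - the entries below are placeholders, not a recommendation for any specific domain:

```python
# Placeholder lists - these would come out of the link review, not be hard-coded.
bad_domains = ["example-directory.co.uk"]
bad_urls = ["http://example.com/spammy-page/"]

lines = ["# Links identified during the manual link review"]
lines += [f"domain:{d}" for d in bad_domains]   # disavows every link from the domain
lines += bad_urls                               # disavows individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```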
Andy
Related Questions
-
Curious Keyword Tags Question...
In 2012, we got hit with something... I have always assumed Panda... We have hundreds of thousands of products on our site. Prior to the traffic drop, our old site design listed a small number of keyword tags on the product pages - about 10 or so. After the site re-design, we allowed all of the keyword tags to appear on these product pages and also linked them to our search results pages. I know that one thing this did is cause a lot of these search results pages to be indexed. But our traffic has been constantly declining since then... I wonder what would happen if I just went back to the old design, with a smaller number of keyword tags listed and not linked? Any thoughts? Thanks! Craig
Technical SEO | TheCraig0
-
Will you get more 'Google juice' if your social links are in your website's header rather than its footer?
Hi team, I'm in the process of making some aesthetic changes to my website. It's getting quite cluttered, so the main purpose is to clean up its look. I currently have three social links in the header, right at the top, and I would really like to move these to the footer to remove some clutter in the header. My concern is that moving them may have an impact on the domain's ranking in Google. Website: www.mountainjade.co.nz. We've made some huge gains against our competitors over the past 6 months and I don't want to jeopardise that. Any help would be much appreciated, as I'm self-taught in SEO and have learnt through making mistakes. This time, however, with Moz, I'd rather get some advice before I make any decisions! Thanks in advance, Jake S
Technical SEO | Jacobsheehan0
-
Keyword in Domain Name
Hello! My website is www.enchantingquotes.com. I also own the domain www.enchantingwallquotes.com, which forwards to my site. About 90% of my business comes from the keyword "wall quotes". Should I consider switching to the enchantingwallquotes.com domain and redirecting? And if I do, do I need to recreate the entire website, or is there an easier way that I am overlooking? Thank you for any advice/insight!
Technical SEO | eqgirl0
-
Navigation links tagged as H3
I'm reviewing a website that has used the H3 tag in the navigation menu. I've not seen that before, and my first thought is that it is dodgy - heading tags should relate to content on the page, not link to another page. As a result of using H3 in the nav, the ratio of content wrapped in heading tags vs body content is quite high. My recommendation is to remove the H3 tags from the nav, but having searched Moz and not found an article to verify that recommendation, I thought I'd ask the question.
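There is no official metric here, but as a rough sketch, the "text inside heading tags vs total body text" ratio mentioned above could be estimated with something like this (requests + BeautifulSoup; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def heading_text_ratio(url):
    """Rough share of visible text that sits inside h1-h6 tags."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    heading_text = " ".join(h.get_text(" ", strip=True)
                            for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]))
    body_text = soup.get_text(" ", strip=True)
    return len(heading_text) / max(len(body_text), 1)

# Example (placeholder URL); compare against a few normal pages to judge "quite high".
# print(heading_text_ratio("https://example.com/"))
```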
Technical SEO | NicDale0
-
Affiliate links
Is there a best practice for linking out to affiliate URLs post-Panda? I know some believe it can be a factor.
Technical SEO | PeterM220
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/.
The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning the robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
(Attachments: moz1.PNG, moz2.PNG, moz3.PNG)
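For illustration only, a minimal sketch of the 301 behaviour described above: every request on the sub-domain is permanently redirected to the same path on the primary www domain. In practice this would live in the web server or CDN configuration rather than a standalone script:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

PRIMARY = "http://www.jump.co.za"  # primary domain from the question

class SubdomainRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # Permanent redirect, preserving the path:
        # e.g. http://m.jump.co.za/search/ipod/ -> http://www.jump.co.za/search/ipod/
        self.send_response(301)
        self.send_header("Location", PRIMARY + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), SubdomainRedirect).serve_forever()
```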
Technical SEO | JacoRoux0
-
Which version of pages should I build links to?
I'm working on the site www.qualityauditor.co.uk, which is built in Moonfruit. Moonfruit renders pages in Flash. Not ideal, I know, but it also automatically produces an HTML version of every page for those without Flash or Javascript, and for search engines. This HTML version is fairly well optimised for search engines, but sits on different URLs. For example, the page you're likely to see if browsing the site is at http://www.qualityauditor.co.uk/#/iso-9001-lead-auditor-course/4528742734. However, if you turn Javascript off, you can see the HTML version of the page here: http://www.qualityauditor.co.uk/page/4528742734
Mostly, it's the last version of the URL which appears in the Google search results for a relevant query, but not always. Plus, in Google Webmaster Tools, fetching as Googlebot only shows page content for the first version of the URL; for the second version it returns a 302 redirect to the first version.
I have two questions, really: Will these two versions of the page cause duplicate content issues? I suspect not, as the first version renders only in Flash. But will Google think the 302 redirect for people is cloaking? And which version of the URL should I be pointing new links to (bearing in mind the 302 redirect, which doesn't pass link juice)? The URLs which I see in my browser and which Google likes the look of when I 'fetch as Googlebot', or those Google shows in the search results?
Thanks folks, much appreciated! Eamon
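As a quick illustration, the server's actual response for each URL style can be checked directly. Note that anything after "#" is a fragment, handled client-side and never sent to the server, so the first-style URL is effectively a request for the homepage as far as the server is concerned:

```python
import requests

urls = [
    "http://www.qualityauditor.co.uk/page/4528742734",
    "http://www.qualityauditor.co.uk/#/iso-9001-lead-auditor-course/4528742734",
]

for url in urls:
    # allow_redirects=False so any 301/302 is shown rather than silently followed
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(url, r.status_code, r.headers.get("Location"))
```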
Technical SEO | driftnetmedia0
-
How is link juice passed to links that appear more than once on a given page?
For the sake of simplicity, let's say Page X has 100 links on it, and it has 100 points of link juice. Each page being linked to would essentially get 1 point of link juice. Right? Now let's say Page X links to Page Y 3 times and Page Z 5 times, and every other link only once. Does this mean that Page Y would get 3 "link juice points" and Page Z would get 5? Note: I know that the situation is much more complex than this, such as the devaluation of footer links, etc, etc, etc. However, I am interested to hear people's take on the above scenario, assuming all else is equal.
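As a sketch of the simplified model in the question (purely illustrative - real ranking systems do not work this literally), with repeated links each counted separately:

```python
from collections import Counter

# Page X: 100 points of juice split evenly across 100 outgoing link slots,
# with Page Y linked 3 times, Page Z linked 5 times, and 92 other single links.
juice_on_page_x = 100
links_on_page_x = ["page-y"] * 3 + ["page-z"] * 5 + [f"other-{i}" for i in range(92)]

per_link = juice_on_page_x / len(links_on_page_x)   # 1 point per link slot
per_target = Counter(links_on_page_x)

print(per_target["page-y"] * per_link)  # 3.0 points to Page Y under this model
print(per_target["page-z"] * per_link)  # 5.0 points to Page Z under this model
```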
Technical SEO | bheard0