Would 37,000 footer links from one site be the cause for our ranking drops?
-
Hey guys,
After this week's Penguin update, I've noticed that one of our clients has seen a dip in rankings.
Because of this, I've had a good look at the client's backlink profile in comparison to competitors and noticed that over 37,000 footer links have been generated from a single website, leaving us with an unhealthy balance of anchor text.
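(For anyone wanting to reproduce this kind of check: here's a minimal sketch of tallying anchor-text distribution from a backlink export. The column names, domains, and sample rows are made up; adjust them to match whatever your link tool actually exports.)

```python
from collections import Counter

def anchor_distribution(rows):
    """Tally anchor-text frequency from backlink-export rows.

    Each row is a dict with an 'anchor' key. A real export would be
    loaded with csv.DictReader, and column names vary by tool.
    """
    counts = Counter(
        row["anchor"].strip().lower() for row in rows if row.get("anchor")
    )
    total = sum(counts.values())
    # Return (anchor, count, percentage of all links), most common first.
    return [
        (anchor, n, round(100 * n / total, 1))
        for anchor, n in counts.most_common()
    ]

# A few made-up rows standing in for a 37,000-line export:
sample = [
    {"url": "http://linker.example/a", "anchor": "cheap widgets"},
    {"url": "http://linker.example/b", "anchor": "cheap widgets"},
    {"url": "http://linker.example/c", "anchor": "Example Brand"},
]
for anchor, count, pct in anchor_distribution(sample):
    print(f"{anchor}: {count} links ({pct}%)")
```

If one anchor term dominates the list, that's the kind of skew that tends to look unnatural.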
Do you guys believe this may be the cause for our ranking drops?
Would it be wise to try and contact the webmaster in question to remove the footer links?
Thanks,
Matt
-
Hi,
Continue optimizing, and don't assume the site will get back to where it was before.
I see some of the sites I optimize stuck in new positions, with only very slow movement upward.
I guess it takes a lot of work to move up again.
-
When you say a "waiting game", do you mean weeks? Months?
We've now returned to page three for the keyword.
-
Hi Tom,
This may be a waiting game.
I would assume this link source is not the only reason for the drop in rankings.
Continue with your site review: get busy creating content and clean up any other unnatural inbound links.
-
Just an update on this: after removing the footer links we're still seeing downward movement for one of the most competitive keywords we were ranking on the first page for.
As I mentioned in my last message, a couple of days ago it dropped to the third page, and now it's on page 8! Should we be looking at a reconsideration request even though we didn't receive an actual warning from Google?
-
We've now managed to remove the footer links that were coming in from the website I mentioned.
Any advice on the next stage of the process? Or is it purely a waiting game? Should we submit a reconsideration request? Will we see any immediate changes?
To give you an idea of the scale of the ranking dips: the site was originally floating around positions 6-8 for a competitive keyword but has now dropped to the third page.
-
Hey Tom,
That's great. Just the sort of confirmation I was looking for.
I'll get on to the process of removing these links now.
-
Hey there
That's really an unnatural amount, and if it looks unnatural, it's always in line for a penalty.
Has commercial anchor text been used, or is it brand? Even if it's brand, it's still a grotesque number of links, and many of the pages linking to your client's site may be low quality. If it was commercial anchor text, partial or exact match, then I'd be in little doubt this caused the penalty.
I'd get the links removed ASAP, or ask the webmaster to make the links nofollow, although that may not suffice now that the penalty has been applied.
If this coincided with the Penguin update, the likely outcome is that you won't see a recovery until the algorithm refreshes - unless we hear otherwise from Google - which is pretty frustrating. A manual penalty, on the other hand, can be appealed at any time with a reconsideration request.
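(For reference, "making the links nofollow" just means the linking site's webmaster adds `rel="nofollow"` to the anchor tags, which asks Google not to pass link equity through them. The domain and link text below are placeholders:)

```html
<!-- Footer link as it likely stands now (passes link equity): -->
<a href="http://client-site.example/">Client Brand</a>

<!-- The same link after the linking site's webmaster applies nofollow: -->
<a href="http://client-site.example/" rel="nofollow">Client Brand</a>
```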
Hope this helps.