City and state link stuffing in footer
-
A competitor has links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states, all with corresponding link text and titles that lead to pages with thin, duplicate content. They consistently rank high in the SERPs and have for years. What gives? I mean, isn't this something that should get you penalized?
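To illustrate the kind of markup I mean, here's a hypothetical sketch of such a footer (the links, titles, and locations are invented, not copied from their site):

```html
<!-- Hypothetical footer illustrating the pattern; every URL, title, and
     location here is invented, and each link leads to a thin, near-duplicate page. -->
<div class="footer-links">
  <a href="/advertising-agency-alabama" title="Alabama Advertising Agency">Alabama Advertising Agency</a>
  <a href="/advertising-agency-alaska" title="Alaska Advertising Agency">Alaska Advertising Agency</a>
  <a href="/advertising-agency-arizona" title="Arizona Advertising Agency">Arizona Advertising Agency</a>
  <!-- ...and so on for every state, county, and nearby city -->
</div>
```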
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they are in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies that succeed using these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau mention, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor achieve high rankings, it is aggravating to think it isn't preventing them, either.
Your post doesn't mention whether this business actually has physical local offices or is fully virtual, but what I have seen in cases like these is that big brands tend to get away with a great deal of stuff I would never recommend to a smaller brand. It raises the question: how can we explain this phenomenon?
In the past, I've seen folks assert that Google is soft on big brands. There could be some truth in this, but we've all seen Google take a massive whack at big-brand practices with various updates, so on its own it's an unsatisfying explanation.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the notion that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines and standards, and the question of whether that's excusable given the size of the Internet; they can't catch everything. I can see some truth in this, too, but at the same time, I don't consider Google to have taken a proactive stance on acting on public reports of bad practices. Rather, they take the approach of releasing periodic updates that are supposed to algorithmically detect foul play and penalize or filter it. Google is very tied to the ideas of big data and machine intelligence. So far, it's been an interesting journey with Google on this, but it is what has led to cases exactly like the one you're seeing: something egregiously unhelpful to human users being allowed to sit, apparently unpunished, on a website that outranks you, even when you are trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will, at some point, be caught by an update that lessens the rewards they are reaping from these practices. Until then, it's heads down, working hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained, at least in part, by a good backlink profile, an old domain/website, a large amount of content (if the content is relatively original and varied), or a niche that is more receptive to this type of content (when it's relatively common in your niche)... and other times it simply makes no sense why things like this keep working in Google for years without being automatically or manually penalized.
I've seen sites with keyword stuffing so heavy that a keyword was repeated about 500 times on the homepage, yet they ranked at the top of Google for that keyword, with nothing else, internal or external to that website, that could explain such an awesome ranking. It's so frustrating to know that Google penalizes this while some of your competitors do it with impunity and you can't, or at least shouldn't...
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen this work very well for different competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though; it still carries a heavy risk. If they are discovered, they can get a serious penalty slapped on them or, at the very least, get pushed pretty far down the SERPs. It's really just risk vs. reward. If you are like me and work for a company with a ton of revenue at stake, you might think of it like this:
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be for me to explain why I got our #1 lead generating channel penalized and blasted into purgatory.
Usually, these sites that use exact-match anchors on local terms look like garbage. So even if they are driving traffic, I often wonder how much of it actually converts, since the majority of their site looks like a collection of crappy doorway pages. It is still very frustrating to watch them succeed in the SERPs, though. I have the same issue.
You could always "try" to report them to Google directly. I don't know whether this really works, or whether anchor-text spam falls under one of their official categories, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
As a side note, I would also run their site through a tool like Majestic SEO or Ahrefs and really dig into their backlink profile. I have seen a couple of instances where spammy sites pulled off some nice links, so their success could be attributed to those as well.
Hopefully this helps. I know your pain.
-Ben
Related Questions
-
Webmaster Tools - How your data is linked?
This may be an easy question, but I can't seem to find the answer anywhere, and I never really looked into it before. In Google Webmaster Tools, the dashboard has a section that says "How Your Data Is Linked". What does that refer to? Is it just internal link anchor text, external link anchor text, or a combination of both? I am pretty sure it is a combination of both, but I just want to make sure before making some internal link changes so that the most common anchor text is no longer "Prices" and "Sign up". Thanks.
On-Page Optimization | rayvensoft
-
How many outbound links is too many outbound links?
As a part of our SEO strategy, we have been focusing on writing several high quality articles with unique content. In these articles we regularly link to other websites when they are high quality, authoritative sites. Typically, the articles are 500 words or more and have 3-5 outbound links, but in some cases there are as many as 7 or 8 outbound links. Before we get too carried away with outbound links, I wanted to get some opinions on how many outbound links we should be trying to include and more information on how the outbound links work. Do they pass our website's authority on to the other website? Could our current linking strategy cause future SEO problems? Finally, do you have any suggestions for guidelines we should be using? Thank you for your help!
On-Page Optimization | airnwater
-
On-page link question: creating an additional 'county' layer between states and zips/cities
Question: We have a large site with a page for each of the 50 states. Each of these pages has unique content, but following the content is a MASSIVE amount of links, one for every zip AND city in that state. I am also in the process of creating unique content for each of these cities and zips. HOWEVER, I was wondering: would it make sense to create an additional 'county' layer between the states and the zips/cities? Would the additional 'depth' of the links bring down the overall rank of the long-tail city and zip pages, or would knocking the on-page link count down from a thousand or so to a manageable 50-100 substantially improve the overall quality and ranking of the site? To illustrate, currently I have state -> city and zip pages (1200+ links on each state page); what I want to do is state -> county (5-300 counties on each state page) -> city + zip (maybe 50-100 links on each county page), as sketched below. What do you guys think? Am I incurring some kind of automatic penalty for having 1000+ links on a page?
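Concretely, the proposed structure might look something like this (hypothetical URLs, invented for illustration):

```html
<!-- Hypothetical three-tier structure; all paths are invented. -->

<!-- State page: links only to its counties (roughly 5-300 links) -->
<a href="/texas/travis-county/">Travis County</a>

<!-- County page: a manageable 50-100 city/zip links -->
<a href="/texas/travis-county/austin/">Austin</a>
<a href="/texas/travis-county/78701/">78701</a>
```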
On-Page Optimization | ilyaelbert
-
WBF told me to get rid of my low contrast footer links...
I just finished watching WBF, where Rand took a moment to identify some of the potentially harmful SEO practices that could be penalized in the upcoming algo update targeting over-optimization. (Great post BTW!) One of them was using low-contrast, exact-match footer links to inner pages. But I couldn't help noticing something similar being done on the SEOmoz site. In the attached image, I compare this to a site I've done using a similar practice. What are your thoughts on the footer links in this example, and how should we, as SEOs, handle footer links in the future?
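For anyone who hasn't seen the video, a 'low contrast' footer link means something like this (a hypothetical sketch, not markup from either site):

```html
<!-- Hypothetical low-contrast footer link: near-white text on a white
     background, readable by crawlers but nearly invisible to users. -->
<footer style="background: #ffffff;">
  <a href="/dallas-seo-services" style="color: #f7f7f7;">Dallas SEO Services</a>
</footer>
```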
On-Page Optimization | AlexanderAvery
-
Google found bad links: delete them or 301 redirect?
We went into our Google account and saw about 70 bad links that it found on our site. What's the best thing to do, SEO-wise: should we go into the pages that have the bad links and delete them from the HTML code, or 301-redirect them in our .htaccess file?
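For reference, here's a minimal sketch of what the .htaccess option might look like (standard Apache directives; the paths are hypothetical placeholders, not our real URLs):

```apache
# Hypothetical paths -- substitute the actual bad URLs reported in
# Webmaster Tools.

# Permanently (301) redirect a dead page to its closest live equivalent
# (uses mod_alias, enabled by default in most Apache builds):
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite, redirect a whole retired section at once:
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```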
On-Page Optimization | DerekM88
-
Do Too Many On-Page Links on a Page Really Matter?
Do too many on-page links on a page really matter, especially if they are pointing to internal pages?
On-Page Optimization | AppleCapitalGroup
-
Nofollowed internal links from the home page
Hi, I'm conducting an on-page review for someone and have noticed something I've not seen before. Some of the major internal links from the home page are marked as nofollow. For example: <a href="/customer-services" rel="nofollow">Customer Services</a>. This is in the top navigation bar, and this section and all the others like it are marked nofollow, but they should all be crawled. Is this an error, or am I missing something? Any ideas, guys? Thanks, Bush
On-Page Optimization | Bush_JSM
-
Link Product Thumb & Product Name with same anchor link?
We have an issue on one of the sites we're monitoring a campaign for: it seems to have TOO many links on each page. I think the biggest reason is that each product listing on each category page has two separate anchor links into that page, one for the thumb and one for the name. So even though there should only be 60-70 links on each category page, that amount is inflated because each product listing is technically split into two separate links. The question is, should I place the thumbnail and the name within the same anchor link, as sketched below? We do this on a lot of other sites we operate, but I'm not sure which is the better strategy. It would seem to me that it would be better to have a single anchor link shared by the thumb and the product name.
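In other words, something like this (simplified, hypothetical markup):

```html
<!-- Current: two separate anchors per product listing (hypothetical markup) -->
<a href="/product/widget"><img src="/images/widget-thumb.jpg" alt="Widget"></a>
<a href="/product/widget">Widget</a>

<!-- Proposed: one anchor wrapping both the thumbnail and the product name -->
<a href="/product/widget">
  <img src="/images/widget-thumb.jpg" alt="Widget">
  <span>Widget</span>
</a>
```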
On-Page Optimization | AarcMediaGroup