City and state link stuffing in footer
-
A competitor has footer links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states, all with corresponding link text and titles leading to pages with thin, duplicate content. They have consistently ranked high in the SERPs for years. What gives? Isn't this something that should get a site penalized?
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they're in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies succeeding with these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau are mentioning, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor to achieve high rankings, it is aggravating to think it isn't preventing them, either.
Your post doesn't mention whether this is a business model with physical local offices or a fully virtual one, but what I have seen in cases like these is that big brands tend to get away with a great deal I would never recommend to a smaller brand. It raises the question: how can we explain this phenomenon?
In the past, I've seen folks asserting that Google is soft on big brands. There could be some truth in this, but we've all seen Google take a massive whack at big brand practices with various updates, so that really makes this an unsatisfying assertion.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the idea that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines, and the question of whether that's excusable given the size of the Internet; they can't catch everything. I can see it in that light, but at the same time, Google hasn't taken a proactive stance on acting upon public reports of bad practices. Rather, they release periodic updates that are supposed to algorithmically detect foul play and penalize or filter it; Google is deeply invested in big data and machine intelligence. So far it's been an interesting journey, but it's what has led to cases exactly like the one you're seeing: something egregiously unhelpful to human users sitting, apparently unpunished, on a website that outranks you, even when you're trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will be hit by an update, at some point in the future, that lessens the rewards they're reaping from these practices. Until then, it's heads down, working hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained, at least in part, by a strong backlink profile, an old domain or website, a large amount of content (if the content is relatively original and varied), or a niche that is more receptive to this type of content (when it's relatively common in your niche)... and other times it simply makes no sense why things like this keep working in Google for years without being penalized, either algorithmically or manually.
I've seen sites with keyword stuffing so blatant that they repeat a keyword about 500 times on the homepage, and they rank at the top of Google for that keyword, with nothing internal or external to the site, apart from the stuffing itself, that could explain that ranking. It's so frustrating knowing that this is supposed to be penalized by Google while some of your competitors do it with impunity and you can't, or at least shouldn't...
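A crude but illustrative way to spot the kind of stuffing described above is to count whole-word occurrences of a phrase in a page's visible text; counts in the hundreds are a red flag. A minimal sketch using only Python's standard library (the page text, keyword, and any threshold you'd apply are invented for illustration, not taken from a real site):

```python
import re

def keyword_count(text, keyword):
    """Count whole-word, case-insensitive occurrences of a keyword."""
    pattern = r"\b" + re.escape(keyword) + r"\b"
    return len(re.findall(pattern, text, re.IGNORECASE))

# Simulated stuffed homepage copy: the keyword appears 3 times per
# sentence block, repeated 100 times.
page_text = "Plumber Madrid. Best plumber in Madrid. Call our plumber today. " * 100

print(keyword_count(page_text, "plumber"))  # 300 - far beyond natural usage
```

In practice you'd feed this the extracted body text of the competitor's homepage; the useful signal is the order of magnitude, not the exact number.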
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen this work very well for competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though. It still carries a heavy risk: if they are discovered, they can get a serious penalty slapped on them, or at the very least get pushed far down the SERPs. It's really just risk vs. reward. I work for a company with a ton of revenue at stake, so I think of it like this.
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be for me to explain why I got our #1 lead generating channel penalized and blasted into purgatory.
Usually, the sites that use these exact-match anchors on local terms look like garbage. So even if they are driving traffic, I often wonder how much of it actually converts, since the majority of the site looks like a collection of crappy doorway pages. It is still very frustrating to watch them succeed in the SERPs, though. I have the same issue.
You could always try reporting them to Google directly. I don't know whether this really works, or whether anchor-text spam falls neatly under one of their official categories, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
As a side note, I would also run their site through a tool like Majestic or Ahrefs and really dig into their backlink profile. I have seen a couple of instances where spammy sites pulled off some nice links, so their success could be attributed to those as well.
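If you'd rather sanity-check the footer itself before reaching for a paid tool, counting the anchors inside the competitor's footer markup takes only a few lines. A sketch using Python's standard-library HTML parser; the HTML snippet is invented to mirror the city/state pattern described in the question:

```python
from html.parser import HTMLParser

class FooterLinkCounter(HTMLParser):
    """Collects the href of every <a> tag that appears inside a <footer>."""
    def __init__(self):
        super().__init__()
        self.in_footer = False
        self.footer_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "footer":
            self.in_footer = True
        elif tag == "a" and self.in_footer:
            self.footer_links.append(dict(attrs).get("href", ""))

    def handle_endtag(self, tag):
        if tag == "footer":
            self.in_footer = False

# Made-up footer in the style described in the question: one link per city.
sample_html = """
<footer>
  <a href="/seo-miami-fl" title="SEO Miami FL">SEO Miami FL</a>
  <a href="/seo-tampa-fl" title="SEO Tampa FL">SEO Tampa FL</a>
  <a href="/seo-orlando-fl" title="SEO Orlando FL">SEO Orlando FL</a>
</footer>
"""

parser = FooterLinkCounter()
parser.feed(sample_html)
print(len(parser.footer_links))  # 3 - a real stuffed footer would show hundreds
```

Point it at the competitor's fetched homepage HTML and the link count (and how repetitive the hrefs are) gives you a concrete number to put next to the backlink data.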
Hopefully, this helps, I know your pain.
-Ben