City and state link stuffing in footer
-
A competitor has links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states, all with corresponding link text and titles that lead to pages with thin, duplicate content. They consistently rank high in the SERPs and have for years. What gives? I mean, isn't this something that should get you penalized?
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they are in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies that succeed using these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau mention, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor achieve high rankings, it is aggravating that it isn't hurting them, either.
Your post doesn't mention whether this business actually has physical local offices or is fully virtual, but what I have seen in cases like these is that big brands tend to get away with a great deal I would never recommend to a smaller brand. That raises the question: how can we explain this phenomenon?
In the past, I've seen folks assert that Google is soft on big brands. There could be some truth in this, but we've all seen Google take massive whacks at big-brand practices with various updates, which makes it an unsatisfying explanation.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the idea that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines and standards, and the question of whether that's excusable given the size of the Internet; they can't catch everything. I can see it in that light, but at the same time, Google hasn't taken a proactive stance on acting on public reports of bad practices. Rather, their approach is to release periodic updates that are supposed to algorithmically detect foul play and penalize or filter it. Google is deeply invested in the ideas of big data and machine intelligence. So far, it's been an interesting journey, but it is what has led to cases exactly like the one you're seeing: something egregiously unhelpful to human users sitting, apparently unpunished, on a website that outranks you, even when you are trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will, at some point, be caught by an update that lessens the rewards they are reaping from bad practices. Until then, it's heads down, working hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained in part by a good backlink profile, an old domain/website, a high amount of content (if the content is relatively original and varied), or a niche that is more receptive to this type of content (when it's relatively common in your niche)... and other times it simply makes no sense why things like this keep working in Google for years without being penalized, automatically or manually.
I've seen sites with keyword stuffing so extreme that a keyword was repeated about 500 times on the homepage, yet they ranked at the top of Google for that keyword, with nothing else internal or external to the website that could explain such a ranking. It's frustrating to know that Google penalizes this while some of your competitors do it with impunity and you can't, or at least shouldn't...
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen it work very well for different competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though; it still carries a heavy risk. If they are discovered, they can get a serious penalty slapped on them, or at the very least get pushed pretty far down the SERPs. It's really just risk vs. reward. If you are like me, working for a company with a ton of revenue at stake, you think of it like this:
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be to explain why I got our #1 lead-generating channel penalized and blasted into purgatory.
Usually, sites that use these exact-match anchors on local terms look like garbage. So even if they are driving traffic, I often wonder how much of it actually converts, since the majority of the site looks like a collection of crappy doorway pages. It is still very frustrating to watch them succeed in the SERPs, though; I have the same issue.
You could always try reporting them to Google directly. I do not know whether anchor-text spam falls under one of their official categories, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
I have no idea whether that works, though. As a side note, I would also run their site through a tool like Majestic SEO or Ahrefs and really dig into their backlink profile. I have seen a couple of instances where spammy sites pulled off some nice links, so their success could be attributable to those as well.
Hopefully this helps. I know your pain.
-Ben
Related Questions
-
What heading tags to use in sidebars and footers
Hello, I have some awareness of how to use H1, H2 and H3: H1 only once per page as the main page heading; H2s should be subheadings; H3s are sub-subheadings of the H2s, and so on. This structure gives hierarchy and opportunities to use additional keywords in an order of priority. I can clearly understand how this would work in an article, but what about other content on the page, such as global/frequently repeated elements like sidebars and footers? I see sites, and in particular I have examined SEO-focused sites, that use H3, H4 and H5 in these instances, seemingly giving themselves scope to use at least H2 tags in the page content and break out of the hierarchy when dealing with sidebars and footers. I suppose this could signal that these headings are sections of the page less relevant than the main article content, but that is just an assumption. I don't know what is correct.
On-Page Optimization | kowston
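For illustration, here is a minimal sketch of the hierarchy described in this question, with the repeated sidebar and footer elements demoted below the article's own headings; the levels chosen for the sidebar and footer reflect the convention observed in the question, not a confirmed rule:

```html
<body>
  <h1>Main page heading, used once per page</h1>
  <article>
    <h2>Subheading for a major section of the article</h2>
    <h3>Sub-subheading under that H2</h3>
  </article>
  <!-- Globally repeated elements demoted below the article's own headings -->
  <aside>
    <h4>Sidebar section title</h4>
  </aside>
  <footer>
    <h5>Footer section title</h5>
  </footer>
</body>
```
-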
Too much internal linking?
Hi everyone, Too much of anything is not good. In terms of internal linking, how many are too many? I read that the recommended limit is about 100 internal links per page; otherwise it dilutes the page's link equity. I have a concern about one of our websites: according to Search Console, the homepage has 923 internal links. All the pages have a corresponding /feed page appended to the page URL, which is really weird (is this caused by a plugin?). The site also has an e-commerce feature, but it is not used, as the site is essentially a brochure and customers are encouraged to visit the shop; I assume this feature also inflates the number. On the other hand, one of the competitors we are tracking has 1 internal link site-wide, while ours is at 45,000. How is it possible to have only 1 internal link? Is this a Moz bug? I know we badly need to reduce our internal links, but I'm not sure where to start. I don't know where these internal links come from; some aren't in the copy or the navigation menu. When I scan the homepage with 'Check My Links', it identifies only 170 links.
On-Page Optimization | nhhernandez
-
Should an internal link open in a new tab or in the same window?
Should an internal link open in a new tab or in the same window? Seems like this is an issue that has never had a definitive answer one way or the other. But I couldn't find any recent articles from reliable sources taking a stance and answering this question. Does anyone know if user engagement metrics (time on site, bounce rate, pages per visit) are impacted if a user clicks a link that opens in a new tab? Thanks!
On-Page Optimization | NicheSocial
-
Is a Mega Menu with over 300 links in it hurting my rankings?
I got hit pretty badly by Panda 4.0 (a third of my traffic lost), and I'm fairly certain it was because Google had indexed over 20 million pages generated by a site-filtering piece of software, and the site got done for duplicate content. I have since fixed that using URL parameters; the 20 million is down to 2.7 million now, and I have submitted a clean sitemap, so now I wait. I have just done a site relaunch and am trying to determine whether there are any other issues. I run an online store with a mega menu containing well over 300 links (it makes it really quick and easy for users to jump exactly where they want) plus about 30 links in the footer. I know there's a 'no more than 100 links on a page' guideline from Moz, but does anyone know if Google is smart enough to see the same header/footer navigation structure on every page of a site, recognize it as navigation, and not water down the rest of the links, or do I need to rethink and simplify my navigation? It's there for the user experience, and now I'm worried I'm being penalised for it. The site is www dot shopnaturally dot com dot au
On-Page Optimization | sparrowdog
-
Is it bad to include Google Maps in the footer?
We have 5 locations and we were thinking about including a map for each location in the footer. These would be set up as nofollow links. They could potentially enhance the user experience, but they also increase the size of the footer. Right now there are just basic links to pages (sitemap, terms, etc.), contact info, social links, and a contact form. If we did the maps, we would also include links to the individual location pages. Not sure if we are doing too much in the footer or need to just keep it basic. Thanks for the help!
On-Page Optimization | Restore
-
Too many links on page -- how to fix
We are getting reports that there are too many links on most of the pages of one of the sites we manage. Not just a few too many: 275, versus the target of fewer than 100. The entire site is built with a very heavy global navigation that contains a lot of links, so while users don't see all of that, Google does. Short of re-architecting the site, can you suggest ways to provide site navigation that don't violate this rule?
On-Page Optimization | novellseo
-
Keyword Stuffing in Alt Tags!
Hello, I have over 50 images on a main page, which I want to optimize for MAINKW (let's say). Now, if I use "MAINKW KW1", "MAINKW KW2", "MAINKW KW3" ... "MAINKW KW50" in the alt tags, might Google decide that I am stuffing MAINKW into that page? The images are representative of the main categories, and I have direct links to them from the main page with the anchors KW1, KW2 ... KW50. (A sketch of this markup follows below.)
On-Page Optimization | VertiStudio
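For reference, a minimal sketch of the setup described above; the URLs and filenames are invented placeholders, and MAINKW/KW1 stand in for the real terms as in the question:

```html
<!-- One of the 50 category images on the main page; the alt text prefixes
     the category keyword with MAINKW, the pattern the question asks about -->
<a href="/category-kw1">
  <img src="/images/category-kw1.jpg" alt="MAINKW KW1">
</a>
<!-- Direct text link to the same category, with the bare keyword as anchor -->
<a href="/category-kw1">KW1</a>
```
-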
Prevent link juice from flowing to low-value pages
Hello there! Most websites have links to low-value pages in their main navigation (header or footer), and thus on every other page. I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no value for SEO. What I would like is to prevent link juice from flowing into those pages while still keeping the links for visitors. What is the best way to achieve this (a sketch of the first three options follows below)?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" rule for those pages in the robots.txt file?
- Use JavaScript links that crawlers won't be able to follow?
On-Page Optimization | jonigunneweg
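For reference, here is a minimal sketch of the first three options; /privacy-notice is an invented placeholder path:

```html
<!-- Option 1: rel="nofollow" on the footer links themselves;
     visitors can still see and use the links -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: a robots meta tag in the <head> of each low-value page -->
<meta name="robots" content="noindex,nofollow">

<!-- Option 3: a robots.txt rule (a plain-text file at the site root):
       User-agent: *
       Disallow: /privacy-notice
     Note that this blocks crawling of the URL; it does not by itself
     remove an already-indexed page from Google's index. -->
```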