City and state link stuffing in footer
-
A competitor has links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states, all with corresponding link text and titles leading to pages with thin, duplicate content. They have consistently ranked high in the SERPs for years. What gives? Isn't this something that should get you penalized?
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they're in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies that succeed using these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau mention, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor achieve high rankings, it's aggravating to think it isn't holding them back, either.
Your post doesn't mention whether this business actually has physical local offices or is fully virtual, but what I've seen in cases like these is that big brands tend to get away with a great deal of stuff I would never recommend to a smaller brand. It raises the question: how can we explain this phenomenon?
In the past, I've seen folks assert that Google is soft on big brands. There could be some truth in this, but we've all seen Google take a massive whack at big-brand practices with various updates, which makes it an unsatisfying explanation.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the idea that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines and standards, and that this may be excusable given the size of the Internet: they can't catch everything. I can see it in this light, but at the same time, I don't consider Google to have taken a proactive stance on accepting public reports of bad practices. Rather, their approach is to release periodic updates that are supposed to algorithmically detect foul play and penalize or filter it. Google is very tied to the ideas of big data and machine intelligence. So far, it's been an interesting journey, but it's exactly what has led to cases like the one you're seeing: something egregiously unhelpful to human users sitting, apparently unpunished, on a website that outranks you, even when you're trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will, at some point, be the subject of an update that lessens the rewards they're reaping despite their bad practices. Until then, it's heads down: work hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained, at least in part, by a strong backlink profile, an older domain or website, a large amount of relatively original and varied content, or a niche that is more receptive to this type of content (when it's relatively common in that niche)... and other times it simply makes no sense why tactics like this keep working in Google for years without triggering an automatic or manual penalty.
I've seen sites with keyword stuffing so extreme that a single keyword is repeated about 500 times on the homepage, yet they rank at the top of Google for that keyword, with nothing else internal or external to the site that could explain such a ranking. It's so frustrating knowing that Google penalizes this, yet some of your competitors do it with impunity while you can't, or at least shouldn't...
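For what it's worth, stuffing that blatant is trivial to detect programmatically. Here's a minimal sketch (the page snippet and keyword are invented for illustration) that strips tags with Python's standard-library HTML parser and counts whole-word occurrences of a term in the visible text:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text chunks of a page, ignoring tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def keyword_count(html, keyword):
    """Case-insensitive, whole-word count of `keyword` in the visible text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    return len(re.findall(r"\b" + re.escape(keyword) + r"\b", text, re.IGNORECASE))

# Hypothetical stuffed homepage: the same phrase repeated 500 times.
page = "<body>" + "<p>cheap widgets</p>" * 500 + "<p>Welcome!</p></body>"
print(keyword_count(page, "cheap widgets"))  # 500
```

A count that high relative to the rest of the page text is the kind of signal you'd expect any algorithm to catch easily, which is exactly why it's so baffling when it doesn't.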
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen this work very well for different competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though. It still carries a heavy risk: if they're discovered, they can get a serious penalty slapped on them, or at the very least get pushed pretty far down the SERPs. It's really just risk vs. reward. I work for a company with a ton of revenue at stake, so I think of it like this:
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be for me to explain why I got our #1 lead generating channel penalized and blasted into purgatory.
Usually, these sites that use exact-match anchors on local terms look like garbage. So even if they're driving traffic, I often wonder how much of it actually converts, since the majority of the site looks like a collection of crappy doorway pages. It's still very frustrating to watch them succeed in the SERPs, though. I have the same issue.
You could always "try" to report them to Google directly. I do not know if this really works or if anchor-text spam would fall under one of their official categories to file it under, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
I have no idea if this works or not, though. As a side note, I would run their site through a tool like Majestic SEO or Ahrefs and really dig into their backlink profile. I've seen a couple of instances where spammy sites pulled off some nice links, so their success could be partly attributable to those.
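If you want a quick look at the footer before firing up a link tool, a small script can pull out every footer link and its anchor text so you can eyeball the exact-match geo anchors yourself. This is a rough sketch using only Python's standard library; the footer HTML and city names below are made up for the example:

```python
from html.parser import HTMLParser

class FooterLinkAudit(HTMLParser):
    """Collects (href, anchor text) pairs for links inside a <footer> element."""
    def __init__(self):
        super().__init__()
        self.in_footer = False
        self.in_link = False
        self.current_href = None
        self.links = []  # list of (href, anchor_text)

    def handle_starttag(self, tag, attrs):
        if tag == "footer":
            self.in_footer = True
        elif tag == "a" and self.in_footer:
            self.in_link = True
            self.current_href = dict(attrs).get("href", "")

    def handle_endtag(self, tag):
        if tag == "footer":
            self.in_footer = False
        elif tag == "a":
            self.in_link = False

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.links.append((self.current_href, data.strip()))

# Hypothetical stuffed footer:
html = """<footer>
<a href="/seo-dallas">SEO Dallas</a>
<a href="/seo-austin">SEO Austin</a>
<a href="/seo-houston">SEO Houston</a>
</footer>"""
audit = FooterLinkAudit()
audit.feed(html)
print(len(audit.links))  # 3
```

If the count runs into the hundreds and the anchors are all city/state variations of the same phrase, that's the pattern described in the original question.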
Hopefully, this helps, I know your pain.
-Ben