Massive number of site-wide internal footer links to doorway pages: how bad is this?
-
My company has stuffed several hundred links into the footer of every page. Well, technically not the footer, as they're placed just before the closing body tag, but it's basically the same thing.
They are formatted as follows:
[" href="http://example.com/springfield_oh_real_estate.htm">" target="_blank">http://example.com/springfield_pa_real_estate.htm">](</span><a class= "http://example.com/springfield_oh_real_estate.htm")springfield, pa real estate
These direct to individual pages that contain the same few images and variations of the following text, with only the town and state swapped:
_Springfield, PA Real Estate - Springfield County
[images]
This page features links to help you Find Listings and Homes for sale in the Springfield area MLS, Springfield Real Estate Agents, and Springfield home values. Our free real estate services feature all Springfield and Springfield suburban areas.
We also have information on Springfield home selling, Springfield home buying, financing and mortgages, insurance and other realty services for anyone looking to sell a home or buy a home in Springfield.
And if you are relocating to Springfield or want Springfield relocation information we can help with our Relocation Network._
The bolded text links to our internal site pages for buying, selling, relocation, etc.
Like I said, this is repeated several hundred times, on every single page on our site.
In our XML sitemap file, there are links to:
http://www.example.com/Real_Estate/City/Springfield/
http://www.example.com/Real_Estate/City/Springfield/Homes/
http://www.example.com/Real_Estate/City/Springfield/Townhomes/
These direct to separate pages with a Google map result for properties for sale in Springfield.
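(For reference, such entries just follow the standard sitemap protocol; a minimal sketch of the relevant part of the file, where the <loc> URLs are the ones above and the wrapper is the standard boilerplate:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/Real_Estate/City/Springfield/</loc></url>
  <url><loc>http://www.example.com/Real_Estate/City/Springfield/Homes/</loc></url>
  <url><loc>http://www.example.com/Real_Estate/City/Springfield/Townhomes/</loc></url>
</urlset>
```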
Each of these map pages is accompanied by a boilerplate version of this:
_Find Springfield Pennsylvania Real Estate for sale on www.example.com - your complete source for all Springfield Pennsylvania real estate. Using www.example.com, you can search the entire local Multiple Listing Service (MLS) for up to date Springfield Pennsylvania real estate for sale that may not be available elsewhere.
This includes every Springfield Pennsylvania property that's currently for sale and listed on our local MLS. Example Company is a fully licensed Springfield Pennsylvania real estate provider._
Google Webmaster Tools is reporting that some of these pages have over 30,000 internal links pointing to them from our site. However, GWT isn't reporting any manual actions that need to be addressed.
How blatantly abusive and spammy is this? At best, Google doesn't care a spit about it; at worst, it's actively harming our SERP rankings.
What's the best way to go about dealing with this? The site did have Analytics running, but the company lost the account credentials years ago; otherwise I'd check the numbers to see if we were ever hit by Panda/Penguin. I just got a new Analytics account implemented two weeks ago, and of course it's still using deprecated object values, so I don't even know how accurate it is. Thanks, everyone!
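(The deprecated syntax in question is presumably the old ga.js `_gaq` command queue rather than the current analytics.js `ga()` function; for comparison, a sketch of both snippets, with a placeholder property ID:)

```html
<!-- Deprecated ga.js tracker: commands go through the _gaq queue -->
<script>
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-1']); // placeholder property ID
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>

<!-- Current analytics.js tracker: commands go through ga() -->
<script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXXXX-1', 'auto'); // placeholder property ID
  ga('send', 'pageview');
</script>
```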
-
So every page has several hundred links to spammy duplicate pages? This is a bad thing.
First off, the link equity leaving every page on your site is split between all of these several hundred links, whether you nofollow them or not.
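To put rough numbers on that: under the classic PageRank model (a simplification; Google's current handling is more nuanced), the equity a page passes through each link is

$$\text{PR passed per link} = d \cdot \frac{PR(p)}{L}$$

where $d$ is the damping factor (traditionally 0.85), $PR(p)$ is the page's own PageRank, and $L$ is its total number of outbound links. With several hundred footer links on top of normal navigation (say $L \approx 350$), each individual link passes well under 0.3% of the page's equity. And since Google's 2009 change to how nofollow is handled, nofollowed links still count toward $L$; their share simply evaporates.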
Second, the fact that these pages are essentially duplicates of each other, and that there are potentially hundreds of them, means your site is likely being hurt by thin-content penalties.
Third, that example makes it look like old-school '90s keyword stuffing, which is just all sorts of wrong.
My assessment is that yes, this is hurting your site.
Related Questions
-
Backlinks in Footer - The good, the bad, the ugly.
I tried adding onto a question already listed, but that question stayed where it was and didn't go anywhere close to somewhere others would see it, since it was from 2012.

I have a competitor who is completely new; they just popped onto the SERPs in December 2015, and I've wondered how they jumped up so fast without much in the way of user content. Upon researching them, I saw they have 200 backlinks, but 160 of them are from their parent company, and of all places they come from the footer of the parent company's site. So they get every page of that domain as a backlink. Everything I've read has told me not to do this: it's going to harm the site badly, or at minimum the links will be discounted. I'm in no way interested in doing what they did, even if it resulted in page 1 (which it has for them), since I believe it's only a matter of time, and once that time comes, it won't be a 3-month recovery; it might be worse.

What do you all think? Why hasn't this site been penalized yet? Will they be penalized, and if not, why not? **What is the good, the bad, and the ugly of backlinks in the footer?**
White Hat / Black Hat SEO | Deacyde
-
A doorway-page vendor has made my SEO life a nightmare! Advice anyone!?
Hey everyone,

So I am the SEO at a mid-sized nationwide retailer and have been working there for almost a year and a half. This retailer is an SEO nightmare. Imagine the worst possible SEO nightmare, and that is my unfortunate yet challenging everyday reality. In light of the new algorithm update that seems to be on the horizon from Google to further crack down on the usage of doorway pages, I am coming to the Moz community for some desperately needed help.

Before I was employed here, the eCommerce director and SEM manager connected with a vendor that told them they could basically do a PPC version of SEO for long-tail keywords. This vendor sold them on the idea that they would never compete with our own organic content and could bring in incremental traffic and revenue thanks to all of this wonderful technology they have, which is essentially just a scraper. So for the past three years, this vendor has been creating thousands of doorway pages that are hosted on their own server but are masked as our own pages. They have a massive HTML index/directory attached to our website and even upload their own XML sitemaps to our Google Webmaster Tools. So even though they "own" the pages, the pages masquerade as our own organic content.

What we have today is thousands upon thousands of product and category pages that are built dynamically and regurgitated through their scraper/platform, whatever it is. All of these pages are incredibly thin in content, and it's beyond me how Panda has not exterminated them. All of these pages are built entirely for search engines, to the point that you would feel like the year was 1998. All of them are so over-optimized with spam that it really is equivalent to just stuffing in a ton of meta keywords (like I said: 1998). Almost all of these scraped doorway pages cause an incredible number of duplicate-content issues, even though the account rep swears up and down to the SEM manager (who oversees all paid programs) that they do not.

Many of the pages use other shady tactics, such as meta-refresh-style bait and switching. For example, the page title in the SERP shows as "Personalized Watch Boxes," but when you click the SERP and land on the doorway page, the title changes to "Personalized Wrist Watches," and not one actual watch box is listed. They are all simply the most god-awful pages in terms of UX that you will ever come across, BUT because of the sheer volume of these pages spammed deep within the site, they generate revenue just by playing the odds. Executives LOVE revenue.

Also, one of this vendor's tactics when our budget spend for this program is reduced is to randomly pull a certain number of their pages and return numerous 404 errors until spend bumps back up. This causes a massive nightmare for me.

I could go on and on, but I think you get where I am going. I have spent a year and a half campaigning to get rid of this black-hat vendor, and I am finally right on the brink of making it happen. The only problem is that it will be almost impossible not to drop in revenue for quite some time once these pages are pulled. Even though I have helped create several organic pages and product categories that will pick up the slack, it will still be a while before the dust settles and stabilizes.

I know this was a very long and open-ended essay of the problem I have presented to the Moz community, and I apologize; I would love to clarify anything I can. My actual questions are:

1. Has anyone gone through a similar situation, or had experience dealing with a vendor that employs this type of black-hat tactic?
2. Is there any advice you can offer, or experiences you can share, that can help me be as armed as possible when I eventually convince the higher-ups they need to pull the plug?
3. How can I limit the bleeding, and can I even remotely rely on Google LSI to serve my organic pages for the related terms of the pages that are now gone?

Thank you guys so much in advance,
-Ben
White Hat / Black Hat SEO | VBlue
-
Forcing Entire site to HTTPS
We have a WordPress site and hope to force everything to HTTPS. We changed the site name (in WordPress settings) to https://mydomain.com.

The htaccess code from http://moz.com/blog/htaccess-file-snippets-for-seos to ensure we are using the HTTPS version of the site:

```apache
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

But some blogs (e.g. http://stackoverflow.com/questions/19168489/https-force-redirect-not-working-in-wordpress) say:

```apache
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Which one is right? 🙂 And are we missing anything?
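(For what it's worth, the two conditions differ only at the margin: `%{HTTPS}` evaluates to `on` for TLS requests and `off` otherwise, so both rules fire on plain HTTP. `!on` also matches if the variable is ever empty or unset, as on some older or unusual setups, which makes it the slightly more defensive test. A sketch of a complete block, assuming a standard Apache host with mod_rewrite enabled:)

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # %{HTTPS} is "on" for TLS requests and "off" otherwise;
  # "!on" additionally covers the case where the variable is unset,
  # so it is the slightly safer condition of the two.
  RewriteCond %{HTTPS} !on
  RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
```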
White Hat / Black Hat SEO | joony
-
Page not being indexed or crawled and no idea why!
Hi everyone,

There are a few pages on our website that aren't being indexed right now on Google, and I'm not quite sure why. A little background: we are an IT training and management training company, and we have locations/classrooms around the US. To better our search rankings and overall visibility, we made some changes to the on-page content, URL structure, etc.

Let's take our Washington DC location for example. The old address was:
http://www2.learningtree.com/htfu/location.aspx?id=uswd44
And the new one is:
http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training

All of the SEO changes aren't live yet, so just bear with me. My question is really about why the first URL is still being indexed and crawled and showing fine in the search results while the second one (which we want to show) is not. The changes have been live for around a month now, plenty of time to at least be indexed. In fact, we don't want the first URL to show anymore; we'd like the second URL type to show across the board.

Also, when I type into Google site:http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training, I get a message that Google can't read the page because of the robots.txt file. But we have no robots.txt file. I've been told by our web guys that the two pages are exactly the same. I was also told that we've put in an order to have all those old links 301-redirected to the new ones.

But still, I'm perplexed as to why these pages are not being indexed or crawled, even after manually submitting them in Webmaster Tools. So why is Google still recognizing the old URLs, and why are they still showing in the index/search results? And why is Google saying "A description for this result is not available because of this site's robots.txt"?

Thanks in advance! Pedram
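(On the 301s the web team mentioned: the old URLs carry the location ID in a query string, so a plain path-to-path redirect won't match them. Purely as an illustration, if the front end were Apache, the rule could look like the sketch below; the actual stack here, likely IIS given the .aspx URLs, would need the equivalent web.config rewrite rule instead:)

```apache
RewriteEngine On
# Match the old query-string URL and 301 it to the new clean path;
# the trailing "?" drops the old query string from the target.
RewriteCond %{QUERY_STRING} (^|&)id=uswd44($|&) [NC]
RewriteRule ^htfu/location\.aspx$ /htfu/uswd44/reston/it-and-management-training? [R=301,L]
```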
White Hat / Black Hat SEO | CSawatzky
-
Site review
Can anyone give me a quick site review? I recently started working on SEO for the company and just want to ask if I am missing anything that may hinder SEO and SERPs, etc.: www.teamac.co.uk
White Hat / Black Hat SEO | TeamacPaints
-
Are these links bad for my results?
In the past we have requested links on multiple directories. Since then we have seen a major drop in results (60% in traffic) around the Penguin update of April 24-26. Our results have been getting slowly lower and lower in Google. Is it possible to tell whether these links are in fact doing my site harm? Before the 26th of April it was easy to see that our results were benefiting from the submissions to those directories. We did not have any messages in Webmaster Tools, and a reconsideration request returned "no manual spam action taken". What would be the best strategy to turn this around and go up again? A selection of the requested links:

www.thesquat.org
www.directmylink.com
www.thegreatdirectory.org
www.submission4u.com
www.urlmoz.com
www.basoti.org
www.iwebdirectory.co.uk
www.freeinternetwebdirectory.com
addsite-submitfree.com
opendirectorys.com
www.xennobb.com
mdwerks.com
www.directoryfire.com
www.rssbuffet.com

To give a good view of the problem: the anchors of the requested links are mostly not in the native language of the directories. Thanks!
White Hat / Black Hat SEO | 2Hillz
-
Link Building after Google updates!
Hello All, I just wanted to ask the question to start a discussion on link building after the Google Updates. I haven't been very proactive lately with regards to link building due to the updates and not wanting to get penalised! Are there any link building trends/techniques people are using since the changes? Thanks, seo_123
White Hat / Black Hat SEO | TWPLC_seo
-
Methods for getting links to my site indexed?
What are the best practices for getting links to my site indexed in search engines? We have been creating content and acquiring backlinks for the last few months, but they are not being found in the backlink checkers or in Open Site Explorer. What are the tricks of the trade for improving the time to indexing of these links? I have read about some RSS methods using WordPress sites, but that seems a little shady, and I am sure Google is looking for that now. I look forward to your advice.
White Hat / Black Hat SEO | devonkrusich