A doorway-page vendor has made my SEO life a nightmare! Advice anyone!?
-
Hey Everyone,
So I am the SEO at a mid-sized nationwide retailer and have been working there for almost a year and a half. This retailer is an SEO nightmare. Imagine the worst possible SEO nightmare; that is my unfortunate yet challenging everyday reality.
In light of the new algorithm update that seems to be on the horizon from Google to further crack down on the usage of doorway pages, I am coming to the Moz community for some desperately needed help.
Before I was employed here, the eCommerce director and SEM Manager connected with a vendor that told them they could basically do a PPC-style version of SEO for long-tail keywords. This vendor sold them on the idea that they would never compete with our own organic content and could bring in incremental traffic and revenue thanks to all of this wonderful technology they have, which is essentially just a scraper.
So for the past three years, this vendor has been creating thousands of doorway pages that are hosted on their own server but are masked as our own pages. They have a massive HTML index/directory attached to our website and even upload their own XML sitemaps to our Google Webmaster Tools account. So even though they "own" the pages, the pages masquerade as our own organic content.
So what we have today is thousands upon thousands of product and category pages that are essentially built dynamically and regurgitated through their scraper / platform, whatever.
ALL of these pages are incredibly thin in content and it’s beyond me how Panda has not exterminated them.
ALL of these pages are built entirely for search engines, to the point that you would feel like the year was 1998.
All of these pages are incredibly over-optimized with spam that really is equivalent to just stuffing in a ton of meta keywords. (Like I said: 1998.)
Almost ALL of these scraped doorway pages cause an incredible amount of duplicate content issues even though the “account rep” swears up and down to the SEM Manager (who oversees all paid programs) that they do not.
Many of the pages use other shady tactics, such as meta-refresh-style bait and switch.
For example:
The page title in the SERP shows as: Personalized Watch Boxes
When you click the SERP and land on the doorway page the title changes to:
Personalized Wrist Watches. Not one actual watch box is listed.
They are ALL simply the most god-awful pages in terms of UX that you will ever come across, BUT because of the sheer volume of these pages spammed deep within the site, they create revenue just by playing the odds.
Executives LOVE revenue.
Also, one of this vendor’s tactics when our budget spend is reduced for this program is to randomly pull a certain amount of their pages and return numerous 404 server errors until spend bumps back up. This causes a massive nightmare for me.
I can go on and on but I think you get where I am going.
I have spent a year and a half campaigning to get rid of this black-hat vendor, and I am finally right on the brink of making it happen. The only problem is, it will be almost impossible to avoid a drop in revenue for quite some time when these pages are pulled. Even though I have helped create several organic pages and product categories that will pick up the slack, it will still be a while before the dust settles and things stabilize.
I am going to stop here because I could write a novel about the millions of issues I have with this vendor and what they have done. I know this was a very long and open-ended essay to present to the Moz community, and I apologize; I would love to clarify anything I can.
My actual questions would be:
Has anyone gone through a similar situation as this or have experience dealing with a vendor that employs this type of black-hat tactic?
Is there any advice at all that you can offer, or experiences you can share, that can help me be as armed as possible when I eventually convince the higher-ups they need to pull the plug?
How can I limit the bleeding, and can I even remotely rely on Google LSI to serve my organic pages for terms related to the pages that are now gone?
Thank you guys so much in advance,
-Ben
-
glad to help
-
You are a genius.
-
Glad I could be of some help.
If I were you, I'd definitely grab copies of the pages while they're still live. You could do this from home using some free tools like
http://phpcrawl.cuab.de/about.html
Add a bit of cURL or wget and you've got the pages plus the links and meta. Then if they do disappear suddenly and the business is stuck, you can hand this to your web people at Oracle and they'll probably try to hire you. Having said that, I'd imagine they've got a decent contingency plan because they're Oracle, but you never know. Could save the day.
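In the same spirit as the crawler-plus-wget idea above, here is a minimal stdlib-only Python sketch of grabbing local copies of the vendor's pages before they vanish. The URL list and output directory are placeholders, not the site's real paths:

```python
# Sketch: fetch a list of doorway-page URLs and save the raw HTML locally,
# so the titles, links, and meta survive if the vendor pulls the pages.
import os
import urllib.request
from urllib.parse import urlparse

def url_to_filename(url):
    # Turn a URL path into a flat, safe local filename
    path = urlparse(url).path.strip("/").replace("/", "_")
    return (path or "index") + ".html"

def archive_pages(urls, out_dir="doorway_archive"):
    os.makedirs(out_dir, exist_ok=True)
    saved = []
    for url in urls:
        dest = os.path.join(out_dir, url_to_filename(url))
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read()
        except OSError:
            continue  # skip pages that already 404 or time out
        with open(dest, "wb") as f:
            f.write(html)
        saved.append(dest)
    return saved
```

You could feed `archive_pages` the URL list exported from a Screaming Frog crawl of the vendor's index subfolders.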
-
Thanks Jamie!
Yea, we actually partner with Oracle for our web design, engineering, implementation, and so on. So when it comes to server-side issues, we have to go through them, and there is always red tape involved.
Really, I cannot understand how this vendor is even in business, and it is beyond me how they get away with it. The WordPress 404 plug-in is a great idea, though, and that will definitely help me in the future with freelancing while I am here full-time.
-
We do self-canonicalize, and that is a very good question. What they will do is just keep spitting out dynamically generated URLs. They have absolutely no restrictions on page quality or content; they literally have no rules. This gives them immense flexibility.
And for the contract portion: once the contract ends, all of these pages will in fact disappear, which is why they house them on their own servers. So that is what we want in the end.
It is dealing with the massive number of 404s that will be an issue for a while.
-
Thanks again!
Yes, that is the conundrum I am in when it comes to "who actually owns the pages," and honestly, this vendor covered their bases. They house all of the pages on their own servers and basically scrape our site, then shoot the pages out through our CDN via a proxy or something like that. So they made sure we are at their mercy; they can pull them anytime they want.
Technically, if we were to redirect all of their pages and acquired links, it would actually not be too hard, because each page is so unbelievably identical to our own organic pages. The problem is, we would have to access their server, I believe, and that will not happen.
It would also be one hell of a mess with 301s if we were to do that, and someone on our site team I am planning with fears the length of the 301 chains this would create in our .htaccess file.
But we are thinking in the same ballpark as you mentioned: trying to find ways to somehow limit the 404 tsunami this would cause and see if we can "take back" some of the link juice they took from us.
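One way to keep the feared 301 chains out of the .htaccess file is to collapse the redirect map before writing any rules, so every old doorway URL points straight at its final destination. A small sketch of that idea, with hypothetical URLs:

```python
# Collapse a redirect map so no source URL chains through an intermediate
# redirect; every source maps directly to its final destination.
def flatten_redirects(mapping):
    """mapping: {old_url: new_url}. Returns the map with chains collapsed."""
    flat = {}
    for src in mapping:
        seen = {src}
        dest = mapping[src]
        # Follow the chain until we hit a URL that is not itself redirected
        # (the `seen` set guards against accidental redirect loops)
        while dest in mapping and dest not in seen:
            seen.add(dest)
            dest = mapping[dest]
        flat[src] = dest
    return flat

redirects = {
    "/vendor/watch-boxes": "/vendor/watch-boxes-2",
    "/vendor/watch-boxes-2": "/gifts/personalized-watch-boxes",
}
# Both sources now point straight at the final organic page,
# so each request is a single 301 instead of a chain.
print(flatten_redirects(redirects))
```

From the flattened map you would emit one `Redirect 301` line per pair, rather than letting one rule feed into another.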
-
Yes, redirections are 100% necessary. I agree wholeheartedly.
-
Surely you can block them once the contract has been ended? I don't know how the law works where you are, but in the UK if you sever a contract you are no longer bound by it. But then again, I'm not a lawyer!!! LOL I'd be earning twice as much if I was!!! I'd look into this or get your legal team (assuming you have one) to look into it for after the contract has ended.
If they're scraping, could you put a canonical tag on your pages to self canonicalise? Only just thought of this!!! Might help, if you've not already done it.
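If you want to audit whether the canonical tags are actually in place, a quick stdlib-only check is to pull `rel="canonical"` out of each page and compare it to the page's own URL. A sketch, with a made-up example page:

```python
# Check whether a page self-canonicalizes: extract the rel="canonical"
# link from the HTML and compare it to the page's own URL.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "link" and d.get("rel") == "canonical":
            self.canonical = d.get("href")

def self_canonicalizes(html, page_url):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical == page_url

html = '<html><head><link rel="canonical" href="http://example.com/a"></head></html>'
print(self_canonicalizes(html, "http://example.com/a"))  # True
```

Running this over a crawl export would quickly show which templates are missing the tag.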
-
Hey thanks so much for the response!
And there are no stupid questions!
Before I was hired here, the company was incredibly aggressive with PPC and CSEs and spent exorbitant amounts on paid traffic.
The company literally drove 2x more traffic through paid than through organic. That has changed now even though we still spend pretty aggressively. We have an excellent SEM Digital Marketing Manager that handles all paid campaigns and affiliate programs and she is run ragged on a daily basis.
I really do think it would be worth taking a look at how we can compensate with PPC on the black-hat vendor's best-performing URLs. Thank you so much; it is an excellent idea.
To your robot blocking question:
I would love nothing more than to add robots.txt directives that disallow Googlebot from crawling the three subfolders that contain all of their doorway pages. Unfortunately, the company entered into a legally binding contract, and this would be like an act of war against them. I actually dream about doing this to them every night, so that is an awesome point you bring up!
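For whenever the contract does end, here is roughly what a disallow for those subfolders would look like, checked with Python's stdlib robots.txt parser. The folder names are hypothetical stand-ins for the vendor's three index directories:

```python
# Sketch: block crawling of the vendor's index subfolders via robots.txt,
# and verify the rules behave as expected before deploying them.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /vendor-index-a/
Disallow: /vendor-index-b/
Disallow: /vendor-index-c/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "/vendor-index-a/doorway-page.html"))  # False
print(rp.can_fetch("Googlebot", "/products/real-page.html"))           # True
```

Note that robots.txt only blocks crawling; pages already indexed would still need removal via Webmaster Tools or noindex.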
-
Thanks so much for the response.
Your advice on having a battle plan is perfect, and it is something I have had to try, try, and try again: finding ever more creative ways to present SEO needs, site fixes, and strategies.
I even went so far as to show them what their page titles look like in search when they are 90 characters long, comparing them to a shady gas station on an isolated highway, when we could be optimizing the titles, increasing CTR, and adding some schema to product-page SERPs to make them look like Sheetz!
Full PowerPoint pictures of gas stations!
The enigma of pushing SEO when nothing is "guaranteed" but the numbers they are seeing from this black-hat vendor are.
Yesterday, digging deeper and deeper with Screaming Frog, I dug into one of this vendor's subfolders, which is a giant index. (They have three of these subfolders they upload to our site.)
I found that they are literally making exact copies of our product pages. They then insert meta-spam links on those copies that ensure their versions will almost always outrank the original content we have three writers working on.
Unbelievable, I know. So with your awesome advice, and the internal reminder of how much more I need to think outside the box when presenting, I am going to make an entire roster of these plagiarized pages and show them that if all of the copied product pages were removed, our own organic product pages would rank as they are meant to.
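Building that roster of copied pages can be partly automated: fingerprint each page's visible text and group pages whose fingerprints collide. A hedged sketch, with made-up URLs; in practice you would feed it text extracted from a Screaming Frog crawl:

```python
# Fingerprint each page's visible text and report groups of URLs whose
# normalized text is identical, i.e. exact copies.
import hashlib
from collections import defaultdict

def fingerprint(text):
    # Normalize whitespace and case so trivial differences don't hide copies
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: {url: visible_text}. Returns groups of URLs with identical text."""
    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/products/watch-box": "Personalized Watch Boxes  for him",
    "/vendor-index-a/watch-box-copy": "personalized watch boxes for HIM",
    "/products/cufflinks": "Engraved cufflinks",
}
# The original product page and the vendor's copy collide; the unique page does not.
print(find_duplicates(pages))
```

Exact-hash matching only catches verbatim copies; near-duplicates would need a fuzzier similarity measure.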
I cannot believe vendors can still get away with this. No one monitored them or had any idea what they were doing until I was hired. It is just beyond belief.
Thank you so much for the advice and inspiration.
-
Nicely put, Amelia!
PPC would definitely be a great alternative to make up any losses from organic search. And, PPC Hero is indeed a great resource as is the AdWords help center.
From a technical standpoint, one would still want all of those crappy vendor pages redirected somewhere, which would be a pain to do manually but a necessary pain. If not, they would generate a huge number of 404 errors, and that's not a good signal for Google. The pages are already indexed, since they are getting traffic, and you'd want to send that traffic, and any links associated with those pages, somewhere: ideally, in your situation, a much better, relevant page from both a user and search engine perspective.
-
What a pig awful situation to be in. I feel for you.
The previous poster has some great suggestions which I would follow.
May I also suggest that you start a PPC campaign of your own to 'pick up the slack' as you put it? Assuming the budget previously allocated to this vendor would cover it? If the vendor was using PPC as the revenue driver to these horrible UX pages, imagine how much better the conversion would be from one of your 'good' pages?
If you've never used AdWords before, then I would look at the AdWords education center for a bit (sorry, I can't remember what it's actually called). A good site I used when first learning AdWords is PPC Hero; they had some good tips a few years ago, and I have no reason to believe they've gone downhill! I think (and I hope I don't inadvertently offend anyone here, but it's my experience) that if you can do SEO, then running PPC (though time-consuming) should be easy enough for you to get your head around.
I don't know if this is a stupid suggestion or not as I'm not very technical (I rely on brilliant developers in my team) but could the vendor's dodgy pages be disallowed by your robots file? Could you also remove them from the index via webmaster tools (especially if the pages are just PPC landing pages and not built for organic search, which I understand is the case from your post)? Like I say, this may be a stupid suggestion... Please go easy on me if it is!!!
Good luck - and remember, 'what doesn't kill us, makes us stronger'. I bet you're a much better SEO now than you were a year and a half ago!
-
I feel like your best plan of attack will be two sided:
1.) Education - Which is a definite struggle, but helping your higher-ups really understand WHY these practices are an issue and how it could and eventually will impact their bottom line might resonate more than just saying there are issues present (which I am sure you have been doing anyway). Perhaps reiterating the amount of revenue that is a result of natural search and how much would be lost if the site were penalized would paint a more clear picture. Having data to support your arguments is always helpful. Maybe you can even do some research and present a few summarized case studies on other sites that have been penalized and how it impacted their natural search metrics.
2.) Plan - Have a plan of attack ready. Ok, so you get rid of these pages... Now what? Preparing a very clear, step-by-step plan on what changes need to be made, what these changes will accomplish and what issues they will address, how you will make them and how long it will take, and what the expected outcome will be will help them better understand the process and how it will help save and possibly even improve revenue.
Hope this is helpful - good luck!