My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
-
On Friday, 4/29, we noticed that we had suddenly lost all rankings for all of our keywords, including searches like "bbq guys". That indicated to us that we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains.
On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the corresponding page on bbqguys.com. Our intent was to eliminate duplicate content issues.
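(Side note for anyone doing a similar migration: a page-to-page 301 like this can be spot-checked with a small script, so you know every old URL answers with a single 301 pointing at the matching page on the new domain rather than at the homepage. A minimal sketch, assuming Python with the requests library; the sample paths below are placeholders, not our actual URL list.)

```python
import requests

OLD_DOMAIN = "http://www.thegrillstoreandmore.com"
NEW_DOMAIN = "http://www.bbqguys.com"

# Placeholder paths; in practice these would come from a sitemap or crawl of the old site.
paths = ["/", "/gas-grills.html", "/charcoal-grills.html"]

for path in paths:
    # Don't follow the redirect; we want to inspect the first response the old URL returns.
    resp = requests.get(OLD_DOMAIN + path, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and target.startswith(NEW_DOMAIN)
    print(f"{path}: {resp.status_code} -> {target or '(no redirect)'} [{'OK' if ok else 'CHECK'}]")
```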
When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not lift the penalty on bbqguys.com.
We've been looking for the cause for two days and couldn't find what we did wrong, at least not until tonight.
I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/"
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't have pages like that. The message links to Google's definition of doorway pages, but it does not give me a list of the pages on my site that it objects to. If I could see even one or two of those pages, I could probably figure out what I am doing wrong.
I find this shocking, since we go out of our way not to do anything spammy or sneaky. We try hard to avoid anything that is even grey hat, so I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down?
We are slowly canonicalizing URLs and changing the way different parts of the site build internal links so that they are all consistent, and I am aware that this work is needed. We were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop duplicate content.
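(To be concrete about what I mean by canonicalizing: the goal is to funnel every internal link through one normalization step so the same page is always linked in exactly the same form. A rough illustrative sketch in Python, with a hypothetical list of parameters to strip; this is the shape of the idea, not our actual code.)

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical list of parameters that create duplicate URLs for the same content.
STRIP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url: str) -> str:
    """Return one consistent form of a URL so internal links always match."""
    parts = urlsplit(url)
    host = parts.netloc.lower()                 # lowercase the host
    path = parts.path or "/"                    # empty path becomes "/"
    # Drop tracking/session parameters and sort the rest so ordering is stable.
    query = urlencode(sorted(
        (k, v) for k, v in parse_qsl(parts.query) if k not in STRIP_PARAMS
    ))
    return urlunsplit((parts.scheme.lower(), host, path, query, ""))  # drop fragments

print(canonicalize("http://www.BBQguys.com/gas-grills.html?utm_source=email&size=large"))
# -> http://www.bbqguys.com/gas-grills.html?size=large
```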
The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e. "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs warning has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with specific pages. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since that is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! Thanks for any help identifying the problem!
It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects led Google to decide we have them, we could reduce duplicate content more appropriately.
As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following the webmaster guidelines on how to reduce the problem, and we got nailed almost immediately after we instituted the 301 redirects.
-
The domains in question were all verified under my Webmaster Tools account long before this happened. I've since submitted a change-of-address request for the site with the 301s on it, pointing it to the new site.
I'm feeling like I got stuck with a false positive here, but it is taking forever to get re-reviewed. Of course, it is grilling season now, so I'm losing tens of thousands of dollars in revenue for every day we are out of the index.
I realize the answer is probably no, but does anyone have any tips on how to speed up the review process? I could lose a quarter million dollars over the course of a week or two.
-
A doorway page is an old-school black hat SEO technique. What webmasters would do is buy domains with high PR, or buy expired domains that used to belong to competitors, and then 301 redirect them back to their own website. This was, in essence, buying links, since the links pointing to the old domains now ended up passing to their domain.
Are your domains all on the same hosting account or the same server C-block? Are they all registered and verified with Google Webmaster Tools? If not, Google may see them as being owned by different people. In that case, it would look to Google like you just bought a bunch of domains and redirected them all to your domain.
From your point of view, you were simply finding all the duplicate content out there and consolidating it onto one domain the way you thought you should. It just didn't look that way to Google. I would recommend claiming and verifying every one of the domains you want to 301 in GWT. Once you have them verified, redirect them all to your new domain. At that point, file a reconsideration request with Google, explain your situation, show that you have all the domains verified and that they belong to you, and you should end up okay.
My best guess, based on what you're saying, is that Google thought all of your domains were under separate ownership, and seeing them all 301 at once made it look like you had just bought a bunch of other domains and redirected them to yours.