My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
-
On Friday, 4/29, we noticed that we had suddenly lost all rankings for all of our keywords, including searches like "bbq guys". That told us we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains.
On Thursday, we had turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the corresponding page on bbqguys.com. Our intent was to eliminate duplicate content issues.
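For what it's worth, these were clean one-hop, page-to-page 301s. A spot check along the lines of the sketch below (the paths are hypothetical placeholders, not our actual URL map) is one way to confirm each old URL answers with a single 301 to its counterpart on the new domain:

```python
# Minimal sketch: verify each old URL returns one 301 hop to the new domain.
# http.client never follows redirects, so we see the raw status and Location.
# The sample paths are hypothetical placeholders, not the real URL map.
import http.client

OLD_HOST = "www.thegrillstoreandmore.com"
NEW_HOST = "www.bbqguys.com"
SAMPLE_PATHS = ["/", "/gas-grills.html", "/charcoal-grills.html"]  # placeholders

for path in SAMPLE_PATHS:
    conn = http.client.HTTPConnection(OLD_HOST)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location") or "(no Location header)"
    verdict = "OK" if resp.status == 301 and NEW_HOST in location else "CHECK THIS"
    print(f"{path}: {resp.status} -> {location} [{verdict}]")
    conn.close()
```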
When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not lift the penalty on bbqguys.com.
We spent two days looking and couldn't find what we did wrong, at least not until tonight.
I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/".
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't have pages like that. The message links to Google's definition of doorway pages, but it does not list the pages on my site that Google objects to. If I could see even one or two offending pages, I could probably figure out what I am doing wrong.
I find this most shocking since we go out of our way not to do anything spammy or sneaky, or even grey hat. I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down?
We are slowly canonicalizing URLs and changing the way different parts of the sites build internal links so that they are all consistent, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop duplicate content.
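As part of that cleanup, something like the rough sketch below (standard library only; the URLs are illustrative examples, not a real crawl list) can be used to confirm each page declares the canonical URL we expect:

```python
# Rough sketch: report the rel=canonical tag each page declares.
# Standard library only; the URLs below are illustrative examples.
import urllib.request
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def canonical_of(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical or "(no canonical tag)"

# Second URL is a hypothetical example page, not a confirmed live path.
for url in ["http://www.bbqguys.com/", "http://www.bbqguys.com/gas-grills.html"]:
    print(url, "->", canonical_of(url))
```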
The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e., "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, a doorway pages flag is a content problem with specific pages. The architecture suggestions are helpful and reassure us that we should be working on them, but they don't solve my immediate problem.
I would be really grateful for any help identifying the pages that Google thinks are "doorway pages", since that is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! Thanks for any help!
It feels like we got penalized for trying to do exactly what we think Google wants. If we could figure out what Google counts as a "doorway page", and how our 301 redirects led Google to flag us for having them, we could reduce duplicate content more appropriately.
As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem and we got nailed almost immediately when we instituted the 301 redirects.
-
The domains in question were all verified under my Webmaster Tools account long before this happened. I've since gone and put in an address change request for the site that has the 301s on it, pointing to the new site.
I'm feeling like I got stuck with a false positive here, but it is taking forever to get re-reviewed. Of course, it is grilling season now, so I'm losing tens of thousands of dollars in revenue per day that we are out of the index.
I realize the answer is probably no, but does anyone have any tips on how to speed up the review process? I could lose a quarter million dollars over the course of a week or two.
-
A doorway page is an old-school black hat SEO technique. What webmasters would do is buy domains with high PR, or expired domains that used to belong to competitors, and then 301 redirect them to their own website. This was, in essence, buying links, since the links pointing at the old domains now ended up at theirs.
Are your domains all on the same hosting account or the same server C-block? Are they all registered and verified in Google Webmaster Tools? If not, Google may see them as being owned by different people. In that case, it would look to them like you just bought a bunch of domains and redirected them all to yours.
To you, you were simply finding all the duplicate content out there and consolidating it onto one domain, the way you think you should. It just didn't look that way to Google. I would recommend claiming and verifying every one of the domains you want to 301 in GWT. Once you have them verified, redirect them all to your new domain. At that point, file a reconsideration request with Google, explain your situation, show that you have all the domains verified and that they belong to you, and you should end up okay.
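Before filing the reconsideration request, it may also be worth confirming that each domain still serves its GWT verification file directly with a 200, rather than being swept up in the site-wide redirect. Here is a quick sketch (the verification filenames below are made-up placeholders, not real tokens):

```python
# Sketch: check each domain serves its GWT verification file with a direct 200,
# not a redirect (a blanket site-wide 301 would otherwise redirect it too).
# The verification filenames are made-up placeholders for illustration.
import http.client

VERIFICATION_FILES = {
    "www.thegrillstoreandmore.com": "/google1234567890abcdef.html",  # placeholder
    "www.bbqguys.com": "/googlefedcba9876543210.html",               # placeholder
}

for host, path in VERIFICATION_FILES.items():
    conn = http.client.HTTPConnection(host)
    conn.request("GET", path)
    resp = conn.getresponse()
    note = "OK" if resp.status == 200 else f"unexpected status {resp.status}"
    print(f"{host}{path}: {note}")
    conn.close()
```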
My best guess, based on what you're saying, is that Google thought all of your domains were under separate ownership, and seeing them all 301 at once looked like you had just bought a bunch of other domains and redirected them to yours.