My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
-
On Friday, 4/29, we noticed that we had suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains.
On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues.
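For clarity, the redirects were purely page-to-page: each old URL pointed at the identical path on the new domain. A minimal Python sketch of that mapping (illustrative only, not our actual server config; the example URLs are made up):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.thegrillstoreandmore.com"
NEW_HOST = "www.bbqguys.com"

def redirect_target(old_url: str) -> str:
    """Return the bbqguys.com URL a page-to-page 301 should point at.

    Keeps the path and query string intact and only swaps the host,
    which is what "the same page on the new domain" means here.
    """
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, ""))

print(redirect_target("http://www.thegrillstoreandmore.com/grills/weber?page=2"))
# -> http://www.bbqguys.com/grills/weber?page=2
```

Every redirect was served with a permanent (301) status, not a temporary (302) one.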
When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not lift the penalty on bbqguys.com.
We've been looking for things for two days, and have not been able to find what we did wrong, at least not until tonight.
I just logged back in to webmaster tools to do some more digging, and I saw that I had a new message. "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/"
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could even see one or two pages, I could probably figure out what I am doing wrong.
I find this shocking, since we go out of our way to avoid anything spammy or sneaky. We try hard not to do anything that is even grey hat, so I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down?
We are slowly canonicalizing URLs and changing the way different parts of the sites build links so they are all consistent, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop duplicate content.
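For anyone unfamiliar, the canonicalization in question is the one-line hint in each page's head that tells search engines which URL variant is the preferred one. A generic example (the URL is illustrative, not one of our actual pages):

```html
<!-- On every variant of a page (tracking parameters, alternate paths,
     session IDs), point search engines at the single preferred URL. -->
<link rel="canonical" href="http://www.bbqguys.com/category/product.html" />
```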
The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e., "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with specific pages. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help we could get identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong, I just don't know what it is! Thanks for any help identifying the problem!
It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content.
As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem and we got nailed almost immediately when we instituted the 301 redirects.
-
The domains in question were all previously owned by me in my webmaster tools account long before this happened. I've since gone and put in an address change request for the site that has the 301s on it to point to the new site.
I'm feeling like I got stuck with a false positive here, but it is taking forever to get re-reviewed. Of course, it is grilling season now, so I'm losing tens of thousands of dollars in revenue for every day we are out of the index.
I realize the answer is probably no, but does anyone have any tips on how to speed up the review process? I could lose a quarter million dollars over the course of a week or two.
-
A doorway page is an old-school black hat SEO technique. What webmasters would do is buy domains with high PR, or buy expired domains that used to belong to competitors, and then 301 redirect them to their own website. This was, in essence, buying links, since the links pointing at the old domains now ended up at theirs.
Are your domains all on the same hosting account or the same server C-block? Are they all registered and verified in Google Webmaster Tools? If not, Google may see them as being owned by different people. In that case, it would look to them like you just bought a bunch of domains and redirected them all to your domain.
To you, you were simply finding all the duplicate content out there and consolidating it into one domain the way you think you should. It just didn't look that way to Google. I would recommend claiming and verifying every one of the domains you want to 301 in GWT. Once you have them verified, then redirect them all to your new domain. At that point, file a reconsideration request with Google, explain your situation, show how you have all the domains verified and that they belong to you, and you should end up okay.
My best guess based on what you're saying is that Google thought all of your domains were under separate ownership, and to see them all 301 all at once looks like you just bought a bunch of other domains and redirected them to yours.