Farmer Update Case Study. Please question my logic here. (Very long!)
-
Hi SEOmoz community!
I would like to try to give a small (well...) case study of a Farmer victim and some logical conclusions of mine that you are more than welcome to shred to pieces.
So, I run MANY sites ranging from low to super quality, and I actually have a few that have been hit by Farmer, but this particular site had me scratching my head as to why it was torched.
Quick background: Site is in a very competitive niche, been around since 2004, initially as a forum site but from 2005 also a content-driven site. It's an affiliate site that has been ranking top 5 for many high-value commercial KW's and has a big long tail of informational KW's. Link profile is a mix of natural, good links and purchased links from various quality sources.
Content is high-quality written articles, how-tos, blog posts, etc. by in-house pro writers, plus UGC from a semi-active forum (20-30 posts a day).
Farmer: After Farmer, this site's vertical looks pretty much the same as before, with the biggest exception being my site. I quickly discounted low-quality content (spider food) and focused instead on technical reasons. I took this approach since this site isn't the most well-kept site I have, and I figured the crappy CMS + phpBB might have caused issues.
I didn't want to waste my time crawling the site myself, so I quickly downloaded all the URLs that Majestic had crawled. To my surprise, Majestic's crawler had found over 3 million URLs, when the real number is likely 30-40k and Google has about 20k indexed.
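If anyone wants to run the same kind of audit on their own export, here's a rough sketch of how I'd bucket the URL dump to spot auto-generated patterns. The filename and the "URL" column are just assumptions about what the CSV export looks like, so adjust to whatever your link-index tool actually gives you.

```python
import csv
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Rough sketch: collapse every URL down to "path (minus trailing IDs) + query
# parameter names" so millions of session/sort/page variants fall into a few
# buckets. Assumes a CSV export with a "URL" column -- adjust to your export.
pattern_counts = Counter()

with open("majestic_crawled_urls.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        parsed = urlparse(row["URL"])
        path = parsed.path.rstrip("0123456789")        # /topic123 -> /topic
        query_keys = ",".join(sorted(parse_qs(parsed.query)))
        pattern_counts[(path, query_keys)] += 1

# The buckets with wildly inflated counts are the dupe generators.
for (path, keys), count in pattern_counts.most_common(20):
    suffix = f"  ?{keys}" if keys else ""
    print(f"{count:>9}  {path}{suffix}")
```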
After scanning through the URL file I knew I had issues: massive amounts of auto-generated dupe pages from the forum and so on. By adding around 20 new lines to robots.txt I was able to block millions of pages from being crawled again.
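For the curious, the additions looked roughly like the sketch below. The exact paths are hypothetical (the real rules depend on my CMS and phpBB setup), but the idea is to block the auto-generated forum endpoints by path prefix and then sanity-check a few known-bad and known-good URLs with Python's built-in robots.txt parser before pushing the change live.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules -- the real ones depend on your CMS/forum URL scheme.
ROBOTS_TXT = """\
User-agent: *
Disallow: /forum/search.php
Disallow: /forum/memberlist.php
Disallow: /forum/posting.php
Disallow: /forum/viewonline.php
Disallow: /print/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# URLs we expect to be blocked (auto-generated dupes) vs. kept (real content).
checks = {
    "http://example.com/forum/search.php?keywords=mower": False,
    "http://example.com/forum/memberlist.php?mode=viewprofile&u=123": False,
    "http://example.com/print/some-article": False,
    "http://example.com/articles/some-article": True,
    "http://example.com/forum/viewtopic.php?t=42": True,
}

for url, should_be_crawlable in checks.items():
    crawlable = rp.can_fetch("*", url)
    status = "OK" if crawlable == should_be_crawlable else "RULE MISMATCH"
    print(f"{status}: {url} -> crawlable={crawlable}")
```

Note that the stdlib parser only does simple prefix matching rather than Google-style * wildcards, which is why this sketch blocks whole scripts and directories instead of query-string patterns; Googlebot itself does honour wildcards, so the live robots.txt can be more surgical than this.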
My logic: Ok, so now I think I've found what caused the drop. Millions of dupe pages and empty pages could have tripped the Farmer algo into thinking the site is low quality, duplicated, or just trying to feed the spiders with uselessness.
My WEAK point in this logic is that I can't prove that Google even knew about those pages (or was smart enough to ignore them). Google WMT tells me they've crawled an average of around 10k pages over the last 90 days. Given this, I'm doubting my logic and whether I've actually found the issue.
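For what it's worth, a rough back-of-envelope supports that doubt (assuming the WMT figure is the usual "pages crawled per day" average from the crawl stats chart, and that the rate held steady):

```python
# Back-of-envelope: could Google even have seen most of those URLs?
# Assumes the WMT average is pages crawled PER DAY and the rate held steady.
pages_per_day = 10_000       # average from WMT crawl stats
days = 90
majestic_urls = 3_000_000    # URLs Majestic turned up

total_fetches = pages_per_day * days        # 900,000 fetches in the window
coverage = total_fetches / majestic_urls    # ~0.3

print(f"{total_fetches:,} fetches vs {majestic_urls:,} URLs "
      f"= at most {coverage:.0%} coverage, even if every fetch hit a new URL")
```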
My next step is to see whether this gets resolved algorithmically or not. If not, I feel I have a legitimate case to submit a reinclusion request, but I'm not sure?
Since I haven't been a contributing member of this community I'm not looking to get direct help with my site, but hopefully this could spark some discussion about Farmer and maybe some flaming of my logic regarding the update.
So, would any of you have drawn the same conclusions I did? (Sweet blog bro!)
-
Good to hear that all your rankings have recovered. How have things gone over the past couple of weeks? Do you have anything you can share here, or maybe even for a YOUmoz post?
-
The site is back, all rankings recovered.
-
Hi Barry,
Thanks for bringing up some great points!
I probably should have talked about the effects the update had on the site in my OP. Well, it was a drastic drop in rankings for pretty much everything the site ranked for: at least a 20-40 position drop across the board, EXCEPT for the most commercial/highest-volume KW's, which only dropped 3-6 positions (still top 10 for the most part).
The landing pages for the commercial queries, which dropped the least, are obviously where all the paid links point (mostly from sites in my niche that are still doing fine). The informational pages that got hit the worst only rank because of natural links from great/good sites that were not affected by Farmer, plus domain authority.
I really see your and Dan's point about Google not caring about the millions of URLs, but I can't shake the feeling that it might have tripped the wire somehow.
FTR, I'm not complaining about my situation, just generally surprised that this site got hit when I have a ton of sites that deserved to be torched, while I feel this site is actually "clean". I think this is why my logic seems strange to you.
Anyway, another great reply, thanks!
-
What's actually changed for you though? Have you looked in the analytics to see what pages are no longer bringing traffic, what keywords are no longer bringing you traffic, that kind of thing?
Is it across all pages and all keywords or is it just a few high traffic keywords and pages?
Just because your niche has many link buyers doesn't mean that you're not getting penalised for it.
I don't think Google will have known or cared about those millions of pages. I assume none of them have shown up as a landing page for visitors, so I would guess they were effectively invisible to Google.
As for paid links (and indeed normal links): you may not be getting penalised for them, but if some of your highest-value links are themselves being punished or in some way devalued, you may be losing out there as well.
I assume no other significant change occurred at this time?
-
Hey Dan,
Thanks for replying! I figured the purchased links would come up, but I have pretty much discounted that since the niche is crowded with link buyers and no one got hit. I'm also active in other verticals where link buyers prosper, and I haven't seen any impact on virtually any of them.
My site's link profile is pretty vanilla in comparison to many other sites. That said, I know I can't discount the links 100% as the reason here, since I've been paying for them.
Really appreciate you taking the time to reply!
-
My first knee-jerk response to this post is to target the part about purchased links. If you have paid links, I would remedy that problem before looking at anything else.
Do you have a lot of AdSense on this site? I've been hearing left and right that a lot of sites that got hit hard were those with 5 or 6 AdSense units on one page. Excessive ads drive users crazy, so Google could be torching you for that.
As for the auto-generated dupe content pages, Google may or may not have found them. Were there links to these pages? Do you know how Majestic found them?
If you never linked to these pages, it is unlikely Google ever found them. Google tends to only crawl content that is linked or found in a sitemap. If you never had links to those dupe pages and they weren't in your sitemap, I doubt they are causing the problem. Plus, if that were the issue, you probably would have been torched long before this algo update.