Massive drop in Google traffic after upping page count 8-fold.
-
I run a book recommendation site -- Flashlight Worthy.
It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage".
It's been online for 4+ years.
Historically, it's been made up of:
- a single home page,
- ~50 "category" pages, and
- ~425 "book list" pages.
(The 50 and the 425 both started out much smaller and grew over time, but the list count has been around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight.
If an Author has more than one book on the site, the page shows every book they have on the site, such as this page:
http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805
...but the vast majority of these author pages have just one book listed, such as this page:
http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116
Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries.
And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google.
(Ok, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower and lower every day, as if Google realized it was repurposed content from elsewhere on our site...)
Here's the problem:
For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, and as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable.
And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today.
And I know this isn't just some Analytics reporting problem as Chartbeat is showing the same drop. As search is ~80% of my traffic I'm VERY eager to solve this problem...
So:
1. Do you think the drop is related to upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just the author pages that list only one book (which would be the vast majority)?
3. Have you ever heard of a situation like this? Where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books," it's better they land on a page of ours that pulls together the 4 books we have by her than on a page that happens to have just one of her books on it among 5 or 6 others by other authors.
What else?
Thanks so much, help is very appreciated.
Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime.
-
Thanks for updating us on your findings. That is interesting -- glad you got it sorted.
-
And now another update. About 1 week after removing all the new content, search traffic came right back to where it was. So clearly Google was mad at me. And now they're not. Sigh. Stupid Google.
-
UPDATE: I've removed all the new pages from my site in hopes that it will turn around my loss in search traffic. I'd still like an expert opinion on the matter in general.
-
Indeed, I looked at Webmaster Tools -- no duplicates.
As for canonical: while I know and love that feature, I don't think it's relevant here. These pages aren't different URLs for the same content -- they're segments of content taken from different pages, stitched together in a new and useful way.
If this is the problem, I think it's the fact that 95% of the new pages have only one item of content on them -- and it's content that already appears elsewhere on the site.
-
Hi Peter
I agree Matt Cutts wasn't very clear about giving a solid number, but consider what he said about relative size. "...if your site was one day... um, you know, nothing, then the next day there's 4 million pages in our index" sounds to me like he was hinting at a percentage rather than a hard number. In your case, you increased your site roughly eight-fold overnight with no new content.
From a usability standpoint it may be awesome; from an SEO standpoint it may not. I can't say for sure what the best way to handle it is, but if it were me, I would not throw away the benefit to my users. Instead, I would look at whether I could canonicalize some of these pages to lower the burden on Google of trying to differentiate one page from another.
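To illustrate the idea (purely a sketch -- the function names, fields, and URLs below are made up for the example, not anything from Peter's actual site): if an author page carries only one book, point its canonical at the list page that already has that content; if it aggregates several books, let it stand as its own canonical.

```python
# Hypothetical sketch -- names and URLs are placeholders, not Flashlight Worthy's real code.

def canonical_url(author_page_url, books):
    """Pick the canonical URL an author page should declare in its <head>.

    books: list of dicts like {"title": ..., "list_url": ...} for that author.
    """
    if len(books) >= 2:
        # The page aggregates content in a new way, so it can stand as its own canonical.
        return author_page_url
    # Thin page: defer to the list page where the single book already appears.
    return books[0]["list_url"]


def canonical_link_tag(url):
    return f'<link rel="canonical" href="{url}">'


# Example with made-up URLs:
one_book = [{"title": "Some Memoir", "list_url": "http://example.com/best-memoirs"}]
print(canonical_link_tag(
    canonical_url("http://example.com/books-by/Some-Author/0000", one_book)))
# -> <link rel="canonical" href="http://example.com/best-memoirs">
```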
Have you looked at your Google Webmaster Tools to see if Google is treating some pages as duplicates?
-
Don, thanks for replying. In answer to your questions:
-- Yes, we added all the pages to the sitemap.
-- As far as the content being unique: no, not one word on any of the pages is unique. But the aggregation of the information onto those pages is unique and helpful to the end user. For example, say you had a site full of movies that won Oscars -- the winners of 2010, all movies that won Best Director, all movies that won Best Music, etc. Now imagine you'd like to see all the Tom Hanks movies that have won Oscars. There are a number of Tom Hanks movies scattered across the lists, but there's no easy way to see them all at once. So generating a list of Tom Hanks movies that won Oscars is easy and useful. The only problem is, about 95% of the time when you generate such lists, you'll generate them for actors who were in only one Oscar-winning movie... hence a bunch of pages that are of little use. But why would that hurt traffic to all the pages that HAVE been of use for the last several years?
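To make that concrete, here's a toy sketch of the aggregation I'm describing (the data and names are invented for the example, not my real code) -- and of why most of the generated pages come out thin:

```python
# Toy data, invented for illustration only.
from collections import defaultdict

oscar_lists = {
    "Best Picture winners of 2010": ["Movie A", "Movie B"],
    "Best Director winners": ["Movie A", "Movie C"],
    "Best Music winners": ["Movie D"],
}
cast = {
    "Movie A": ["Tom Hanks"],
    "Movie B": ["Actor X"],
    "Movie C": ["Tom Hanks"],
    "Movie D": ["Actor Y"],
}

# Aggregate: every Oscar-winning movie on the site, grouped by actor.
movies_by_actor = defaultdict(set)
for movies in oscar_lists.values():
    for movie in movies:
        for actor in cast[movie]:
            movies_by_actor[actor].add(movie)

print(sorted(movies_by_actor["Tom Hanks"]))   # ['Movie A', 'Movie C'] -- a genuinely useful page
thin_pages = [a for a, m in movies_by_actor.items() if len(m) == 1]
print(thin_pages)                              # ['Actor X', 'Actor Y'] -- the ~95% one-item problem
```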
That Matt Cutts video was interesting... but I'm not sure there's a clear answer in it. He said 100+ pages at once is fine, but 10,000... maybe not. So what about 4,500?
-
Hi Peter,
According to Matt Cutts, as long as the content is quality/good/unique, you should not have been dinged.
You can watch his answer to a very similar question on YouTube here.
Now, what is interesting is that you went from ~500 pages to over 4,000 pages. That is a huge update in terms of what your site has been offering, so there may be something going on there.
Did you submit all these pages in a sitemap to Google? And, given the nature of these pages, was the content unique, or snippets of content from the inner pages?
I will add a story about how I handled a similar situation; maybe it will give you something to ponder. We have an o-ring size lookup section on our site, and the URLs being generated are dynamic and number in the thousands due to the combinations of sizes, materials, and hardnesses. I did not tell Google about these links in the sitemap; instead I just put links to the 8 main material pages in the sitemap and then let Google discover the dynamic URLs on its own.
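Roughly speaking, the sitemap only covered the handful of hub pages -- something like this sketch (the URLs are placeholders, not our real ones), with the thousands of dynamic size pages deliberately left out for Google to find through internal links:

```python
# Sketch of a hub-only sitemap -- placeholder URLs, not our actual site.
hub_pages = [
    "http://example.com/o-rings/nitrile",
    "http://example.com/o-rings/silicone",
    "http://example.com/o-rings/viton",
    # ...the other material hub pages; the dynamic size pages are intentionally omitted.
]

entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in hub_pages)
sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
print(sitemap_xml)
```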
After 6 months I noticed that Google was actually treating many of the deep pages as duplicate content, so I used rel="canonical" to direct the juice to the top material pages. Our traffic and SERP rankings went up for those pages.
I tell that story to illustrate what I learned: having more pages isn't always good. In my case, a nitrile AS568-001 o-ring page isn't that different from a nitrile AS568-002 o-ring page, and while they are certainly different sizes, you can find information on either one from the nitrile AS568 page. The smart thing I did was not flooding Google with thousands of new pages; the dumb thing I did was not canonicalizing the deep pages to begin with.
I will be interested in what others have to say on this subject, and I hope this helps.