Massive drop in Google traffic after upping page count 8-fold.
-
I run a book recommendation site -- Flashlight Worthy.
It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage".
It's been online for 4+ years.
Historically, it's been made up of:
- a single home page,
- ~50 "category" pages, and
- ~425 "book list" pages.
(Both the 50 and the 425 started out much smaller and grew over time, but the list count has sat at around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight.
If an author has more than one book on the site, their page shows every one of their books we list, such as this page:
http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805
...but the vast majority of these author pages have just one book listed, such as this page:
http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116
Obviously we did this as an SEO play -- our content was already drawing ~1,000 search entries a day across a wide variety of queries, so we figured we might as well create pages that would make natural landing pages for an even broader array of queries.
And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google.
(OK, it peaked at ~100 and dropped to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower every day, as if Google realized it was repurposed content from elsewhere on our site...)
Here's the problem:
For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, and as the GoodReads behemoth stole some traffic -- but by and large, traffic was VERY stable.
And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we had ~300 on both Saturday and Sunday, and it looks like we'll see a similar number today.
And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. Since search is ~80% of my traffic, I'm VERY eager to solve this problem...
So:
1. Do you think the drop is related to upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just the pages that list only a single book (which would be the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books," it's better they land on a page of ours that pulls together the 4 books of hers we carry than on a page that happens to include just one of her books among 5 or 6 by other authors.
What else?
Thanks so much -- any help is very much appreciated.
Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime.
-
Thanks for updating us on your findings. That is interesting, but I'm glad you got it sorted.
-
And now another update. About 1 week after removing all the new content, search traffic came right back to where it was. So clearly Google was mad at me. And now they're not. Sigh. Stupid Google.
-
UPDATE: I've removed all the new pages from my site in hopes that it will turn around my loss in search traffic. I'd still like an expert opinion on the matter in general.
-
Indeed, I looked at Webmaster Tools -- no duplicates.
As for canonical tags: while I know and love that feature, I don't think it's relevant here. These pages aren't different URLs for the same content -- they're segments of content taken from different pages, stitched together in a new and useful way.
If this is the problem, I think it's because 95% of the new pages have only one item of content on them -- and it's a piece of content that appears elsewhere on the site.
-
Hi Peter
I agree Matt Cutts wasn't very clear about providing a solid number, but consider what he said in relative terms. "..if your site was 1 day .. um you know nothing, then the next day there is 4 million pages in our index" seems to me like he was hinting at a percentage rather than a hard number. In your case you increased your site roughly 8-fold overnight with no new content.
From a usability standpoint it may be awesome; from an SEO standpoint it may not be. I can't say for sure what the best way to handle it is, but if it were me, I would not throw away the benefit to my users. Instead, I would look at whether I could canonicalize some of these pages to lower the burden on Google of trying to differentiate one page from another.
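To make that concrete -- and this is only a sketch, not something I've tested against your templates -- a thin author page could declare the richer page its one book already lives on as its canonical version. The list URL below is made up; you'd point it at whichever real list features the book:

    <!-- In the <head> of a thin author page, e.g. /books-by/Barbara-Kilarski/2116 -->
    <!-- The href is hypothetical; use the list page that actually features the book -->
    <link rel="canonical" href="http://www.flashlightworthybooks.com/best-books/some-list" />

Just remember rel="canonical" is a hint, not a directive, and it effectively asks Google to index the target page instead of the thin one -- which keeps the author pages useful for visitors while taking them out of the duplicate-content equation.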
Have you looked at your Google Webmaster Tools to see if Google is reporting some of these pages as duplicates?
-
Don, thanks for replying. In answer to your questions:
-- Yes, we added all the pages to the sitemap.
-- As far as the content being unique: no, not one word on any of the pages is unique. But the aggregation of the information onto those pages is unique and helpful to the end user. For example, say you had a site full of movies that won Oscars -- winners of 2010, all movies that won Best Director, all movies that won Best Music, etc. Now imagine you'd like to see all the Tom Hanks movies that have won Oscars. There are a number of Tom Hanks movies scattered across the lists, but there's no easy way to see them all at once. So generating a list of Tom Hanks movies that won Oscars is easy and useful. The only problem is, about 95% of the time you'll be generating such lists for actors who were in just 1 Oscar-winning movie... hence a bunch of pages that are of little use. But why would that hurt traffic to all the pages that HAVE been of use for the last several years?
That Matt Cutts video was interesting... but I'm not sure there's a clear answer in it. He said 100+ pages at once is fine, but 10,000... maybe not. So what about 4,500?
-
Hi Peter,
According to Matt Cutts, as long as the content is quality/good/unique, you should not have been dinged.
You can watch his answer to a very similar question on YouTube here.
Now what is interesting is that you went from 500 pages to 4,000 pages. That is a huge update in terms of what your site has been offering, so there may be something going on there.
Did you submit all these pages in a sitemap to Google? And by the nature of these pages, was the content unique, or snippets of the inner content?
I will add a story about how I handled a similar situation; maybe it will give you something to ponder. We have an o-ring size lookup section on our site; the URLs being generated are dynamic and number in the thousands, due to the combinations of sizes, materials, and hardness. I did not tell Google about these links in the sitemap -- I just put links to the 8 main material pages in the sitemap and then let Google discover the dynamic URLs on its own.
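For illustration, the sitemap looked something like this (example.com and the specific material pages are placeholders, not our real URLs):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Only the 8 top-level material pages are listed; the thousands of
           dynamic size/material/hardness URLs are left for Google to discover -->
      <url><loc>http://www.example.com/o-rings/nitrile/</loc></url>
      <url><loc>http://www.example.com/o-rings/silicone/</loc></url>
      <url><loc>http://www.example.com/o-rings/viton/</loc></url>
      <!-- ...and 5 more material pages... -->
    </urlset>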
After 6 months I noticed that Google was actually treating many of the deep pages as duplicate content, so I used rel="canonical" to direct the juice to the top material pages. Our traffic and SERP rankings went up for these pages.
I tell that story to illustrate what I learned: having more pages isn't always good. In my case, a nitrile AS568-001 o-ring page isn't that different from a nitrile AS568-002 o-ring page, and while they are certainly different sizes, you can find information on either one from the nitrile AS568 page. The smart thing I did was not flooding Google with thousands of new pages; the dumb thing I did was not canonicalizing the deep pages to begin with.
I will be interested in what others have to say on this subject, and I hope this helps.