Is this traffic drop due to cutting backlinks or Penguin 2.0? (Graphs attached)
-
I've attached both graphs of the traffic drop.
Our website rankings have been steadily declining since May of 2013.
Most of our customers are return customers; otherwise the drop would have been much more severe.
There have never been any warnings in GWT (Google Webmaster Tools).
We cut a bunch (but not all) of our paid links in May of 2013. We didn't have a manual penalty or anything; we just wanted to see what would happen if we moved toward being white hat. When our rankings plummeted, we quit cutting links. We currently have about 30% paid links.
Penguin 2.0 was May 22, 2013
Looking at these graphs, was it our cutting links that caused the traffic drop, or was it Penguin 2.0? I'm looking for people who have experience diagnosing a "Unique Visits" Google Analytics graph for Penguin, and who have experience with what happens when you cut links.
Viewing the graphs, it looks like May 23 was the day the big drop happened, but you have more experience with this than I do.
Thank you.
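One way to make the "step vs. slide" distinction concrete rather than eyeballing the graph: a minimal Python sketch, assuming a daily CSV export from Google Analytics with hypothetical column names "date" and "unique_visits" (adjust to your actual export). A sharp one-day step at May 22-23 points toward Penguin 2.0; a gradual multi-week slide is more consistent with removed links slowly losing their value.

```python
import pandas as pd

# Assumes a daily CSV export from Google Analytics with hypothetical
# column names "date" and "unique_visits" -- adjust to your export.
df = pd.read_csv("unique_visits.csv", parse_dates=["date"])

PENGUIN_2_0 = pd.Timestamp("2013-05-22")
window = pd.Timedelta(days=14)

before = df[(df["date"] >= PENGUIN_2_0 - window) & (df["date"] < PENGUIN_2_0)]
after = df[(df["date"] > PENGUIN_2_0) & (df["date"] <= PENGUIN_2_0 + window)]

pct = (after["unique_visits"].mean() / before["unique_visits"].mean() - 1) * 100
print(f"mean daily unique visits, 14 days before: {before['unique_visits'].mean():.0f}")
print(f"mean daily unique visits, 14 days after:  {after['unique_visits'].mean():.0f}")
print(f"change: {pct:+.1f}%")

# A sharp one-day step right at the update date suggests an algorithmic hit;
# a gradual slide over weeks fits links losing value as they were removed.
```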
-
Locking the question and directing people back to the earlier posts.
-
I'm going to refer you back to the other two questions you asked today about the same thing, with the same graphs, which already have a bunch of answers in them:
http://moz.com/community/q/what-penalty-would-cause-this-traffic-drop-google-analytic-screenshot
http://moz.com/community/q/does-this-graph-look-like-a-panda-2-0-hit
Related Questions
-
Assessing the true value of a backlink
I want to start a discussion about assessing the true value of a backlink. Here's a scenario: I've just started working on SEO for a new client. Once I've got the strategy work out of the way, I like to start by looking at the backlinks competitors have. I use Moz OSE (and other tools) and filter by followed links to the root domain. This gives a good starting sense of where competitors are getting links from. As I explore those links, I see some black-hat (or grey-hat) practices at play: display:none links, footer links, sidebar links, comment spam, etc. The problem is that there seems to be no way of knowing whether those links are responsible for boosting the competitors' rankings. They come from sites with good DA and PA, yet we're told that tactics like display:none and comment spam will either get those links devalued or may trigger some sort of manual action. My question is: how do others evaluate the full spectrum of a link's value, beyond trust, authority, and citation flow?
White Hat / Black Hat SEO | | SEMbyotic2 -
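For the display:none part of that question, a rough heuristic sketch, not a definitive audit: it fetches a page and flags followed links sitting under an ancestor with an inline display:none style. The URL is a placeholder, and links hidden via external CSS or JavaScript will slip past it (catching those requires a headless browser).

```python
import requests
from bs4 import BeautifulSoup

def find_hidden_links(url):
    """Flag followed links sitting under an ancestor with inline display:none.

    A rough heuristic only: it misses links hidden via external CSS or
    JavaScript, which would need a headless browser to detect.
    """
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hidden = []
    for a in soup.find_all("a", href=True):
        if "nofollow" in (a.get("rel") or []):
            continue  # only followed links are in question here
        node = a
        while node is not None:  # walk up looking for an inline display:none
            style = node.get("style", "") if hasattr(node, "get") else ""
            if "display:none" in style.replace(" ", ""):
                hidden.append(a["href"])
                break
            node = node.parent
    return hidden

# Placeholder URL -- point this at a page you found in a competitor's link profile.
print(find_hidden_links("http://example.com/some-linking-page"))
```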
Tool to check Google index status for backlinks?
I would like to check which backlink URLs are indexed in Google. Is there a tool that can automate this work, or will I have to do it manually?
White Hat / Black Hat SEO | | Choice0 -
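A sketch of how this could be automated with Google's Custom Search JSON API, which is the sanctioned route (scraping google.com result pages directly violates Google's terms of service). The API key and engine ID are placeholders; you'd create both in the Google Cloud Console, with the engine configured to search the entire web.

```python
import time
import requests

API_KEY = "YOUR_API_KEY"   # placeholder -- create in the Google Cloud Console
CX = "YOUR_ENGINE_ID"      # placeholder -- a Custom Search engine set to search the whole web

def is_indexed(url):
    """Return True if a site: query for the exact URL returns any results."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": f"site:{url}"},
        timeout=10,
    )
    resp.raise_for_status()
    total = resp.json().get("searchInformation", {}).get("totalResults", "0")
    return int(total) > 0

backlink_urls = ["http://example.com/page-linking-to-you"]  # your exported list here
for url in backlink_urls:
    print(url, "->", "indexed" if is_indexed(url) else "not indexed")
    time.sleep(1)  # stay well inside the API's rate limits
```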
Ever seen this tactic when trying to get rid of bad backlinks?
I'm trying to get rid of a Google penalty, but one of the URLs involved is particularly bizarre. Here's the penalized site: http://www.travelexinsurance.com. One of the external links Google cited as unnatural is: http://content.onlineagency.com/index.aspx?site=6599&tide=769006&last=3111516 In the backlink profile of the penalized site, there are about 100 different backlinks pointing to www.travelexinsurance.com from content.onlineagency.com/... When I visit http://content.onlineagency.com/index.aspx?site=6599&tide=769006&last=3111516, it actually displays content from http://www.starmandstravel.com/787115_6599.htm, which you can see after clicking the "Home" button. That company is a legit travel agency that I assume knows nothing about content.onlineagency.com and is not involved in whatever is going on. And that's the case for every link from content.onlineagency.com. So I'm just wondering if someone can help me understand what sort of tactic content.onlineagency.com is using. I fear one of my predecessors used some black-hat tactics, and I'm wondering if this is a remnant of that effort.
White Hat / Black Hat SEO | | Patrick_G0 -
Unique meta descriptions for 2/3 of it, but then identical ending?
I'm working on an eCommerce site and had a question about my meta descriptions. I'm creating unique meta descriptions for each category and subcategory, but I'm thinking of adding the same ending to each. For example: "Unique descriptions, blah blah blah. Free Overnight Shipping...". So the "Free Overnight Shipping..." ending would be on all the categories. It's an ongoing promo, so I feel it's important for attracting buyers, but I don't want to screw up with duplicate content. Any suggestions? Thanks for your feedback!
White Hat / Black Hat SEO | | jeffbstratton0 -
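Beyond the duplicate-content angle, the practical risk with a shared ending is the promo pushing descriptions past the length Google displays. A quick sketch, using ~155 characters as a rough rule of thumb (not an official limit) and a hypothetical promo string:

```python
PROMO_SUFFIX = " Free Overnight Shipping on all orders."  # hypothetical promo text
MAX_LEN = 155  # rough SERP display cutoff -- a rule of thumb, not an official limit

def build_meta_description(unique_part):
    """Append the shared promo ending and warn when the result is likely to truncate."""
    description = unique_part.rstrip(".") + "." + PROMO_SUFFIX
    if len(description) > MAX_LEN:
        print(f"warning: {len(description)} chars, may truncate: {description[:60]}...")
    return description

print(build_meta_description(
    "Shop our full range of widgets, from budget picks to professional grade."
))
```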
80% of traffic lost overnight, Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently available only on the Swedish market. The application crawls real estate websites and collects all the listings in a single searchable application. The site has been live for a few months and saw steady growth since release, increasing by 20% weekly up to ~900 visitors per day. 3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (<address> <house type> <rooms> <area> <city>), I'm now found only on the fifth page. I suspect I have become the subject of a Google penalty. How do I get out of this mess? Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving people the ability to compare houses, a ton more data for comparing pricing and history, extra functionality the source sites do not offer, and so on. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/ So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. And if I am penalized: I'm not attempting to do anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application would you suggest, so that I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | | Hemjakt0 -
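The "~90% unique" figure is worth measuring rather than estimating, since near-duplicate content is the usual suspect when a scraping-based site drops. A small sketch that scores a source listing against the corresponding Hemjakt page using word-shingle Jaccard similarity; the 5-word shingle size is an arbitrary choice, and the two text variables are placeholders for the extracted body text of each page.

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles (k=5 is an arbitrary choice)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(text_a, text_b, k=5):
    """Jaccard similarity of two texts' shingle sets: 0.0 = disjoint, 1.0 = identical."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Placeholders: paste in the extracted body text of a source listing
# (e.g. the hemnet.se page) and the matching hemjakt.se page.
source_text = "..."
my_text = "..."
print(f"shingle overlap: {jaccard(source_text, my_text):.0%}")
```

A high overlap score across many pages would point toward a duplicate-content demotion rather than a manual penalty.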
Does this graph look like a Penguin 2.0 hit?
Hello, does the attached graph look like a Penguin 2.0 hit? Keep in mind that on our eCommerce site most purchases are from return customers. I forgot to add here that we cut a bunch of paid links in May 2013 as well. We quit cutting paid links when our rankings dropped - we thought it was the paid links. We currently have 30% paid links. Penguin 2.0 was on May 22. ga2.png
White Hat / Black Hat SEO | | BobGW0 -
Got dropped in Google rankings - tips to discover why, please
Hi guys, originally my website ranked poorly on Google. So, after signing up for Moz and following their tips, I achieved the 4th position for one of my keywords (amazing!). But a few days ago my page dropped below the first 50 pages for this same keyword, even though I didn't make any changes to it. Does anybody have tips on how I can discover/repair what happened? Thank you all in advance. Best regards Paulo
White Hat / Black Hat SEO | | phlcastro0 -
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both the 50 and the 425 started out much smaller and grew over time, but the count has been around 425 for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off for the last week, dropping lower every day, as if Google realized it was repurposed content from elsewhere on our site...) Here's the problem: For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my pagecount 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one book (which would be the vast majority)? 3. Have you ever heard of a situation like this? Where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one of her books among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉0
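On question 2, one alternative to deleting the thin pages outright (an option, not the definitive fix): keep the author pages that aggregate several books and noindex the single-book ones, so the aggregation value survives while the near-duplicate pages leave the index. A hypothetical sketch; the function name and threshold are illustrative, not the site's actual code.

```python
# Hypothetical template logic: noindex author pages listing fewer than
# MIN_BOOKS titles, since a single-book author page mostly duplicates
# the list page it was generated from. "noindex, follow" keeps the page
# out of the index while still letting crawlers follow its links.
MIN_BOOKS = 2

def robots_meta_tag(author_books):
    """Return a robots meta tag for an author page, or '' to leave it indexable."""
    if len(author_books) < MIN_BOOKS:
        return '<meta name="robots" content="noindex, follow">'
    return ""

# A one-book author page gets noindexed; a multi-book page does not.
print(robots_meta_tag(["Some Single Title"]))                       # hypothetical one-book author
print(robots_meta_tag(["Matilda", "The BFG", "Fantastic Mr Fox"]))  # e.g. the Roald Dahl page
```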