Is it Panda, Penguin, or a penalty?
-
I'm trying to figure out why my Google traffic is going down.
I see that back in February and then March 2011 it started to drop, which I assume was Penguin.
I saw a gradual comeback in traffic until March 2012; I assume the second drop was another Penguin update.
The decline continued gradually until I saw a big drop in October 2012, and in the past month traffic has dropped off completely.
I recreated my website on WordPress, improving the content and removing Google ads. I relaunched a few weeks ago and still see a big drop.
Any idea what happened? The only messages I got from Google were about a large traffic drop in March 2012 and, recently, an increase in 404 errors when I launched the new site, which I fixed with 301 redirects and by removing media attachment pages that had been indexed and returned 404s.
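For reference, 301 redirects like the ones described can be set up in an Apache .htaccess file (this assumes an Apache host; the paths below are made-up examples, not taken from the site):

```apache
# Permanently (301) redirect an old URL to its new home
Redirect 301 /old-category/some-page http://www.dashinfashion.com/new-category/some-page

# Illustrative pattern: send indexed media attachment pages back to their parent page
RedirectMatch 301 ^/(.*)/attachment/.*$ http://www.dashinfashion.com/$1/
```

On WordPress specifically, a redirection plugin can manage the same mappings without editing .htaccess by hand.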
One concern is that I have no idea whether I have a problem with Penguin. Could I have a problem with too many links coming from my blog or social network? What's an acceptable number of backlinks before it looks like spam? If you add pages to the blogroll, is this treated as spam by Penguin?
website: http://www.dashinfashion.com
Thanks for your help!
-
Thanks for the quick response... I'm starting to feel like I'm on the way to the zoo.
I also posted this on the Google Webmaster forum and two issues came up:
-
The need to nofollow outgoing links (affiliate links, or ones that look paid)
-
A big increase in backlinks since early March, found using this tool:
Mainly from my network and blog (I'm not sure why this happened; I did update links to the new URLs on the new site). Does SEOmoz have a tool to see the history of backlinks in a graph?
- The increase in backlinks prompted the recommendation to nofollow links from my blog and network.
So my big question is the nofollow outgoing link issue. I run a kids' fashion magazine and have pages on each designer - approximately 450. Most of these links are not paid, as they are a resource for our readers. So my question is: should I add rel="nofollow" to most of these links, all of these links, or only those that are affiliate links? (My affiliate links will be set up as redirects from an internal link.)
Also, should all of the links from my external blog and network back to the main site be rel="nofollow"? If not, is this considered spam / a Penguin issue?
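As an aside, an audit like the one being debated can be sketched in a few lines of Python using only the standard library. This is a minimal illustration, not a full crawler: the affiliate domains here are hypothetical placeholders, and a real audit would fetch the live pages rather than a string.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical examples of domains treated as affiliate/paid
AFFILIATE_DOMAINS = {"shareasale.com", "affiliates.example.com"}
OWN_DOMAIN = "www.dashinfashion.com"

class LinkAuditor(HTMLParser):
    """Collects external affiliate links that are missing rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "") or ""
        rel = attrs.get("rel", "") or ""
        host = urlparse(href).netloc
        # Flag external affiliate links that do not carry nofollow
        if host and host != OWN_DOMAIN and host in AFFILIATE_DOMAINS and "nofollow" not in rel:
            self.flagged.append(href)

sample_html = '''
<a href="http://shareasale.com/r.cfm?b=1">Buy</a>
<a rel="nofollow" href="http://shareasale.com/r.cfm?b=2">Buy (already nofollowed)</a>
<a href="http://designer.example.org/">Designer resource page</a>
'''

auditor = LinkAuditor()
auditor.feed(sample_html)
print(auditor.flagged)  # only the first affiliate link is flagged
```

Links to designer resource pages (the third link above) are left untouched, which matches the usual advice that genuine editorial resource links do not need nofollow.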
Thanks so much!
-
-
Dash,
I am thinking this is Aardvark at play and not the other animals. OK, just messin' wit ya.
First, I suggest using SEOmoz's Algorithm Change History to check your dates. Feb/March 2011 certainly appears to be Panda/Farmer, and given that such a large percentage of queries was hit, you were likely in that group. So just check the changes in traffic against the updates and you will have clues.
Now, with the new site, 404 issues, etc., you may be trying to regain lost ground a bit too quickly for reality. Here is a good article by Jason DeMers about "fixing" these issues.
Next, focus on the content you have on the site and continue to add newer and better content. Stay away from too many overly optimized links with anchor text issues.
I would certainly counsel that you focus on content instead of penguins, etc.
Best
Related Questions
-
Added Schema and Rankings Went Down
Hello - We launched a schema plugin for our WordPress site to mark up our blog posts as articles and our main page as an organization. The day after, we saw a dramatic decrease in keyword rankings, but our website health improved with Google. Any thoughts on what could be causing this?
Technical SEO | Erin_IAN
-
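For context, organization markup of the kind such plugins emit is typically a JSON-LD block in the page head. A minimal illustrative example (the name and URLs are placeholders, not taken from the question):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Malformed or duplicated markup can be checked with Google's structured data testing tools before suspecting the markup itself of causing ranking changes.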
Does adding a noindex tag reduce duplicate content?
I've been working under the assumption for some time that if I have two (or more) pages which are very similar, I can add a noindex tag to the pages I don't need, and that will reduce duplicate content. As far as I know, this removes the tagged pages from Google's index and stops any potential issues with duplicate content. It's the second part of that assumption that I'm now questioning. Despite pages having the noindex tag, they continue to appear in Google Search Console as duplicate content, soft 404s, etc. That is, new pages that I know to have the noindex tag are appearing regularly. My thought so far is that Google can still crawl these pages (although it won't index them), so it shows them in GSC due to a crude issue-flagging process. I mainly want to know: a) Is the actual Google algorithm sophisticated enough to ignore these pages even though GSC doesn't? b) How do I explain this to a client?
Technical SEO | ChrisJFoster
-
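For reference, the noindex directive under discussion takes either of two standard forms; note that Google must be able to crawl the page (i.e., it must not be blocked in robots.txt) to see the tag at all:

```html
<!-- In the <head> of each page to be kept out of the index -->
<meta name="robots" content="noindex">
```

The equivalent can also be sent as an HTTP response header (`X-Robots-Tag: noindex`), which is useful for non-HTML files such as PDFs.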
Is Panda as aggressive as Penguin in terms of being able to escape its clutches ?
Hi, is being hit by Panda as hard to get out of as being hit by Penguin? Or if you clean up all your content, should you get out of it relatively quickly?
I have a very old (11 years) and established site (but also a very neglected site that I'm looking to relaunch), but it's on an ancient shopping cart platform which never allowed for Google Analytics and GWT integration, so I can't see any messages in GWT or look at traffic figures to correlate a drop with any Panda updates. The reason I ask is that I want to relaunch the site after bringing it up to date with a modern e-commerce platform.
I originally launched the site in early 2002 and it was received well by Google, achieving first-field-of-view SERPs for all targeted keywords, however competitive, including 'ipod accessories', 'data storage', etc. These top positions (and resulting sales) lasted until about 2007, when it was overtaken by bigger-brand competitors with more advanced and Google-friendlier e-commerce platforms (and big SEO budgets).
I originally used the manufacturers' descriptions, editing them slightly but probably not enough to avoid being considered duplicate content, although the site still managed to obtain good rankings for these pages for a very long time, even ranking ahead of Amazon in most cases. The site is still ranking well for some keywords relating to products that still have descriptions copied from manufacturers, so I actually don't think I have been hit by Panda.
So my question is: is there any way of finding out for sure whether the site has been hit by Panda at all, without looking at analytics and GWT? And once I find out, one way or the other:
Is it best to relaunch on the same domain to take advantage of the 11-year-old domain history/authority? So long as I make sure all product descriptions are unique, should the site escape Panda's clutches quite quickly if it has been hit?
**OR**
Is Panda as aggressive as Penguin, in which case is it best to start again on a new domain?
Many Thanks Dan
Technical SEO | Dan-Lawrence
-
Would adding an SSL certificate help my website?
SSL certificates can obviously be used as a ranking factor by Google, but would a site with no particular need for an SSL certificate notice a gain by adding one? Is it possible to demonstrate you have an SSL certificate without having some https pages on your site?
Technical SEO | sthompson
-
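For anyone going ahead with this, the usual approach after installing a certificate is to redirect all plain-HTTP traffic to HTTPS so only one version is indexed. A minimal sketch, assuming an nginx server (the server name is a placeholder):

```nginx
# Redirect every HTTP request to its HTTPS equivalent with a 301
server {
    listen 80;
    server_name www.example.com;
    return 301 https://$host$request_uri;
}
```

Without a redirect like this, the http and https versions of each page can coexist and compete with each other.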
Best Practices for Adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an e-commerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or a special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are two approaches I was considering:
1. Just include the dynamic product URLs within the same sitemap as the static URLs, using only http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl it.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs
-
Does adding a YouTube video to a page decrease site speed?
If you embed a YouTube video on your page, does Google count that as part of their site speed calculation? Since it is in an iframe, I would think that it is not counted.
Technical SEO | ProjectLabs
-
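For reference, a standard YouTube embed looks like the snippet below. The `loading="lazy"` attribute is a later browser feature, shown here as an illustrative option: it defers fetching the player until the iframe nears the viewport, which reduces the embed's impact on initial page load (the video ID is a placeholder):

```html
<!-- loading="lazy" defers loading the player until it is about to be visible -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/VIDEO_ID"
        title="Product video"
        loading="lazy"></iframe>
```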
Dramatic Decrease in Google Organic Traffic Indicates a Penalty, But None Found
So we've been having some difficulty with one of our websites since we split it in half and moved one section of content to a new domain with a new name at the end of May. http://www.dialtosave.co.uk/mobile/ was moved to http://www.somobile.co.uk
In the following six weeks, the Google organic traffic has fallen to minuscule levels that seem to indicate a more serious issue than just low ranking. Initially, when the site was moved, the 301s transferred the authority very quickly and the new website pages ranked well. Now, some of them simply won't rank at all unless you include the name of the website, "somobile". Here are some current rankings that indicate an issue:
"somobile" - 1
"somobile mobile phones" - not in top 50
These are some of the terms we used to rank in the top 10 for on Google UK, and still do on Bing UK, but don't rank in the top 50 on Google UK now:
samsung galaxy ace
apple iphone 5 deals
samsung tocco icon
Our Webmaster Central account says that only 30% of the pages in our sitemap are in the index. It seems like a penalty has been imposed, but our reconsideration request (just submitted because it seemed like a sensible next step) came back saying no manual actions had been taken. Can you see what might be causing the problem for us? I would have thought it was the new domain (with fewer direct links and less brand credibility), or content issues, but I would have expected that to just reduce the rankings by a few places rather than hide the pages altogether.
Technical SEO | purpleindigo
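As an illustration of the mechanical side of a section move like this, the old-to-new 301 map can be generated and sanity-checked programmatically. A minimal Python sketch - the domains come from the question, but the paths are hypothetical examples:

```python
# Map URLs from the old /mobile/ section onto the new domain
OLD_PREFIX = "http://www.dialtosave.co.uk/mobile"
NEW_DOMAIN = "http://www.somobile.co.uk"

def map_url(old_url):
    """Return the new-domain URL for a page in the moved section."""
    if not old_url.startswith(OLD_PREFIX):
        raise ValueError("URL is not part of the moved section: " + old_url)
    return NEW_DOMAIN + old_url[len(OLD_PREFIX):]

old_urls = [
    "http://www.dialtosave.co.uk/mobile/samsung-galaxy-ace",
    "http://www.dialtosave.co.uk/mobile/apple-iphone-5-deals",
]
redirect_map = {u: map_url(u) for u in old_urls}
for old, new in redirect_map.items():
    print(old, "->", new)
```

Checking every old URL maps to a live (200-status) target on the new domain helps rule out broken 301 chains as the cause of the indexing drop.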