Panda recovery timeframe question
-
Site was hit by Panda Aug. 22nd. Lost 90% of Google traffic.
We think we found the reason and made a few changes to the landing page structure, then submitted updated sitemaps. When can we expect an effect (if any) - within a few days, or only after the next Panda data refresh? Thank you!
P.S. What is also interesting: a similar traffic loss from Bing/Yahoo happened on exactly the same date. Does that mean Bing is "stealing" search results from Google when it can't provide its own relevant results?
-
Out of 365 days in a year, both drops happened on exactly Aug. 22nd. Hardly a coincidence.
Given that Bing/Yahoo have indexed about 100 times fewer of our pages than Google has, isn't it natural to assume that the reason may lie in some off-site factors?
-
P.S. What is also interesting: a similar traffic loss from Bing/Yahoo happened on exactly the same date. Does that mean Bing is "stealing" search results from Google when it can't provide its own relevant results?
This is highly unlikely. We were hit with Panda, but our Yahoo/Bing positions were relatively unchanged. Most likely a coincidence.
-
There was a Panda update on Aug 20, so you definitely could have been affected around that time. Generally, there are Panda updates about once a month (usually every 4-6 weeks). If you've cleaned up well, then when there is another update (perhaps end of September) you should recover.
Be sure to have a good look for duplicate content issues across the site, as well as any individual pages that carry duplicated content themselves.
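As a quick starting point, a small script can flag exact-duplicate titles and meta descriptions across a list of pages. This is just a minimal sketch - the example.com URLs are placeholders, and the regex-based parsing is a deliberate simplification, not a production crawler:

```python
import re
from collections import defaultdict

import requests

# Hypothetical list of URLs to audit - replace with your own pages,
# e.g. pulled from your sitemap.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]

def extract_tag(html, pattern):
    """Return the first regex match in the HTML, or an empty string."""
    match = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

titles = defaultdict(list)        # title text -> URLs using it
descriptions = defaultdict(list)  # meta description -> URLs using it

for url in URLS:
    html = requests.get(url, timeout=10).text
    titles[extract_tag(html, r"<title>(.*?)</title>")].append(url)
    # Simplified: assumes name comes before content in the meta tag.
    descriptions[extract_tag(
        html, r'<meta\s+name="description"\s+content="(.*?)"')].append(url)

# Report any title or description shared by more than one URL.
for label, index in (("TITLE", titles), ("DESCRIPTION", descriptions)):
    for text, urls in index.items():
        if len(urls) > 1:
            print(f"Duplicate {label}: {text!r} used on {urls}")
```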
It's a little bit strange that Bing and Yahoo dropped as well... it makes me wonder if there was something other than Panda going on. Check out this article for other possible reasons for your traffic to drop.
Good luck!
-
Ah, thank you. Of course. Corrected: happened Aug. 22nd.
-
Panda was not last week...
-
Related Questions
-
Forced Redirects/HTTP<>HTTPS 301 Question
Hi All, Sorry for what's about to be a long-ish question, but tl;dr: Has anyone else had experience with a 301 redirect at the server level between HTTP and HTTPS versions of a site in order to maintain accurate social media share counts? This is new to me and I'm wondering how common it is.

I'm having issues with this forced redirect between HTTP/HTTPS as outlined below and am struggling to find any information that will help me to troubleshoot this or better understand the situation. If anyone has any recommendations for things to try or sources to read up on, I'd appreciate it. I'm especially concerned about any issues that this may be causing at the SEO level, and the known-unknowns.

A magazine I work for recently relaunched after switching platforms from Atavist to Newspack (which runs on WordPress). Since then, we've been having some issues with 301s, but they relate to new stories that are native to our new platform/CMS and have had zero URL changes. We've always used HTTPS. Basically, any post we make on Facebook linking to the new site, including these new (non-migrated) pages, previews with "301" in the title and no image. This also overrides the social media metadata we set through Yoast Premium.

I ran some of the links through the Facebook debugger and it appears that Facebook is reading these links to our site (using HTTPS) as redirects to HTTP that then redirect back to HTTPS. I was told by our tech support person on Newspack's team that this is intentional, so that Facebook will maintain accurate share counts rather than separate counts for HTTP and HTTPS. However, this forced redirect seems to be failing if we can't post our links with any metadata. (The only way to reliably fix it is by adding a query parameter to each URL, which, obviously, still gives us inaccurate share counts.)

This is the first time I've encountered this intentional redirect approach, and I've asked a few times for more information about how it's set up just for my own edification, but all I can get is that it's something managed at the server level and is designed to prevent separate share counts for HTTP and HTTPS. Has anyone encountered this method before, and can anyone either explain it to me or point me in the direction of a resource where I can learn more about how it's configured, as well as the pros and cons? I'm especially concerned about our SEO and how this may impact the way search engines read our site. So far, nothing's come up on scans, but I'd like to stay one step ahead of this. Thanks in advance!
Technical SEO | ogiovetti
-
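One way to see exactly what Facebook's crawler sees is to trace the redirect chain yourself using its User-Agent. A minimal sketch - the article URL is a placeholder, and while facebookexternalhit is Facebook's documented crawler User-Agent, any server-level rule keyed on it is an assumption to verify against your own setup:

```python
import requests

# Hypothetical article URL - replace with one of the pages whose
# Facebook preview is broken.
URL = "https://www.example.com/some-article/"

# Facebook's link crawler identifies itself as facebookexternalhit;
# server-level redirect rules sometimes treat it differently.
HEADERS = {"User-Agent": "facebookexternalhit/1.1"}

response = requests.get(URL, headers=HEADERS, allow_redirects=True, timeout=10)

# response.history holds every redirect hop in order, so an
# HTTPS -> HTTP -> HTTPS loop shows up as two 301 entries here.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)
```

If the first hop really is HTTPS to HTTP, that's the server-level rule the support person described, and it's worth running the same trace with a normal browser User-Agent to confirm regular visitors and search engine crawlers don't get the same detour.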
Migrating to New site keywords question
We are converting an old static HTML ecommerce site to a new platform. The old site has excellent rankings for some of the products. In order to maintain our rankings we will implement 301 redirects from old to new pages (as the URLs will change to SEF).

I am using Google's Keyword Tool (in AdWords), entering each page URL of the old site (there are hundreds; I'm doing the top 50 by traffic), generating a set of keywords, and then sorting each list by global searches. For each page, the Keyword Tool is giving me hundreds of keywords, but the meta tags should hold no more than 15, so I need a method for choosing the keywords on each new page.

Question: in the new meta tags, should we emphasize the most common keywords (as defined by most global searches) or the least common keywords? I would hate to lose the good rankings for the least common (long-tail) keywords.
Technical SEO | ssaltman
-
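Whichever keywords end up in the meta tags, it's worth verifying the 301 map itself once the new site is live. A minimal sketch, assuming a hand-maintained dict of old-to-new URLs (the example.com entries are placeholders):

```python
import requests

# Hypothetical mapping of old static URLs to their new SEF URLs.
REDIRECT_MAP = {
    "https://www.example.com/product1.html": "https://www.example.com/products/product-one/",
    "https://www.example.com/product2.html": "https://www.example.com/products/product-two/",
}

for old_url, expected in REDIRECT_MAP.items():
    # Don't follow the redirect - we want to inspect the first response.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")
    if response.status_code != 301:
        print(f"WARN {old_url}: got {response.status_code}, expected 301")
    elif location != expected:
        print(f"WARN {old_url}: redirects to {location}, expected {expected}")
    else:
        print(f"OK   {old_url} -> {location}")
```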
Moving articles to new site, can't 301 redirect because of Panda
I have a site that is high quality, but it was hit by Penguin and perhaps Panda. I want to remove some of the articles from my old site and put them on my new site. I know I can't 301 redirect them because I would be passing on the bad Google vibes. So instead, I was thinking of redirecting the old articles to a page on the old site which explains that the article has moved over to the new site. I assume that's okay?

I'm wondering how long I should wait between the time I take them down from the old site and the time I repost them on the new site. Do I need to wait for Google to de-index them in order to not be considered duplicate content/syndication? We'll probably reword them a bit, too - we really want to avoid Panda. Thanks!
Phil
Technical SEO | philray
-
Domains and Hosting Question
I bought hosting for unlimited domains on GoDaddy. It's not a dedicated server; it was just $85 a year. I have unlimited bandwidth but a limited amount of "space." I don't know a lot about hosting, servers, etc.

My question is relatively simple. When I go into my hosting on GoDaddy, there is one site that shows up as hosted, and all of the other sites show up under that site in its directory. If you type the name of the site I bought the hosting package on, then a forward slash and the name of one of the other sites on the hosting package, you will actually go to the other website.

What is this relationship? Is it normal? Does that make all of my websites subdomains of the main site (the one I bought the hosting package on)? I don't fully comprehend how this affects everything...
Technical SEO | JML1179
-
Another http vs https Question?
Is it better to keep the transaction/payment pages on a commercial website as the only secure ones (HTTPS) and the remainder of the website as HTTP? Or is it better to make the entire commercial website secure (HTTPS)?
Technical SEO | sherohass
-
Question about robots.txt
I just started my own e-commerce website and hosted it on one of the popular e-commerce platforms, Pinnacle Cart. It has a lot of functions, like page sorting, a mobile website, etc. After adjusting the URL parameters in Google Webmaster Tools 3 weeks ago, I still get the same duplicate errors on meta titles and descriptions from both the Google crawl and the SEOmoz crawl.

I am not sure if I made a mistake in choosing Pinnacle Cart, because it is not that flexible in terms of editing the core website pages. There is no way to adjust the canonical or to insert robots directives on every page; however, it does have a function to submit a single robots.txt file and to edit the .htaccess. The website pages are in PHP format.

For example: www.mycompany.com has a duplicate title and description with www.mycompany.com/site-map.html (there is no way of editing the title and description of my sitemap). Another error is that www.mycompany.com has a duplicate title and description with http://www.mycompany.com/brands?url=brands.

Is it possible to exclude those URLs with "url=" and my "site-map.html" in the robots.txt? Or are the URL parameter settings in Google enough, and it just takes a lot of time? Can somebody help me with the format of robots.txt, please? Thanks
Technical SEO | paumer80
-
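For reference, blocking those two patterns in robots.txt would look something like the sketch below - assuming the "url=" parameter and site-map.html paths described above. Note that robots.txt blocks crawling, not indexing, so where the platform allows it, a canonical tag or noindex is the more direct fix for the duplicate titles:

```
User-agent: *
# Block parameterized duplicates like /brands?url=brands
# (the * wildcard is supported by Googlebot and Bingbot)
Disallow: /*?url=
# Block the HTML sitemap page that duplicates the homepage title
Disallow: /site-map.html
```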
How did eHow beat Panda?
It looks like although eHow was hit by Panda 2.0, its traffic has climbed back to previous levels. Does anyone know of an article/study that goes over what eHow did?
Technical SEO | nicole.healthline
-
De-indexing thin content & Panda--any advantage to immediate de-indexing?
We added the noindex, follow tag to our site about a week ago on several hundred URLs, and they are still in Google's index. I know de-indexing takes time, but I am wondering if having those URLs in the index will continue to "pandalize" the site. Would it be better to use the URL removal request? Or should we just wait for the noindex tags to remove the URLs from the index?
Technical SEO | nicole.healthline
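For reference, the noindex, follow directive discussed above goes in each page's head element; a minimal sketch:

```html
<!-- Tells crawlers not to index this page, but still to follow
     (and pass signals through) the links on it. -->
<meta name="robots" content="noindex, follow">
```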