Panda Recovery Question
-
Dear Friends,
One of my customers was hit by Panda. We were working on improving the thin content on several pages, and the remaining poor-content pages were:
1. NOINDEX/FOLLOW (see the sketch after this list)
2. Removed from sitemap.xml
3. Un-linked from the site (no page on the site links to the poor content)
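For reference, here is a minimal sketch of how the noindex/follow directive in step 1 can be sent as an HTTP header. This assumes a Python/Flask setup with a placeholder route, not our actual code; the equivalent robots meta tag works the same way:

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/thin-page")  # placeholder route for one of the poor-content pages
def thin_page():
    resp = make_response("page body here")
    # Ask crawlers not to index this page while still following its links.
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp
```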
In conclusion, we haven't seen any improvement. My question is: should I remove the poor-content pages entirely (404)?
What is your recommendation?
Thank you for your time
Claudio
-
Thank you
-
Ivan, Panda is a page-level user experience algorithm. Ask yourself: do content pages that produce high bounce rates, low average visit durations, or "pogosticking" (users click through to the page, then immediately hit the back button and return to the search results) REALLY qualify as quality pages? The answer is no, they don't.
I would urge you to visit the list of 23 questions I linked to above and ask them of your current content. While you're on that page, take a look at this quote from Amit Singhal:
"low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content."
Directly from the horse's mouth. Can't get any clearer than that.
-
Hi, is it really true that pages with a bounce rate of 100% or an average visit of less than 30 seconds should be reviewed closely for complete removal from your site, and that even a small number of such pages can drag down an entire site algorithmically?
Thank you
-
Dear Casey,
Thank you for your prompt response. I want to share the URL http://goo.gl/4QBVjR with you; please take a look, and all your feedback is welcome.
Thank you
Claudio
-
Absolutely! Think of your site as a book. It used to be (pre-Panda) that adding new pages to your site was the right move. More pages, even low-quality pages, let your site show up for more long-tail keywords, which generated more traffic. That traffic may not have been well targeted, though, and tended to produce very high bounce rates.
Now, post-Panda, it's clear that even a SMALL amount of low-quality, thin, or poor-user-experience content will drag down your entire domain. That's how Panda works: it assesses quality at the page level, but the impact is felt across the whole site. So pruning or removing that content is definitely a consideration you must give serious thought to. Ask yourself: does your client's content answer a question, fulfill a need, or provide a unique viewpoint, all working together to deliver a quality user experience? If not, either re-write it (usually a complete waste of time) or remove it completely from your site.
When Google pushed out Panda waaaaay back in 2011 they published a list of 23 questions that site owners should be asking themselves when auditing their site for content and user experience. Read this list and take a hard look at your site and content practices with an eye to understanding how Google may see your site.
Then, I'd suggest you go into Google Analytics under Behavior, choose Site Content, then All Pages, and sort that report by Bounce Rate. Any page with a bounce rate of 100% OR an average visit of less than 30 seconds should be reviewed closely for complete removal from your site. Even a small number of these pages can drag down an entire site algorithmically.
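To make that audit concrete, here's a minimal sketch in Python (pandas) that assumes you've exported the All Pages report to a CSV. The file name, column labels, and units below are assumptions, so adjust them to match your actual export:

```python
import pandas as pd

# Assumes a CSV export of the GA "All Pages" report. The file name and
# column labels are placeholders; adjust to match your export.
df = pd.read_csv("all_pages.csv")

# "Bounce Rate" is assumed to be exported as a percentage string like "100.00%".
df["Bounce Rate"] = df["Bounce Rate"].str.rstrip("%").astype(float)

# Flag pages with a 100% bounce rate OR under 30 seconds average time on page
# (assumes "Avg. Time on Page" is expressed in seconds).
candidates = df[(df["Bounce Rate"] >= 100.0) | (df["Avg. Time on Page"] < 30)]

worst = candidates.sort_values("Bounce Rate", ascending=False)
print(worst[["Page", "Bounce Rate", "Avg. Time on Page"]])
```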
Finally, if you do remove the pages from your site, I'd suggest returning a 410 GONE status code. These seem to be processed much faster than regular 404s, and it's a clear signal to Google that those pages are NEVER coming back!
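Once those pages are pulled, a quick check along these lines (a sketch using Python's requests library; the URLs are placeholders) will confirm they really do return a 410:

```python
import requests

# Placeholder URLs for the removed pages.
removed_urls = [
    "https://example.com/thin-page-1",
    "https://example.com/thin-page-2",
]

for url in removed_urls:
    # A HEAD request is enough to read the status code; redirects are not
    # followed, so you see exactly what a crawler sees on the first hit.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(f"{url} -> {resp.status_code}")  # expecting 410 (Gone)
```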
I hope this was helpful, Claudio. Good luck with your client's site!