When Pandas Attack...
-
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the algorithm seems stacked against this site's very purpose. I need some advice on what I'm planning and what else could be done.
First, the issues:
Content Length
The site is a legal reference, including a dictionary and citation lookup. Hundreds (perhaps upwards of 1,000) of pages are, by the very nature of the content, thin. For example, the acronym C.B.N.S. stands for "Common Bench Reports, New Series," part of the English Reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data. They want the definition of a term or a citation, and then they return to whatever prompted the query in the first place.
My strategy so far…
Noindex some Pages
Identify terms and citations that are really short (less than 500 characters) and put a noindex tag on them; see the sketch below. I will also remove the directory links to those pages and clean up the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
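In case it helps to automate that audit, here is a minimal sketch, assuming the pages exist as static HTML files; the directory name and the crude tag stripping are illustrative only, not a description of how duhaime.org is actually built:

```typescript
// thin-page-audit.ts: flag pages whose visible text falls under the
// 500-character threshold so they can get a noindex tag and be pulled
// from the sitemap. The directory name is a hypothetical placeholder.
import { readdirSync, readFileSync } from "fs";
import { join } from "path";

const PAGES_DIR = "./dictionary"; // hypothetical content directory
const THRESHOLD = 500;            // characters of visible text

// Crude tag stripper; good enough to approximate visible text length.
function visibleTextLength(html: string): number {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim().length;
}

for (const file of readdirSync(PAGES_DIR).filter(f => f.endsWith(".html"))) {
  const html = readFileSync(join(PAGES_DIR, file), "utf8");
  if (visibleTextLength(html) < THRESHOLD) {
    // These pages would get <meta name="robots" content="noindex">
    // in their head and be dropped from the sitemap.
    console.log(`noindex candidate: ${file}`);
  }
}
```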
Create more click incentives
We already started with related terms, and now we are looking at diagrams and images: anything to punch up the content for that ever-important second click.
Expand Content (of course)
The author will spend the next six months doing his best to expand the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages' worth. Even so, we won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. The resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX on scroll (a sketch of that follows below). Ad units will be kept at 3 per page.
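For anyone curious, something like the following would do the scroll-triggered load. This is only a sketch under assumed names; the /fragments/disclaimer.html endpoint and the #disclaimer container are hypothetical, and the real implementation would depend on the site's stack:

```typescript
// Load the boilerplate disclaimer only after the visitor first scrolls,
// keeping it out of the initial HTML that crawlers fetch. The endpoint
// and element id below are hypothetical placeholders.
function loadDisclaimerOnScroll(): void {
  const load = async (): Promise<void> => {
    window.removeEventListener("scroll", load); // fire only once
    const res = await fetch("/fragments/disclaimer.html");
    const container = document.getElementById("disclaimer");
    if (res.ok && container) {
      container.innerHTML = await res.text();
    }
  };
  window.addEventListener("scroll", load, { passive: true });
}

loadDisclaimerOnScroll();
```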
What do you think? Are the super-light citation and dictionary pages the reason site traffic is down 35% this week?
-
Traffic (and income) is now down over 55%, which is really too bad. The content is unique and highly valuable to the target market.
Any advice about why this is happening would be really appreciated.
-
All content is unique. Much of it is 10 years old.
It gets duplicated/syndicated to other sites: some legit, others we constantly fight to have removed. One site in India completely copied the site a few years ago and changed most of the links to internal addresses.
However, the owner wrote all of the non-quoted, non-referenced material.
-
"Google watches how long a person uses a page to gauge it’s value"
Perhaps, but I wouldn't stress about that metric in particular. As you correctly pointed out, a visitor who is looking for a specific item and finds it will leave a site rather quickly.

Is the content unique or duplicate?
EDIT: According to a quick check on Copyscape, your content is duplicated across other sites. You definitely need unique content as a starting point.
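For what it's worth, you can eyeball how close a suspected copy is with a simple shingle-overlap check. To be clear, this is not how Copyscape or Google measure duplication; it is only a naive illustration of the idea:

```typescript
// Rough duplicate-content check: Jaccard similarity over word 5-grams
// ("shingles"). Not Copyscape's or Google's method; just an illustration.
function shingles(text: string, size = 5): Set<string> {
  const words = text.toLowerCase().replace(/[^a-z0-9\s]/g, "").split(/\s+/);
  const out = new Set<string>();
  for (let i = 0; i + size <= words.length; i++) {
    out.add(words.slice(i, i + size).join(" "));
  }
  return out;
}

function similarity(a: string, b: string): number {
  const sa = shingles(a);
  const sb = shingles(b);
  let overlap = 0;
  for (const s of sa) if (sb.has(s)) overlap++;
  return overlap / (sa.size + sb.size - overlap); // Jaccard index, 0..1
}

// A score near 1.0 means a near-verbatim copy.
const original = "The acronym C.B.N.S. stands for Common Bench Reports, New Series, part of the English Reports.";
const copy = "The acronym C.B.N.S. stands for Common Bench Reports, New Series, part of the English Reports.";
console.log(similarity(original, copy).toFixed(2)); // prints "1.00"
```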