When Pandas Attack...
-
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the algorithm seems stacked against this site's purpose. I need some advice on what I'm planning and what else could be done.
First, the issues:
Content Length
The site is a legal reference, including a dictionary and citation lookup. Hundreds of pages (perhaps upwards of 1,000) are, by the nature of the content, thin. For example, the acronym C.B.N.S. stands for "Common Bench Reports, New Series," part of the English Reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person stays on a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data: they want the definition of a term or a citation, and then they return to whatever prompted the query in the first place.
My strategy so far…
Noindex some Pages
Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean up the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
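For anyone automating that first step, here is a minimal sketch (assuming the pages are available as raw HTML strings keyed by URL; the helper names are illustrative, not part of any real tool) of how one might flag pages whose visible text falls under the 500-character threshold:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of script/style tags."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def visible_length(html):
    """Length of the page's visible text with whitespace normalized."""
    p = TextExtractor()
    p.feed(html)
    return len(" ".join("".join(p.parts).split()))

def find_thin_pages(pages, threshold=500):
    """pages: dict of url -> raw HTML; returns URLs under the threshold."""
    return [url for url, html in pages.items() if visible_length(html) < threshold]
```

The list this returns is the candidate set for the noindex tag and for pruning from the directory links and sitemaps.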
Create more click incentives
We have already started with related terms, and now we are looking at diagrams and images: anything to punch up the content for that all-important second click.
Expand Content (of course)
The author will spend the next six months doing his best to extend the content of these short pages. In many cases (perhaps 200 pages) there are images and text to be added. Still, we won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
We are looking to lighten up the code and boilerplate content shortly; we were working on this anyway. The resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded via AJAX on scroll. Ad units will be kept to 3 per page.
What do you think? Are the super-light citation and dictionary pages the reason site traffic is down 35% this week?
-
Traffic (and income) is now down over 55%, which is really too bad. The content is unique and highly valuable to the target market.
Any advice on why this is happening would be really appreciated.
-
All content is unique. Much of it is 10 years old.
It gets duplicated/syndicated to other sites: some legitimate, others we constantly fight to have removed. One site in India completely copied the site a few years ago and changed most of the links to internal addresses.
However, the owner wrote all of the non-quoted, non-referenced material himself.
-
"Google watches how long a person uses a page to gauge it’s value"
Perhaps, but I wouldn't stress about that metric in particular. As you correctly pointed out, a visitor who is looking for a specific item and finds it will leave a site rather quickly. Is the content unique or duplicate?
EDIT: According to a quick check on Copyscape, your content is duplicated across other sites. You definitely need unique content as a starting point.
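Copyscape's exact method is proprietary, but the underlying idea of a duplicate-content check can be sketched with word-shingle overlap (all names here are illustrative; this is a toy approximation, not what any real service runs):

```python
def shingles(text, n=5):
    """Set of n-word shingles (word n-grams), lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets: 0.0 (no shared
    phrasing) up to 1.0 (identical wording)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A high score between one of your definitions and a page on another domain means Google sees the same phrasing in two places, and only one of them will be treated as the original.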