When Pandas attack...
-
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems rigged against this site's purpose. I need some advice on what I'm planning and on what else could be done.
First, the issues:
Content Length
The site is a legal reference, including a dictionary and a citation lookup. Hundreds of pages (perhaps upwards of 1,000) are, by the nature of the content, thin. For example, the acronym C.B.N.S. stands for “Common Bench Reports, New Series,” part of the English Reports. There really isn’t much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data: the definition of a term or a citation. They find it, then return to whatever prompted the query in the first place.
My strategy so far…
Noindex some Pages
Identify terms and citations whose pages are really small – less than 500 characters – and put a noindex tag on them. I will also remove the directory links to those pages and clean up the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
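If it helps, the 500-character audit can be scripted. A minimal Python sketch, assuming the cutoff is measured against visible text only (the threshold and the extraction heuristic are my assumptions, not anything Google publishes):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self._skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def visible_length(html):
    """Length of the whitespace-normalised visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join("".join(parser.chunks).split()))

def needs_noindex(html, threshold=500):
    """True if the page's visible text falls under the cutoff."""
    return visible_length(html) < threshold
```

Pages the function flags would then get `<meta name="robots" content="noindex">` added to their `<head>`.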
Create more click incentives
We have already started with related terms, and now we are looking at diagrams and images – anything to punch up the content for that ever-important second click.
Expand Content (of course)
The author will spend the next six months doing his best to expand the content of these short pages. In many cases – perhaps 200 pages – there are images and text to be added. Still, we won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
We are looking to lighten up the code and boilerplate content shortly; we were working on this anyway. The resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded via AJAX on scroll. Ad units will be kept to 3 per page.
What do you think? Are the super-light citation and dictionary pages the reason site traffic is down 35% this week?
-
Traffic (and income) is now down over 55%, which is really too bad. The content is unique and highly valuable to the target market.
Any advice on why this is happening would be really appreciated.
-
All content is unique. Much of it is 10 years old.
It gets duplicated/syndicated to other sites: some legitimate, others we constantly fight to have removed. One site in India completely copied this site a few years ago and changed most of the links to internal addresses.
However, the owner wrote all of the material that is not quoted or referenced.
-
"Google watches how long a person uses a page to gauge its value"
Perhaps, but I wouldn't stress about that metric in particular. As you correctly pointed out, a visitor who is looking for a specific item and finds it will leave a site rather quickly. Is the content unique or duplicate?
EDIT: According to a quick check on Copyscape, your content is duplicated across other sites. You definitely need unique content as a starting point.
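For anyone wanting to spot-check duplication without Copyscape, a rough word-shingle (Jaccard) comparison can be run locally on two pages' text. A sketch; the shingle size k=4 is an arbitrary choice, not a known threshold:

```python
def shingles(text, k=4):
    """Set of k-word shingles of a document, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(doc_a, doc_b, k=4):
    """Jaccard similarity of the two documents' shingle sets (0.0 to 1.0)."""
    a, b = shingles(doc_a, k), shingles(doc_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A score near 1.0 means the two texts are near-verbatim copies; near 0.0 means little shared phrasing.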
Related Questions
-
Are backlinks wasted when the anchor text or surrounding content doesn't match the page content?
Hi Community, I have seen a number of backlinks where the content around the link doesn't match the target page's content. For example, page A links to page B, but the content isn't really relevant beyond the brand name – say, a page about "vertigo tiles" linking to a page about "vertigo paints", where "vertigo" is the brand name. Are these kinds of backlinks completely wasted? I have also found some broken links that I'm planning to redirect to existing pages just to reclaim the backlinks, even though the content relevancy is low beyond the brand name. Are these backlinks beneficial or not? Thanks
Algorithm Updates | vtmoz0
-
Are you seeing 404s from utililab.mysearchguardian.com?
I've been noticing a lot of 404s popping up in my Google Webmaster Tools accounts coming from utililab.mysearchguardian.com. Utililab itself seems to be some sort of malware, but why is Google indexing it and sending 404s?
Algorithm Updates | EthanThompson0
-
Panda...Should I consolidate...Like this...
I'm torn. Many of our 'niche' ecommerce products rank OK, but I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that recurs across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' – sort of like a shirt selling in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given the outline of how Panda works, I believe this structure could be compromising our overall Panda 'quality score' and consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
Algorithm Updates | saultienut0
-
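On the mechanics of consolidation: once near-identical product pages are merged, each retired URL should 301 to the consolidated page. A Python sketch that emits Apache-style redirect rules; the paths are illustrative placeholders, not the site's real URLs:

```python
# Hypothetical old URLs for three near-identical driver pages,
# all pointing at one consolidated page (paths are made up).
redirect_map = {
    "/buckblock-350ma": "/buckblock-constant-current-led-drivers",
    "/buckblock-700ma": "/buckblock-constant-current-led-drivers",
    "/buckblock-1000ma": "/buckblock-constant-current-led-drivers",
}

def apache_rules(mapping):
    """One Apache 'Redirect 301 old new' line per retired URL."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in apache_rules(redirect_map):
    print(rule)
```

The generated lines drop into an Apache config or .htaccess file; other servers have equivalent directives.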
What's the correct format when you disavow a single page? With or without www?
Hi y'all. I can't seem to find an article on disavowing a single page. Do I use A, B, or submit both A and B? Example: A. http://disavowexample.com B. http://www.disavowexample.com Which one does Google prefer? I know for some I just find the canonical URL of the page (which shows www), but I wanted your expert advice! Thanks
Algorithm Updates | Shawn1240
-
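For reference, Google's disavow file is a plain-text list, one URL or domain per line, with # for comments. A sketch with placeholder URLs – when in doubt about www vs non-www, listing both exact URLs (or using the domain: directive) covers both cases:

```
# Disavow a single page: list the full URL exactly as it appears
# in your links report (with or without www, matching the actual link)
http://disavowexample.com/spammy-page.html
http://www.disavowexample.com/spammy-page.html

# Disavow every link from a host, both www and non-www:
domain:disavowexample.com
```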
Understanding Google's PMD (Partial Matching Domain) policy...
Hi, if your business name contains keywords, is that an issue? Some companies have keyword-based brand names... So what is Google's policy regarding EMDs and PMDs? What happens when the company name has a keyword in it? If anyone could help clarify, I would appreciate it. Thanks, Ben
Algorithm Updates | | bjs20100 -
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello, fellow internet-goers! As a disclaimer, I have been following a number of discussions, articles, posts, etc., trying to find a solution to this problem, but have yet to find anything conclusive, so I am reaching out to the community for help. Before I get into the questions, some background: I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites, holding hundreds of thousands of category and product pages. Needless to say, April was a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (as if it had never been "filtered"); however, the other two sites remain under Google's thumb. Now for some questions: Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content? Is it a coincidence that it was an exact 30-day "filter"? Why has only one site recovered?
Algorithm Updates | WEB-IRS1
-
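For readers following along, the cross-domain rel="canonical" fix mentioned above amounts to one line in the head of each duplicated page, pointing at the copy that should rank (the URL here is a placeholder):

```html
<!-- placed in the <head> of every duplicate product page -->
<link rel="canonical" href="http://www.example-main-site.com/widget-product-page" />
```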
Regarding Google Panda: would it be wise to use automatically generated content when there is no content?
Hi guys, I am currently creating a local business directory. When we first launch, there will be a lot of business listings without a description until the owner of the business comes to submit one. So when a business listing has no description, would it be better to have an automatically generated description like this one?
www.startlocal.com.au/retail/books/tas_hobartandsouth/Scene_Magazine_2797040.html
The auto-generated description for the listing on that page is:
Scene Magazine is a business that is based in Kingston, 7050, TAS: Hobart And South. Scene Magazine is listed in 2 categories including: Magazines and Periodicals Shops and Book Stores and Shops. Within the Magazines and Periodicals Shops category there are 5 businesses within 25 km of Scene Magazine. Some of those businesses included within the radius of 25 km are Island Magazine, Artemis Publishing Consultants and Bride Tasmania Magazine.
Would Google Panda penalise this, and would it be wise to use this auto-generated content when there is no description for a business?
Algorithm Updates | usaccess608
-
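The quoted description follows a fixed fill-in-the-blanks template, which is exactly what makes it look thin to Panda: every page differs only in the slotted values. A Python sketch of how such a template works (the field names are my guesses at the directory's data, not its real schema):

```python
def auto_description(name, suburb, postcode, region, categories):
    """Fill the directory's fixed sentence template from listing fields.
    (Field names are guesses; the template mirrors the quoted example.)"""
    cats = " and ".join(categories)
    return (f"{name} is a business that is based in {suburb}, {postcode}, "
            f"{region}. {name} is listed in {len(categories)} categories "
            f"including: {cats}.")
```

Thousands of pages generated this way share nearly all of their visible text, which is the duplication pattern Panda targets.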
Content below the fold and Panda Update
Hi, I was at the LinkLove conference and I heard some worrying stories about the way content is formatted on a page being a ranking factor, and about how eHow has avoided being slapped. It was the first time I had heard the expression "below the fold." I am producing some very sexy SERP results, and other sexier metrics are up too, but I am concerned that thefurnituremarket.co.uk has a ton of images on the home page and the nice content sits below all of them. Firstly, is this content "below the fold"? Secondly, I know the site is old, but do you think that when this Panda update hits the UK we will be penalised for the look of the site? I know there was talk yesterday at the conference of coming up with a tool to check this out. My gut says this will be a factor sooner rather than later, hence I am looking at Magento and how we can skin it to look nicer and present products better. I would be really interested to know what exactly is "below the fold" on thefurnituremarket.co.uk, and to hear some thoughts on the whole eHow formatting issue.
Algorithm Updates | robertrRSwalters0