They are not treated as the same.
In fact, use of sub-domains is one way to combat Panda by shifting low value content off the main site.
http://www.conversationmarketing.com/2011/07/wsj-wtf-google-panda-subdomains.htm
Ha, true enough! My dyslexia got the better of my math again!
8/22/2011 is 6 days from now!
Check out the companies in the CRM and Email Marketing panes as a start...
http://www.layeredi.com/sites/grange.drupalgardens.com/files/marketing_technology_landscape.jpg
I know MailChimp and Salesforce have JS widgets. I like Campaign Monitor myself; I'd bet many of the others do too.
If you are moving from Tomcat, might be a good time to evaluate a CMS too.
I'd like some feedback on what would be a Panda factor(s) on http://www.duhaime.org
The site got hit fairly hard by Panda (60% drop in traffic). Since then we have:
The site, by nature of being a legal reference, contains many small, single-topic pages. The authority and professionalism of the site demand that it not allow much UGC. (The law is not Social Media.)
Unfortunately, this means the bulk of the pages (1,000+) are fairly formulaic; to format them otherwise would diminish their value to the user. This doesn't hurt many individual pages, as shown by the dominance the "Without Prejudice" page enjoys.
In particular, the citations section was very limited, as there is not much one can say about the 10,000+ law reports in the world. Recognizing this as valuable to lawyers but a likely "low value" target of Panda, we moved it to a subdomain and requested that the old directory be removed from the index. This was done on July 28th.
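Moving a section to a subdomain usually pairs with permanent redirects so existing inbound links still resolve. A minimal .htaccess sketch, assuming Apache with mod_rewrite; the directory and subdomain names here are illustrative, not the site's actual paths:

```apache
# Sketch: 301-redirect the old citations directory to the new subdomain.
# Assumes Apache with mod_rewrite enabled; path and hostname are hypothetical.
RewriteEngine On
RewriteRule ^citations/(.*)$ http://citations.duhaime.org/$1 [R=301,L]
```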
Now...
I'd like some opinions on anything else that might be holding the site back.
Thank you for your time.
Traffic (and income) is now down over 55% which is really too bad. The content is unique and highly valuable to the target market.
Any advice about why would be really appreciated.
All content is unique. Much of it is 10 years old.
It gets duplicated/syndicated to other sites: some legit, others we constantly fight to have removed. One site in India completely copied the site a few years ago and changed most of the links to internal addresses.
However, the owner wrote all of the non-quote or referenced material.
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems stacked against this site's purpose. I need some advice on what I'm planning and what else could be done.
First, the issues:
Content Length
The site is a legal reference including dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages, by virtue of the content, are thin. For example, the acronym C.B.N.S. stands for "Common Bench Reports, New Series," a part of the English Reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data: they want the definition of a term or citation, then they return to whatever prompted the query in the first place.
My strategy so far…
Noindex some Pages
Identify terms and citations that are really small – less than 500 characters – and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
Create more click incentives
We already started with related terms and now we are looking at diagrams and images – anything to punch up the content for that ever-important second click.
Expand Content (of course)
The author will spend the next six months doing his best to extend the content of these short pages. There are images and text to be added in many cases – perhaps 200 pages. Still, we won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded via AJAX on scroll. Ad units will be kept at 3 per page.
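The load-on-scroll disclaimer could look roughly like this. The endpoint URL and element id are hypothetical, and I've used the modern fetch API for brevity; in practice (especially circa 2011) you'd use XMLHttpRequest or a jQuery `$.get`:

```javascript
// Sketch: load the site-wide disclaimer only after the visitor scrolls,
// keeping that boilerplate out of the initial page body.
// "/disclaimer.html" and the "disclaimer" element id are hypothetical.
function shouldLoadDisclaimer(scrollY, alreadyLoaded, threshold = 100) {
  // Load once, and only after the reader has actually scrolled a bit.
  return !alreadyLoaded && scrollY > threshold;
}

if (typeof window !== "undefined") {
  let loaded = false;
  window.addEventListener("scroll", () => {
    if (!shouldLoadDisclaimer(window.scrollY, loaded)) return;
    loaded = true;
    fetch("/disclaimer.html") // hypothetical endpoint
      .then((res) => res.text())
      .then((html) => {
        document.getElementById("disclaimer").innerHTML = html;
      });
  });
}
```

Keeping the trigger logic in a small pure function makes it easy to test without a browser, and the one-shot `loaded` flag avoids re-fetching on every scroll event.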
What do you think? Are the super light pages of the citations and dictionary why site traffic is down 35% this week?