From your perspective, what's wrong with this site such that it has a Panda Penalty?
-
For more background, please see:
http://www.seomoz.org/q/advice-regarding-panda
http://www.seomoz.org/q/when-panda-s-attack
(hoping the third time's the charm here)
-
It's cool; your previous questions didn't really get answered. And my answer was posted twice, so the one above is the edited version. Whoops!
-
- Light content is an issue for many pages by the nature of the content. This is why we moved the entire Citations section to a subdomain. Combining them would be near impossible without diminishing the value to the human visitors - lawyers rarely have time to wade through arbitrary lists. I really can't think of a way to combine the pages in a meaningful way.
We have combined other areas, such as the law quotations, and I will search for more candidates.
I will note that pages below a certain character threshold now have a noindex tag on them.
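For anyone following along, the standard form of that tag (placed in each thin page's head) is:

```html
<!-- Robots meta tag: asks search engines not to index this page -->
<meta name="robots" content="noindex">
```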
-
See above.
-
Actually, the pages have around 35 links per page as Googlebot sees them. The menu and the footer are loaded via AJAX after the visitor interacts with the site. The home page is an anomaly.
-
Hehe, caught me.
It's just that duplicate content isn't that big a factor for Panda, as far as I can see. Panda appears to focus on the quality of the content (as judged by humans in a study).
It may well be hurting the site in general however.
-
Speaking of duplicate content...
-
I imagine there are a few potential causes:
1. Light content. You can fix this by combining the pages for related terms, and using anchor tags to point the user down to the section they want. On your front page, include more of each post - right now the intro blurb seems to be only several words long.
2. Duplicated widely. You mentioned this in another question, and I'm not sure what else to do here. You're already using rel=canonical, which would have been my advice.
3. Tons of links on every page. Your footer has a ton of links, and the menu is quite large to begin with. Consider removing most or all of those footer links.
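As a rough sketch of point 1 (the term names and definitions here are made up for illustration), a combined page with anchor links might look like:

```html
<!-- One page per letter instead of one thin page per term:
     a table of contents links down to each definition's anchor -->
<h1>Legal Dictionary: A</h1>
<ul>
  <li><a href="#abatement">Abatement</a></li>
  <li><a href="#abscond">Abscond</a></li>
</ul>

<h2 id="abatement">Abatement</h2>
<p>Definition of the term goes here...</p>

<h2 id="abscond">Abscond</h2>
<p>Definition of the term goes here...</p>
```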
Best of luck!
-
The site is a legal dictionary and reference, so literally thousands of legal definitions, topics, and terms.
A targeted case could be made for "Legal Dictionary", and the site still gets OK results from that search. It was much better before Panda, though - traffic for most keywords is down by about 60%.
-
What are you trying to rank for?
Related Questions
-
Google's Omitted Results - Attempt to De-Index
We're trying to get webpages from our QA site out of Google's index. We've inserted the NOINDEX tags. Google now shows only 3 results (down from 196,000); however, they offer a link to "show omitted results" at the bottom of the page. (A) Did we do something wrong, or (B) were we successful with our NOINDEX, but Google will offer to show omitted results anyway? Please advise! Thanks!
Technical SEO | BVREID
-
Staging site and "live" site have both been indexed by Google
While creating a site we forgot to password protect the staging site while it was being built. Now that the site has been moved to the new domain, it has come to my attention that both the staging site (site.staging.com) and the "live" site (site.com) are both being indexed. What is the best way to solve this problem? I was thinking about adding a 301 redirect from the staging site to the live site via HTACCESS. Any recommendations?
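A sketch of that HTACCESS idea (assuming Apache with mod_rewrite; the hostnames are the placeholders from the question): every request to the staging host gets a 301 to the same path on the live domain.

```apache
# In the staging site's .htaccess: 301 every request
# to the matching path on the live domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.staging\.com$ [NC]
RewriteRule ^(.*)$ http://site.com/$1 [R=301,L]
```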
Technical SEO | melen
-
How to Identify Which Penalty : Penguin, Panda or Other?
I'm in the process of putting together a plan to recover from an algorithmic penalty. I'm not sure if I have to focus my recovery effort on Penguin, Panda, or another algorithm penalty. After looking at the attached screenshot (Google Analytics data vs. the Google algorithm update timeline), I'm not sure if the blog is affected by Penguin or Panda. I have the following questions: Is the traffic drop because of a Penguin, Panda, or other penalty? (There is no manual penalty message.) Where should I focus my time with recovery efforts (link removal, content, link building, etc.)? Any other comments or suggestions? Thanks for your help.
Technical SEO | rsmb
-
What's the latest on Title Tags?
What is the latest on what Google is looking for? Keyword one, Keyword two? Sentences with the Keyword in them?
Technical SEO | netviper
-
What's the best URL Structure if my company is in multiple locations or cities?
I have read numerous intelligent, well-informed responses to this question but have yet to hear a definitive answer from an authority. Here's the situation. Let's say I have a company whose URL is www.awesomecompany.com and which provides one service called 'Awesome Service'. This company has 20 franchises in the 20 largest US cities. They want a uniform online presence, meaning they want their design to remain consistent across all 20 domains. My question is this: what's the best domain or URL structure for these 20 sites?
1. Subdomain - dallas.awesomecompany.com
2. Unique URL - www.dallasawesomecompany.com
3. Directory - www.awesomecompany.com/dallas/
Here's my thoughts on this question, but I'm really hoping someone b*tch slaps me and tells me I'm wrong. Of these three potential solutions, this is how I would rank them and why:
Subdomains
Pros:
- Allows me to build an entire site, so if a local site grows to 50+ pages it's still easy to navigate
- Allows me to brand the root domain and leverage its trust (let's say the franchise is starbucks.com, for instance)
Cons:
- The subdomain is basically a brand new URL in Google's eyes, and any link building will not benefit the root domain
Directory
Pros:
- Fully leverages the root domain's branding and allows for further branding
- If the domain is an authority site, rankings for subpages will be achieved much quicker
Cons:
- This is a great solution if you just want a simple map listing and contact info page for each of your 20 locations, but what if each location wants its own "about us" page and its own "Awesome Service" page optimized for its respective city (i.e. Awesome Service in Dallas)? The navigation, and potentially the URLs, will start to get really confusing and cumbersome for the end user. Think about it - which is preferable: dallas.awesomecompany.com/awesome-service/ or www.awesomecompany.com/dallas/awesome-service (especially when www.awesomecompany.com/awesome-service/ already exists)?
Unique URL
Pros:
- Potentially quicker rankings than a subdomain if it's an exact match domain name (i.e. dallasawesomeservice.com)
Cons:
- Does not leverage the www.awesomecompany.com brand
- Could look like an imposter
- It is literally a brand new domain in Google's eyes, so all SEO efforts would start from scratch
Obviously, what goes without saying is that all of these domains would need to have unique content on them to avoid duplicate content penalties. I'm very curious to hear what you all have to say.
Technical SEO | BrianJGomez
-
Is it worth changing our blog post URLs?
We're considering changing the URLs for our blog posts and dropping the date information. Ex. http://spreecommerce.com/blog/2012/07/27/spree-1-1-3-released/ changes to http://spreecommerce.com/blog/spree-1-1-3-released/ Based on what I've learned here, the new URL is better for SEO, but since these pages already exist, do we risk a minor loss of Google juice with 301 redirects? We have a sitemap for the blog posts, so I imagine it wouldn't be too hard for Google to learn the new ones.
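A sketch of the redirect rule (assuming Apache with mod_rewrite) that would map the dated URLs onto the new undated ones:

```apache
# 301 /blog/YYYY/MM/DD/slug/ to /blog/slug/
RewriteEngine On
RewriteRule ^blog/\d{4}/\d{2}/\d{2}/(.+)$ /blog/$1 [R=301,L]
```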
Technical SEO | schof
-
What's the best way to deal with an entire existing site moving from http to https?
I have a client that just switched their entire site from the standard unsecure (http) to secure (https) because of over-zealous compliance issues for protecting personal information in the health care realm. They currently have the server set up to 302 redirect from the http version of a URL to the https version. My first inclination was to have them simply update that to a 301 and be done with it, but I'd prefer not to have to 301 every URL on the site. I know that putting a rel="canonical" tag on every page that refers to the http version of the URL is a best practice (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394), but should I leave the 302 redirects or update them to 301s? Something seems off to me about the search engines visiting an http page, getting 301 redirected to an https page, and then being told by the canonical tag that it's actually the URL they were just 301 redirected from.
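For reference, a minimal sketch of the blanket redirect (assuming Apache with mod_rewrite) that would replace the current 302 with a 301:

```apache
# 301 any http request to the same URL on https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```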
Technical SEO | JasonCooper
-
Handling '?' in URLs.
Adios! (or something). I've noticed in my SEOmoz campaign that I am getting duplicate content warnings for URLs with query strings. For example: /login.php?action=lostpassword /login.php?action=register etc. What is the best way to deal with these types of URLs to avoid duplicate content penalties in search engines? Thanks 🙂
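One common approach is a canonical link element (www.example.com here is a placeholder for your domain): each ?action= variant of login.php declares the bare URL as the canonical version, so the variants aren't treated as duplicates.

```html
<!-- In login.php's <head>: all parameter variants point at one canonical URL -->
<link rel="canonical" href="http://www.example.com/login.php" />
```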
Technical SEO | craigycraig