From your perspective, what's wrong with this site such that it has a Panda Penalty?
-
For more background, please see:
http://www.seomoz.org/q/advice-regarding-panda
http://www.seomoz.org/q/when-panda-s-attack
(hoping the third time's the charm here)
-
It's cool; your previous questions didn't really get answered... and my answer was posted twice, so the one above is the edited version. Whoops!
-
- Light content is an issue for many pages simply by the nature of the content, which is why we moved the entire Citations section to a subdomain. Combining those pages would be nearly impossible without diminishing their value to human visitors; lawyers rarely have time to wade through arbitrary lists. I really can't think of a way to combine those pages in a meaningful way.
We have combined other areas, such as the law quotations, and I will search for more candidates.
I will note that pages below a certain character threshold now carry a noindex tag.
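For anyone curious, the check works roughly like the sketch below; the 500-character cutoff, the function name, and the exact tag strings are simplified placeholders rather than our actual rule.

```typescript
// Minimal sketch of a thin-page check; the threshold and tag strings
// here are illustrative placeholders, not the site's real values.
const THIN_PAGE_THRESHOLD = 500;

function robotsMetaTag(bodyHtml: string): string {
  // Count only visible text: strip tags and collapse whitespace first.
  const visibleText = bodyHtml.replace(/<[^>]*>/g, " ").replace(/\s+/g, " ").trim();
  return visibleText.length < THIN_PAGE_THRESHOLD
    ? '<meta name="robots" content="noindex, follow">'
    : '<meta name="robots" content="index, follow">';
}

// A one-paragraph definition falls under the threshold and gets noindexed.
console.log(robotsMetaTag("<p>Estoppel: a bar that prevents a party from asserting a claim.</p>"));
```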
-
Above.
-
Actually, the pages have around 35 links each as Googlebot sees them. The menu and footer are loaded via AJAX after the visitor interacts with the site; the home page is an anomaly.
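For reference, the deferred loading works roughly like the sketch below; the fragment URLs, element IDs, and event list are simplified placeholders rather than our exact code.

```typescript
// Rough sketch of deferring the menu/footer markup until the first interaction.
// The fragment URLs, element IDs, and event list are placeholders for illustration.
let chromeLoaded = false;

async function injectFragment(url: string, containerId: string): Promise<void> {
  const response = await fetch(url);
  const html = await response.text();
  const container = document.getElementById(containerId);
  if (container) {
    container.innerHTML = html;
  }
}

function loadDeferredChrome(): void {
  if (chromeLoaded) return;
  chromeLoaded = true;
  // The raw HTML served to a crawler contains neither block, so the
  // per-page link count it sees stays low.
  void injectFragment("/fragments/menu.html", "site-menu");
  void injectFragment("/fragments/footer.html", "site-footer");
}

// Any of these first interactions triggers the load; each listener fires once.
["scroll", "mousemove", "touchstart", "keydown"].forEach((eventName) =>
  window.addEventListener(eventName, loadDeferredChrome, { once: true })
);
```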
-
Hehe, caught me.
It's just that duplicate content isn't that big a factor for Panda, as far as I can see. It appears to focus on the quality of the content (as judged by humans in a study).
It may well be hurting the site in general, however.
-
Speaking of duplicate content...
-
I imagine there are a few potential causes:
1. Light content. You can fix this by combining the pages for related terms and using anchor tags to point users down to the section they want (see the sketch after this list). On your front page, include more of each post; right now the intro blurbs look like they're only a few words long.
2. Duplicated widely. You mentioned this in another question, and I'm not sure what else to do here. You're already using rel=canonical, which would have been my advice.
3. Tons of links on every page. Your footer has a ton of links, and the menu is quite large to begin with. Consider removing most or all of those footer links.
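To make #1 more concrete, here's a rough sketch of how the thin term pages could be rolled up into one page per letter, with anchors for visitors and a single canonical URL. The data shape, the per-letter grouping, and the example.com URLs are assumptions about your setup, not a prescription.

```typescript
// One way to roll up thin term pages into a single indexable page per letter.
// The Definition shape and URL scheme are assumptions, not the site's structure.
interface Definition {
  slug: string; // e.g. "estoppel"
  term: string; // e.g. "Estoppel"
  body: string; // the definition text, already rendered as HTML
}

function renderCombinedPage(letter: string, definitions: Definition[]): string {
  const toc = definitions
    .map((d) => `<li><a href="#${d.slug}">${d.term}</a></li>`)
    .join("");
  const sections = definitions
    .map((d) => `<section id="${d.slug}"><h2>${d.term}</h2>${d.body}</section>`)
    .join("");
  return `<!doctype html>
<html>
  <head>
    <title>Legal terms beginning with ${letter.toUpperCase()}</title>
    <link rel="canonical" href="https://example.com/dictionary/${letter}/">
  </head>
  <body>
    <ul>${toc}</ul>
    ${sections}
  </body>
</html>`;
}

// Usage: one substantial page instead of dozens of one-paragraph pages.
const page = renderCombinedPage("e", [
  { slug: "easement", term: "Easement", body: "<p>A right to cross or use another's land.</p>" },
  { slug: "estoppel", term: "Estoppel", body: "<p>A bar preventing a party from asserting a claim.</p>" },
]);
console.log(page);
```

Grouping by letter is just one option; grouping by topic would work the same way, as long as each combined page ends up with enough substance to stand on its own.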
Best of luck!
-
The site is a legal dictionary and reference, so there are literally thousands of legal definitions, topics, and terms.
A targeted case could be made for "Legal Dictionary," but the site still gets OK results from that search. It was much better before Panda; most keywords are down by about 60% in terms of traffic.
-
What are you trying to rank for?