From your perspective, what's wrong with this site such that it has a Panda Penalty?
-
For more background, please see:
http://www.seomoz.org/q/advice-regarding-panda
http://www.seomoz.org/q/when-panda-s-attack
(hoping the third time's the charm here)
-
It's cool; your previous questions didn't really get answered. My answer was posted twice, so the one above is the edited copy. Whoops!
-
- Light content is an issue for many pages by the nature of the content, which is why we moved the entire Citations section to a subdomain. Combining those pages would be nearly impossible without diminishing their value to human visitors; lawyers rarely have time to wade through arbitrary lists, and I really can't think of a way to merge the pages meaningfully.
We have combined other areas, such as the law quotations, and I will search for more candidates.
I will note that pages below a certain character threshold now carry a noindex tag.
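The thin-page rule described above can be sketched roughly like this. The threshold value and helper name are assumptions for illustration, not the site's actual code:

```python
# Hypothetical sketch: pages whose visible body text falls below a
# character threshold get a robots noindex tag; everything else stays
# indexable. NOINDEX_THRESHOLD is an assumed value, not the site's real one.

NOINDEX_THRESHOLD = 500  # characters of visible body text (illustrative)

def robots_meta_for(page_text: str) -> str:
    """Return the robots meta tag a page of this length should carry."""
    if len(page_text.strip()) < NOINDEX_THRESHOLD:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("A one-line legal definition."))
```

The `follow` directive is kept so link equity still flows through the thin pages even while they are excluded from the index.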
-
Above.
-
Actually, GoogleBot sees around 35 links per page. The menu and the footer are loaded via AJAX after the visitor interacts with the site; the home page is an anomaly.
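One rough way to sanity-check that ~35-links figure is to count anchor tags in the raw HTML as served, since the AJAX-injected menu and footer never reach a crawler that doesn't execute the interaction. A minimal sketch using only the standard library (the sample markup is invented):

```python
# Count <a href> tags in raw HTML, i.e. the links a crawler sees before
# any AJAX injects the menu/footer. Sample markup below is illustrative.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Raw HTML as served: content links only, no AJAX-loaded menu/footer yet.
raw = '<body><a href="/term/habeas-corpus">Habeas corpus</a></body>'
print(count_links(raw))
```

Running this over the served source of a few term pages would confirm whether the crawler-visible link count really stays near 35.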
-
Hehe, caught me.
Still, duplicate content isn't that big a factor for Panda as far as I can see; the algorithm appears to focus on the quality of the content (as judged by humans in a study).
It may well be hurting the site in general, however.
-
Speaking of duplicate content...
-
I imagine there are a few potential causes:
1. Light content. You can fix this by combining related term pages and using anchor links to point users down to the section they want. On your front page, include more of each post; right now the intro blurb seems to be only a few words long.
2. Duplicated widely. You mentioned this in another question, and I'm not sure what else to suggest here. You're already using rel=canonical, which would have been my advice.
3. Tons of links on every page. Your footer has a ton of links, and the menu is quite large to begin with. Consider removing most or all of those footer links.
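The first fix above, merging thin term pages into one page with an in-page index of anchor links, can be sketched like this. The term data and helper names are invented for illustration:

```python
# Hypothetical sketch of fix #1: combine several thin definition pages
# into a single page, with a table of contents of fragment links so
# visitors can jump straight to the term they want. Terms are made up.

terms = {
    "estoppel": "A bar preventing a party from asserting a claim...",
    "tort": "A civil wrong giving rise to legal liability...",
}

def slugify(name: str) -> str:
    return name.lower().replace(" ", "-")

def combined_page(terms: dict) -> str:
    toc = "".join(
        f'<li><a href="#{slugify(t)}">{t}</a></li>' for t in terms
    )
    body = "".join(
        f'<section id="{slugify(t)}"><h2>{t}</h2><p>{d}</p></section>'
        for t, d in terms.items()
    )
    return f"<ul>{toc}</ul>{body}"

html = combined_page(terms)
```

Each merged page carries the combined text weight of all its terms, while the fragment links preserve the quick lookup that the separate pages offered.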
Best of luck!
-
The site is a legal dictionary and reference, so there are literally thousands of legal definitions, topics, and terms.
A targeted case could be made for "Legal Dictionary," but the site still gets OK results from that search. It was much better before Panda; traffic for most keywords is down by about 60%.
-
What are you trying to rank for?