Site has disappeared since Panda 4 despite quality content, help!
-
Our site www.physicalwellbeing.co.uk has lost over 20 first page rankings since the end of May. I assume this is because of Panda 4.0.
All content on the site is high quality and 100% unique, so we did not expect to be penalised, although I have read that if Google can no longer read certain JavaScript it may not rank you as highly.
The site has not been blacklisted as all pages are showing in Google's index and there are no messages on webmaster tools. We have not taken part in any link schemes and have disavowed all low quality links that were pointing there just in case (after the penalty).
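For context, the disavow file mentioned here is a plain UTF-8 text file uploaded through Google's disavow tool, with one full URL or one `domain:` entry per line and `#` for comments (the domains and URL below are placeholders, not real examples from the site):

```text
# Links from low-quality directories; owners contacted, no response
domain:spammy-directory.example
# Individual pages can also be listed by full URL
http://low-quality-blog.example/post-linking-to-us.html
```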
Can anybody see anything on www.physicalwellbeing.co.uk that may have caused the Panda update to affect it so negatively?
Would really appreciate any help.
-
With a bounce rate that low, do you by chance have multiple GA tracking scripts, or something that's triggering an event even if no one goes to another page? Look at the source code when you're in incognito mode, in case your CMS suppresses one of the GA codes when you're logged in as admin.
[voice of experience and learning the hard way here!]
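To check for the duplicate-tracker issue described above, you can scan the fetched page source for repeated tracker initialisations. A minimal sketch in Python (the sample HTML and the UA-style property-ID pattern reflect the classic analytics.js setup; the IDs are made up, so adjust the pattern to match your actual snippet):

```python
import re

def find_tracking_ids(html):
    """Return the distinct UA-style Google Analytics property IDs in a page's source."""
    return sorted(set(re.findall(r"UA-\d{4,10}-\d{1,4}", html)))

# Hypothetical page source with the same tracker pasted in twice, plus a second property.
sample = """
<script>ga('create', 'UA-12345678-1', 'auto'); ga('send', 'pageview');</script>
<script>ga('create', 'UA-12345678-1', 'auto'); ga('send', 'pageview');</script>
<script>ga('create', 'UA-99999999-2', 'auto'); ga('send', 'pageview');</script>
"""
print(find_tracking_ids(sample))    # distinct property IDs on the page
print(sample.count("ga('create'"))  # tracker initialisations; more than one can inflate pageviews
```

In practice you would fetch the live page while logged out (as suggested above) before running the check, since a CMS may suppress one snippet for admins.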
-
I would say you are bordering on over-optimization. Your alt tags are a bit spammy, you are using the keywords meta tag (a spam signal), and you're using both tags and categories within WordPress, which can cause duplication.
I agree with the other posters (and your post on Google): this doesn't look Panda-related, just your site being re-evaluated by Google.
-
EGOL, again, thank you for your help; it is highly appreciated. The bounce rate is really low at 4%, so I am not sure it is that, but I take your point. The target audience is really 25-45, as my client offers Urban Conditioning, which would potentially be too much for someone over 45.
Maybe you are right about the pixel depth; I think that can be solved by toning down some of the heavy media.
My only issue with both responses is that the site was ranking highly before Panda. You are pointing out general optimisations rather than anything Panda-specific, so what I really need to know is what Panda 4.0 has picked up on.
-
Thank you for your comments, Lee. I agree that it is a bit media heavy, as this was the request when the site was built; we could do with altering it so that there is just the video or just the slider. The disavow was done well after the rankings drop, so I doubt it is the cause.
-
I took a quick look at the site and agree with Lee. The content is good, could be a little thicker but that is probably not the problem.
Just tossing something out... a lot of space is given to huge images, huge whitespace, huge video, huge navigation.... So much that the first word of content is 800 pixels down on the content pages and over 1000 pixels down on the homepage.
So, I am wondering about two things.... 1) Are people bouncing instead of scrolling down to look at the content? 2) Are search engines seeing no content in the first thousand pixels and giving you a demotion?
Finally... and I am just saying this, knowing nothing of the business in specific, but being a person who has spent a long life in very intense athletics. Between ages 15 and 45 I would have been one of your best clients. Now, decades later, I am still someone's client, but not a client that matches my first impression of your website. So, if your biz matches the images on the website then you have no need to change. But, if your potential clients are below that intensity then they could be bouncing off of the website, in search of something less rigorous. The images read as far more intense than my impression of "physical wellbeing".
Maybe you have heard this famous quote that I read in a climbing magazine decades ago.... "The demands of the sport attract a certain type of person.... but at the same time severely limit its appeal."
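EGOL's point about the first word of content sitting 800-1000 pixels down can be sanity-checked at the markup level. The helper below is my own hypothetical sketch, not an exact measurement: actual pixel depth depends on CSS and rendering, so this only shows how much slider/navigation/media markup precedes the first paragraph in the source:

```python
def first_text_offset(html):
    """Character offset of the first <p> tag in the page source -- a crude,
    markup-level proxy for how much header, navigation, and media markup
    precedes the main content. Rendered pixel depth will differ."""
    i = html.find("<p")
    return i if i >= 0 else len(html)

# Hypothetical page: 30 slider divs before the first paragraph of real content.
page = "<html><body>" + "<div class='slider'></div>" * 30 + "<p>First words of content.</p></body></html>"
print(first_text_offset(page))  # how deep into the markup the content starts
```

If that offset dwarfs the length of the actual content, it is at least consistent with the "content buried below the fold" concern raised above.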
-
Whilst it is possible that Panda had something to do with your ranking drop, I don't think so, looking at your site. You have well-structured text, although it could certainly be thicker on your main pages: your homepage currently has 264 words including headers. Understandably, it is challenging to create 100% unique, genuinely high-quality content at a length that is well optimized for Panda.
I would say that where you have disavowed links, it is possible you have disavowed some that were in fact helping your ranking rather than hurting it. This has happened to a number of my clients, and the solution is simply to work hard at building natural links.
Alternatively, your homepage is very media heavy (despite being well optimized for page speed). This wouldn't explain such a dramatic drop, but bringing that load time down would certainly improve both your rankings and your bounce rate. Sliders and videos on the same page, even with deferred JS, still add a large data load. Yoast explains this better than I could.
Hope this helps!
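The word count mentioned above (264 words including headers) is easy to reproduce. A rough sketch, assuming you have the page's HTML source; the regex approach is a simplification that ignores entities and comments, and the sample markup is made up for illustration:

```python
import re

def visible_word_count(html):
    """Rough visible-text word count: drop script/style blocks,
    strip remaining tags, then count whitespace-separated tokens."""
    html = re.sub(r"(?is)<(script|style)\b[^>]*>.*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", html)
    return len(text.split())

# Hypothetical homepage fragment: only the heading and paragraph text should count.
sample = "<html><body><h1>Urban Conditioning</h1><p>Train hard and recover well.</p><script>ga('send');</script></body></html>"
print(visible_word_count(sample))  # -> 7
```

Running something like this across the main pages makes it easy to spot which ones are thinnest relative to their media weight.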