Yet another Panda question
-
Hi Guys,
I'm just looking for confirmation on something...
In the wake of Panda 2.2 one of my pages has plummeted in the rankings whilst other similar pages have seen healthy improvements.
Am I correct in thinking that Panda affects individual pages and doesn't tar an entire site with the same brush?
Really I'm trying to see whether Panda is the reason for the drop on this one page, or whether it could be something else. The page in question has dropped 130 positions - not just a general fluctuation.
Thanks in advance for your responses!!!
-
Did you add any links to that particular page? I have seen a handful of links with similar anchor text published at the same time kill a ranking.
-
Elias, as a pro member, you do get one private Q&A question a month. You can submit your question with URL details to private Q&A and only SEOmoz staff members and associates can view the question, and it won't appear in any searches or indexes or be visible to anybody besides the staff and associates.
-
Hmm, without more info it's hard to say - it's like shooting from the hip, blindfolded. But maybe you could check the bounce rate, e.g. in Analytics, to see whether your visitors also find the content good (i.e. whether they feel like browsing on after they read the landing page).
Just because you find it high quality doesn't mean that your users or Google agree.
-
Yeah, all the content is good and original. There is a canonical in place to avoid duplication - very confused - maybe it will just sort itself out!
Thanks for your help
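For what it's worth, one quick sanity check on "there is a canonical in place" is to parse the canonical out of the HTML of each URL variant and confirm they all agree. Here's a minimal sketch using only Python's standard library; the page markup and URL are hypothetical, not from the site in question:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def find_canonical(html: str):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Hypothetical page source - e.g. fetched from two URL variants of the
# same page; both should report the same canonical URL.
page = '<html><head><link rel="canonical" href="https://example.com/brand.html"></head></html>'
print(find_canonical(page))  # https://example.com/brand.html
```

Running this over both URL variants that you suspect are getting indexed separately will tell you quickly whether one of them is missing the tag or pointing somewhere unexpected.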
-
Have you checked for duplicate content on-site and off-site, or whether the page is getting indexed under two different URLs?
The second page could also be poor quality content - meaning almost no text, for example. But it is kinda hard to point you in the right direction without more info.
-
I'd love to, but it is a bit sensitive. There is no difference between the coding of the two pages or their structure. The only difference is the on-page content and perhaps some internal links.
-
There could be quite a few reasons; there is no short answer for that.
Could you show us two examples - one page that has been penalized and one that hasn't?
-
Thanks, I'm kind of leaning towards the problem not being Panda-related.
Has anybody ever experienced such major drops for any other reason?
-
I do believe the common consensus is that Panda affects the entire domain: if it penalizes some pages, some of that penalty will rub off on the rest of the domain.
My advice: take the penalised pages down while you rework them. That should solve your problem.
Related Questions
-
What happens when a de-indexed subdomain is redirected to another de-indexed subdomain? What happens to the link juice?
Hi all, We are planning to de-index and redirect subdomain A to subdomain B. Consequently we now need to de-index subdomain B also. What happens now to the link juice or PageRank they gained from hundreds of thousands of backlinks? Will there be any ranking impact on the main domain? The backlinks of these subdomains are not very relevant to the main domain's content. Thanks
Algorithm Updates | vtmoz1
-
Google Panda July 2016
Hi, Does anyone know what impact the recent slow Panda rollout may have? Obviously content, but would it perhaps include engagement/user-behaviour factors regarding your on-page content too? Thanks
Algorithm Updates | BeckyKey0
-
Site has disappeared since Panda 4 despite quality content, help!
Our site www.physicalwellbeing.co.uk has lost over 20 first-page rankings since the end of May. I assume this is because of Panda 4.0. All content on the site is high quality and 100% unique, so we did not expect to get penalised. Although I read somewhere that if Google can't read particular JS anymore, they don't rank you as high. The site has not been blacklisted, as all pages are showing in Google's index and there are no messages in Webmaster Tools. We have not taken part in any link schemes, and have disavowed all low-quality links that were pointing there just in case (after the penalty). Can anybody see anything on www.physicalwellbeing.co.uk that may have caused the Panda update to affect it so negatively? Would really appreciate any help.
Algorithm Updates | search_shop0
-
Do Panda/Penguin algorithm updates hit websites or just webpages?
If I have a website that has been affected by the Panda/Penguin update, do bad links affect the entire site or just the page the bad link(s) point to? If it is the latter, and Penguin/Panda actually affect webpages, not websites (as is the common reference/conception), then wouldn't simply creating a new URL, targeting this new URL, shifting the meta tags and restarting link-building efforts (this time using the right quality strategies) be a really common-sense approach, instead of the tediousness of the disavow approach that so many go down?
Algorithm Updates | Gavo0
-
URL Parameters Question - Exclude? Or use a Canonical Tag?
I'm trying to figure something out, as I just finished my "new look" for an old website. It uses a custom-built shopping cart, and the system worked pretty well until about a year ago, when rankings went down. My primary traffic used to come from top-level brand pages. Each brand gets sorted by the shopping cart and a parameter extension is added, so customers can click Page 1, Page 2, Page 3, etc. So for example: http://www.xyz.com/brand.html , http://www.xyz.com/brand.html?page=1 , http://www.xyz.com/brand.html?page=2 and so on. The page= parameter is dynamic, therefore the page title, metas, etc. are the same; however, the products displayed are different. I don't want to exclude the page= parameter completely, as the products are different on each page and obviously I would want the products to be indexed. However, at the same time my concern is that having these parameters might be causing some confusion, and hence why I noticed a drop in Google rankings. I also want to note that in my market it's not necessary to break these pages up to target more specific keywords. Maybe using something like this would be the appropriate measure?
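One piece of this is making sure only one clean URL per results page can get indexed: keep page= (since each page shows different products) but strip any other query parameters. As a rough sketch - the URLs are hypothetical and KEEP is an assumed whitelist, not anything from the actual cart - a URL normalizer might look like:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Parameters worth keeping; page= changes which products are shown,
# so each paginated URL is genuinely distinct content.
KEEP = {"page"}

def normalize(url: str) -> str:
    """Drop every query parameter except the whitelisted ones."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), ""))

print(normalize("http://www.xyz.com/brand.html?page=2&sessionid=abc"))
# http://www.xyz.com/brand.html?page=2
```

The same normalized form is what you would then point each page's canonical tag at, so that session or tracking variants consolidate onto one URL per page of results.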
Algorithm Updates | Southbay_Carnivorous_Plants0
-
Did anyone else notice all their keyword rankings go down after the last Panda refresh on January 17th 2013?
Even before January 17th I noticed my keyword rankings slowly going from the top 3 to around 8, 9 and 10. Then between January 15th and January 30th (SEOmoz is not showing the exact date), they all went down to the second page and worse. The rankings dropped for an e-commerce website, petsspark.com. They sell a tear stain removal product, which is a pretty competitive market. After January I started to notice that Google was starting to rank blogs, forums, overall product review websites and of course Amazon better than me and my competitors. Was anyone else affected by the Panda refresh, or does anyone have any idea what may have gone wrong? Please help
Algorithm Updates | DTOSI1
-
Classifieds and Google Panda
It seems Google's Panda update is targeting low-quality sites with little unique content (I know there's more to it than that). It makes sense that they may want to do this, but what about classified sites? They may use some scraped content as well as unique ads, and the ads may lack content, as they rely on the users writing the ads. However, they are helpful to the people that use classifieds. Because of these factors, these sites are suffering with the release of the latest Panda update. Any advice for classified sites and how they can combat the ranking drops?
Algorithm Updates | Sayers0
-
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems stacked against this site's purpose. I need some advice on what I'm planning and what could be done.

First, the issues:

Content Length
The site is legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't much more to say, nor is there much value to the target audience in saying it.

Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.

My strategy so far…

Noindex some Pages
Identify terms and citations that are really small - less than 500 characters - and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.

Create more click incentives
We already started with related terms, and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.

Expand Content (of course)
The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases - perhaps 200 pages. We still won't be able to cover them all without a heavy cut-and-paste feel.

Site Redesign
We are looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.

What do you think? Are the super-light citation and dictionary pages why site traffic is down 35% this week?
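The "identify terms and citations under 500 characters and noindex them" step could be scripted as a first pass something like this. The page paths and texts below are made up for illustration; in practice you'd feed in the extracted body text of each dictionary/citation page:

```python
# Threshold from the plan above: pages with less than 500 characters
# of body text are candidates for a noindex tag.
THRESHOLD = 500

# Hypothetical mapping of URL path -> extracted body text.
pages = {
    "/dictionary/cbns": "Common Bench Reports, New Series.",
    "/dictionary/estoppel": "A long, substantive definition... " * 30,
}

# Pages whose body text falls under the threshold.
to_noindex = [url for url, text in pages.items() if len(text) < THRESHOLD]

for url in to_noindex:
    # Each of these would get <meta name="robots" content="noindex">
    # and be dropped from the directory links and sitemaps.
    print(url)
```

A script like this also gives you a count up front, so you know roughly how many pages the noindex pass will touch before committing to it.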
Algorithm Updates | sprynewmedia0