Panda, rankings and other nonsense issues
-
Hello everyone,
I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the very first Panda), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, it's not clear yet). In other words, it looks like my website is very prone to "quality" updates from big G:
http://www.virtualsheetmusic.com/
I'm still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.). Believe me, I have tried everything. You name it. We have recovered several times, but every once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey" area of Panda, where we get "randomly" hit by it every now and then.
Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't really make sense of it.
Take, for example, this competitor of ours (8notes.com):
They have a much smaller catalog than ours, lower-quality music, thousands of duplicate pages, ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them.
Take, for example, "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc.: they are always first.
As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, with sloppily produced content and design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties... they really don't care about any of that, and yet they rank 1st.
So... to all the experts out there, my question is: why is that? What's the sense or the logic behind it? And please, don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin content issues I see on that site, nothing can justify their positions in my opinion. Above all, I can't find a reason why we are penalized so heavily by Panda and similar "quality" updates when they are released, whereas websites like that one (8notes.com) rank 1st, making fun of the mighty Panda all year round.
Thoughts???!!!
-
Thank you very much Julie, I really appreciated your words. I have wondered so many times what Google thinks of "quality", and why, ahead of us, there are always very low-quality websites distributing the exact same music for free (often copyrighted music, which is illegal), most of them full of ads. Is that quality?
We could open a new discussion thread on the "What is quality to Google?" topic, I think it'd be very popular!
Thank you again.
-
Thank you Donna, glad to know that I am not completely mad!!
As for the fact that they have done a great job with title and alt tags and anchor text, I agree, but you know what? That's another area I have become paranoid about: the so-called "over-optimization"... We used to have perfectly optimized titles, descriptions, H1s, ALTs, anchor text, etc., the whole enchilada perfectly optimized; then we began to lose rankings for an unknown reason (Panda? Over-optimization? Too many pages? What else?), and I became paranoid about everything, so we started "de-optimizing" here and there. Here is additional proof that when things are NOT clear, we all become paranoid and lose control of everything.
I might also add that some of the blame should probably go to the SEO industry, which has spread a lot of fear about all this without ever quantifying what "too much optimization" means, or too much duplicate content, too much thin content, too many bad links, etc. How much is "too much"? That's a question for which I'm afraid there's no easy answer, but maybe they scared us too much about all this.
Thank you again, and please, let me know if you have any more ideas.
-
I agree with your observations. I don't see why you'd suffer a Panda penalty and your competition wouldn't.
My only other observation (again, not Panda related) is that 8notes has made much more extensive use of link title and alt tags to reinforce their keywords and subject matter.
-
Oh, I also forgot to mention that when we have more than one version of the same title for the same instrument, we add a canonical pointing to the first version.
For example, both items below:
http://www.virtualsheetmusic.com/score/HL-170563.html
http://www.virtualsheetmusic.com/score/HL-119157.html
are canonicalized to this item:
http://www.virtualsheetmusic.com/score/HL-180308.html
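In case it's useful, here is a rough Python sketch of how that kind of canonical mapping can be spot-checked in bulk (just an illustration, assuming the requests and beautifulsoup4 packages are available; it simply prints the URL each page declares as its canonical):

```python
# Rough sketch: print which URL each page declares as its canonical.
# Assumes the pages expose a standard <link rel="canonical"> tag and that
# the requests and beautifulsoup4 packages are installed.
import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.virtualsheetmusic.com/score/HL-170563.html",
    "http://www.virtualsheetmusic.com/score/HL-119157.html",
    "http://www.virtualsheetmusic.com/score/HL-180308.html",
]

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = link.get("href")
            break
    print(f"{url} -> {canonical or '(no canonical tag found)'}")
```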
Thanks
-
Thank you Donna!
Yes, I am aware we have much more segmentation, but why in the heck should that be bad? And, more importantly, why should that be worse than the much larger amount of actual duplicate content they have, considering they never use canonicals or noindex to avoid duplicate issues?
Take, for example, one of their instrument pages, like this one:
https://www.8notes.com/violin/sheet_music/?orderby=5d
You can order the results by title, artist, level, etc. They don't even bother to prevent duplicate issues by canonicalizing the URL when parameters are added to it. What do you think is worse for Panda? I am just asking; I'd really like to understand what could be worse.
They have similar issues with their product pages: if you click the tabs at the top, parameters get added to the URLs, but no canonical is present there. We, instead, do have canonicals, just as Google's guidelines recommend.
As for duplicate titles (because of different instrumental versions), yes, that's something we have tackled several times in the past. We tried removing the duplicates (noindexing them), but it didn't help. I mean, we didn't notice any change even after waiting several months. We just got much lower traffic because of it, and nothing positive.
Also, have a look at how many different versions our competitor has for some of the most popular titles:
Bach's Air on G: https://www.8notes.com/scores/air_on_the_g_string_bach_johann_sebastian.asp
Bach's Minuet: https://www.8notes.com/scores/minuet_bach_johann_sebastian.asp
Beethoven's Fur Elise: https://www.8notes.com/scores/fur_elise_beethoven_ludwig_van.asp
Beethoven's Ode to Joy: https://www.8notes.com/scores/ode_to_joy_(9th_symphony)_beethoven_ludwig_van.asp
Now, I understand we may have many more titles than they do, and our catalog is much bigger than theirs, but they still have similar issues, right? If so, why do they have such a privileged spot in the SERPs compared to us? I am sorry to sound like a broken record, but I am still not convinced that all of this is the problem...
If you have the chance, I am eager to hear your further thoughts... thank you again very much for your help and time, Donna. Much appreciated.
-
Yes, I see what you've done with genres, specials, etc. That looks good.
If I compare you to 8notes, you've got a lot more segmentation when it comes to instruments and those pages are NOT noindexed.
For example, you have 29 different versions of the "Dust in the Wind" sheet music page, all very similar. Here are a few:
- http://www.virtualsheetmusic.com/score/HL-328374.html (Dust in the Wind sheet music for violin)
- http://www.virtualsheetmusic.com/score/HL-328387.html (Dust in the Wind sheet music for trumpet solo)
- http://www.virtualsheetmusic.com/score/HL-301822.html (Dust in the Wind sheet music for choir and piano)
- http://www.virtualsheetmusic.com/score/HL-170563.html (Dust in the Wind sheet music for guitar (chords))
- http://www.virtualsheetmusic.com/score/HL-26501.html (Dust in the Wind sheet music for piano solo)
- http://www.virtualsheetmusic.com/score/HL-119157.html (Dust in the Wind sheet music for piano solo V2)
NOT noindexed doesn't mean they're getting indexed by Google. When I did a site: command for http://www.virtualsheetmusic.com/score/HL-328387.html (Dust in the Wind sheet music for trumpet solo), it wasn't returned as a search result. When I searched for "dust in the wind" trumpet solo virtualsheetmusic, it wasn't returned either.
So maybe you need to consider noindexing all the instrument variations as well, but still offer them up on the site for visitors. I'd check analytics to see if anyone's landing on those pages from search. As I said earlier, I can understand why you'd want them indexed but if they're causing you more harm than good, you might have to balance that out.
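If it helps, here's a rough sketch of the kind of check I have in mind, assuming you can export your organic landing pages and sessions to a CSV from analytics (the file name and column names below are just placeholders, so adjust them to whatever your export actually uses):

```python
# Rough sketch: total up organic sessions landing on /score/ pages from an
# analytics export. "organic_landing_pages.csv" and the column names
# ("Landing Page", "Sessions") are placeholders for whatever the export uses.
import csv

score_sessions = 0
total_sessions = 0

with open("organic_landing_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        sessions = int(row["Sessions"].replace(",", ""))
        total_sessions += sessions
        if "/score/" in row["Landing Page"]:
            score_sessions += sessions

share = (score_sessions / total_sessions * 100) if total_sessions else 0.0
print(f"/score/ pages: {score_sessions} of {total_sessions} organic sessions ({share:.1f}%)")
```

If that share turns out to be tiny, noindexing the near-duplicate instrument variations probably costs you very little.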
I love a challenge but that's the best I can come up with Fabrizo.
-
Thank you Donna for your reply.
Well, I see what you mean, but if you look at how those drill-downs are handled on our site, all pages generated dynamically that way are excluded via robots.txt. That's why I am puzzled to see the competitor's website offering a similar kind of browsing without worrying about duplicate issues. Also, I apply every possible rule to reduce duplicate content as much as I can, even for the few indexed pages, such as canonicals (when parameters are present in the URLs) and rel="prev"/rel="next" for paginated content.
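Just to show how I verify that, here is a quick sketch using Python's built-in robotparser to confirm which URLs Googlebot is allowed to crawl according to our robots.txt (the second test URL is a made-up example of a dynamically generated search/filter URL, not a real one):

```python
# Quick sketch: check which URLs Googlebot may crawl according to robots.txt.
# The second test path is a hypothetical example of a dynamic search/filter
# URL, used only for illustration.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.virtualsheetmusic.com/robots.txt")
rp.read()

test_urls = [
    "http://www.virtualsheetmusic.com/score/HL-180308.html",      # normal product page
    "http://www.virtualsheetmusic.com/search/?q=violin&level=2",  # hypothetical dynamic URL
]

for url in test_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```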
Please let me know if that's what you meant, or if I am missing anything here.
Thank you again very much!
-
"I'd really like to know from you if you see anything on my website that could trigger a "Panda" kind of penalization, compared to my mentioned competitor above (8nots.com)."
Key phrase being "compared to my mentioned competitor". Because yes, I can see things that might trigger a Panda penalization. You have a lot of overlapping/duplicate content, but so do your competitors.
The only thing that comes to mind is whether there's a threshold you're exceeding that your competitors aren't, by virtue of the fact that you have many (and more) ways to tag/filter your content, for example genres, specials, and ensembles.
-
Yes, I agree with you, I don't see much logic behind that. Of course, if they also count images, we have hundreds of thousands... we are a pretty big website, what do you expect, right?
My website's situation makes people "scratch their heads" all the time...
I'd really like to know from you if you see anything on my website that could trigger a "Panda" kind of penalization, compared to the competitor I mentioned above (8notes.com). So far, no one in this thread has given me any hints on that.
Thank you again for your insights, I appreciate them very much.
-
I wonder if "indexed URLs" is an accurate label. I looked at the URLs Majestic found, and a significant number are redirected files, images, and mp3s.
You have a perplexing problem Fabrizo...
-
Donna, as a side note, I have no idea where Majestic pulled over 944,000 indexed pages for our website from. By spidering it with Screaming Frog we couldn't crawl more than 387,726 pages... unless they crawled all the links dynamically generated by our internal search engine, which, by the way, is blocked by the robots.txt file, so all those dynamic pages shouldn't be counted.
Also, on the actual Google index, if you use the site: command, you'll see that Google has indexed just 123,000 pages from our site (because most of them are canonicalized), whereas you'll see over 545,000 for 8notes.com.
The actual data seems to be a little different...
-
Thank you Julie for your post and for participating in this discussion.
Well, what you say might be true, but with an algorithmic penalization that shouldn't really happen... unless the system is flawed in some way and catches the wrong guys (every time?).
Also, thin and duplicate content is so much more obvious and noticeable on our competitors' sites that it makes me completely mad trying to find a logical explanation for why it's us and not them!
Unless Panda and other similar "quality" updates are now looking for something else that nobody has clearly understood yet...
-
I think it's like the Highway Patrol.
Everyone speeds, but not everyone gets caught. You point to a similar web page that hasn't been penalized, but there are many that have.
-
Thank you Kristen,
I have just put together a plan to re-architect our website that way and create "sub-categories" in the same way our competitor has done, to push up the main category pages as well (following the "siloing" technique).
As I wrote yesterday below, this is also something clearly NOT related to Panda... just another needed tweak to the site.
Thank you again, appreciated your help!
-
Thank you Donna! Yes, I am aware of our different backlink profile. We are a commercial website, so we have many backlinks from hundreds of affiliates... and that could cause issues, I am aware of that. I have worked a lot with my affiliates to add nofollow to links where necessary and to avoid passing PageRank as much as possible... But again, we are talking about issues that are NOT Panda related... right?
So... again, this can't explain why the first Panda in 2011, as well as the last quality update released in June (was that really Panda?), hit us hard. I am becoming convinced that it is not Panda the beast hitting us once in a while, but something else... My point is: I could be under "several" penalties, OK, I get that... I could be under some Panda penalization, or another quality penalization, and maybe Penguin to some extent (though I have never found a clear relation between my traffic losses and the release of Penguin updates)... but if I am really in the eye of Panda when that happens, then back to my original question: why has my competitor never been touched by the black-and-white bear when its content should be much more prone to Panda than mine? That's the whole point of my conversation here and the answer I am trying to find. I am trying to find a logical explanation for why my traffic dropped with the release of Panda updates, whereas my competitor wasn't touched at all.
Thank you again for your help, appreciated!
-
You're right. Your site and 8notes seem to be guilty of the same practices, assuming there's anything wrong with them, and yet it is ranking well. Although according to MajesticSEO, it has half the pages you have indexed (520,439 vs 944,432).
Your link profiles are significantly different. Again according to Majestic, you have way more backlinks (649,076 vs 234,122) but from half as many referring domains, IPs and subnets. You have 1/10th of the educational backlinks 8notes has. And the majority of your backlinks, roughly 55%, are nofollow, whereas 90% of 8notes' are the opposite (follow). 8notes seems to have more deep links as well.
Maybe it's worth looking a little more closely at your link profile?
8notes is also https. That might also have a bearing given you're both ecommerce.
-
In any case, I created this thread to discuss Panda and its "possible" and "not-possible" implications... so the discussion is still open.
In the meantime, while I hope for an answer to my questions above from Kristen (thank you again, Kristen!), I'd like to get back to the topic: Panda. It is clear to me that my situation could be improved as Kristen has suggested above, but it is also clear that, if so, it has nothing to do with Panda, does it? That's just about "content consolidation" and "topical relevance".
Then, back to my original discussion topic: what about the classic Panda issues such as "thin content" or "duplicate content"? As I wrote above, my competitor has plenty of that kind of content, but doesn't seem to have been touched by Panda whatsoever. So... what's the deal with Panda then? And in my particular niche, should I worry more about "topical relevance" and keep optimizing my site in other respects (usability, user intent, etc.), and stop worrying so much about thin and duplicate content?
If you were me, what would you do considering my competitor's evidence? How many other site owners like myself have become "paranoid" about Panda (wasting tons of time, resources and money) and have instead lost focus on other (probably more important!) issues such as topical relevance, content organization, usability, user intent, etc.?
More thoughts on that?
-
This is a very good answer, thank you Kristen. The more I look at the "site structure" of my competitor compared to ours, the more I realize we need to work on that.
I have also started to think about the so-called "siloing" technique Bruce Clay introduced a few years ago, and it looks like 8notes.com has done a very good job of following that kind of concept, whereas we are probably "spreading" too much of our juice across thousands of different pages and categories... what are your thoughts on that?
Just a thought about the fact that I have blocked those pages in the robots.txt file: if you look at them, you'll see they are generated dynamically by our internal search engine. And as you can see, you can filter results by clicking the filters on the left side of the page... which is great for users, but can be problematic for search engines, right? So that's why I decided to simply block those pages from robots, to avoid any possible indexing and crawling issues. So... how would you suggest tackling that problem? My first idea would be to create "static pages" for those dynamic pages, linked from the category pages, and then block the filter links on the left side from robots... do you have any different ideas?
Thank you again for your help! Super-appreciated!
-
Thank you Kristen for your kind reply.
Yes, of course, I have considered that a thousand times. Do you really think that could cause so much trouble as to make most of our rankings slip beyond the 10th or 20th page of the SERPs? The pages you mentioned on our website are actually blocked by robots.txt, with the exception of the first page, of course. My concern is about those first pages, which should be able to rank anyway... unless you're telling me that the contextual weight of the "subsequent" pages could play a role in "boosting" the first page in some way if Google can spider and index them... but then I'd be concerned about "too much similar or thin content", because by doing what our competitors are doing, I'd create thousands of additional pages containing pretty much the same content (lists) organized in different ways... you see what I mean? Of course it seems to work for our competitor, but hence the contradiction and absurdity I was talking about above with the Panda algorithm: shouldn't all those thousands of extra, similar pages be bamboo for Panda?
I hope I haven't confused you... I am just trying to find the elephant in the room that is causing the problem...
Thank you for your help!
-
Thank you Donna for your reply, but that's exactly why I posted my concerns here: how do I know whether a page or a set of pages is "causing" me not to rank? That's the purpose of this discussion... how do I actually know that an action of that kind, "nuking" pages, will bring me more benefit than damage?
A couple of years ago, I thought we were under a Panda penalty, and I started to remove from the index many of our product pages that I thought were not bringing us traffic or had low user engagement... well, the only result was a steady decline in traffic because of the removed pages, that's all. I put all the pages back after 6 months because otherwise we would have died miserably. After those 6 months I concluded that we were NOT under a Panda penalty. In fact, once I put all the pages back, traffic returned and we started to rank better than before (a fortunate event??).
Also, if we are under a Panda or similar penalty, why do we still rank well for some keywords? And, back again to square one: why should we think we are under a Panda penalization if our content is actually less thin, less duplicated, and better handled with canonicals, noindex tags, etc. to cope with any possible Panda penalty than our well-ranked competitors??!
-
The only thing I can think to suggest is to look at how much inbound search traffic you are receiving on your category pages (e.g. genres, instruments, skill levels, exclusive, specials, etc.) to assess whether any of those could be noindexed. I understand why you'd want them indexed, but if they're causing you to not rank at all, then you might have to balance that out.
-
Thank you Julie, appreciated!! I really don't understand why, most of the time, our website content is buried in the search results, whereas "crappy" websites are shown prominently ahead of us... do you call that "quality", big G??!!
Thank you again for your kind words
-
I feel bad for you. I don't have any answers, but I'm a singer, and your website is excellent. This is not an example of Google rewarding quality.