I think this website has been hit by Panda, but I would appreciate your opinion
-
I've been asked to check a possible SEO problem with a website that has been losing organic traffic for more than 2 years. I have attached a screen capture from Analytics showing the impact on organic traffic.
This website publishes over 15 articles per week, and 12 of them are news items with fewer than 150 words. I think Panda may be hitting the website because of this practice. You can check the website: crazyminds.es
I would like to know your opinion on the cause of this loss of organic traffic.
On January 21st, 2013 they changed the website design, but the loss of traffic seems to have started before that date.
If Panda is hitting the website, what would be the best way to correct this situation? They have now begun to write news items of more than 200 words, but what happens with the old news? Maybe a noindex tag? Blocking them with robots.txt? How should they manage those?
Thank you!
-
Here's more info on blocking JavaScript and the effects it can have on rankings:
https://yoast.com/google-panda-robots-css-js/
It's certainly possible that it's affecting the site, but there could be other issues as well.
Regarding the news articles, the number of words is not what matters most but whether these are useful to people. If most of these have very low user engagement, then they could possibly cause Panda to affect the whole site.
-
Hi Danny,
Google Search Console shows this:
- 14,527 indexed pages
- 19 blocked resources (JS files; wp-includes is being blocked by robots.txt)
Errors, desktop:
- 3 server errors
- 1 soft 404 error
- 51 404 errors
Errors, news:
- 4 errors: article fragmented and article too short
sitemap.xml (146 warnings):
- 14,439 URLs submitted
- 13,886 URLs indexed
No security problems.
No other issues. I would like to know why you say it comes from a technical problem... there must be something that is making you think that way. I would appreciate your opinion on this issue. Thank you so much!
-
Hi Marie,
Thanks for your answer.
The news they are publishing is created by them. It's not just a copy and paste from other sites, it's created 100% by them, but those news items are just 3 or 4 paragraphs long. I have checked that there are 19 resources blocked across 3,185 pages (JS files, as you said). I can tell them to let Googlebot explore the wp-includes directory to avoid this issue. Do you really think this can cause such a big effect?
And finally, the robots.txt file also has a Crawl-delay: 30 directive. Search Console shows that as a warning. What do you think, should we remove it? It shouldn't be an issue... but you never know...
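For reference, the directive in question would look something like this in robots.txt (a hypothetical sketch; the site's actual file may differ). Googlebot does not support Crawl-delay, which is why Search Console flags it as a warning rather than an error:

```
User-agent: *
# Ignored by Googlebot; other crawlers (e.g. Bingbot) may honor it
Crawl-delay: 30
```

Since Googlebot ignores it anyway, removing the line mainly cleans up the warning; it shouldn't change how Google crawls the site.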
Thank you so much!
-
I just noticed one other thing that could potentially be an issue. You've got wp-includes blocked by robots.txt. If this is where your JavaScript lives, then it's a good idea to actually allow Google to crawl this directory.
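As a sketch (assuming a typical WordPress robots.txt; the exact rules on this site may differ), one way to let Googlebot fetch the JavaScript and CSS while keeping the rest of the directory blocked is to add Allow rules, since Google resolves conflicts in favor of the most specific (longest) matching rule:

```
User-agent: Googlebot
# Allow the assets Google needs to render pages...
Allow: /wp-includes/*.js
Allow: /wp-includes/*.css
# ...while still blocking the rest of the directory
Disallow: /wp-includes/
```

Simpler still is deleting the `Disallow: /wp-includes/` line altogether, since Google recommends not blocking CSS and JS at all.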
-
I took a quick look at this domain.
While there do seem to be a few duplicate content issues, my hunch is that it is more likely due to some technical issues.
I would first go take a look in Google Search Console and check whether Google is reporting any crawl, sitemap or robots.txt issues.
If that doesn't show any issues, then you may need to get a full technical audit from a reputable SEO.
-
This is a question that would take a few hours to answer properly. Here are my brief thoughts.
What you've described with the short news articles sounds like Panda fodder for sure to me. If they're republishing news stories without adding value of their own, then this is not a good idea. Too many of these can cause Panda to demote the entire site (not just the news articles). In most cases I'd noindex content like this. Sometimes you can consolidate content into one thorough piece, but usually either noindexing or removing the content is best.
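As a sketch, the noindex could be applied to each old thin article with a robots meta tag in the page's `<head>` (in WordPress this is typically handled per-post by an SEO plugin rather than by hand):

```
<!-- Drops the page from Google's index while still letting
     crawlers follow its links to the rest of the site -->
<meta name="robots" content="noindex, follow">
```

Note that for noindex to be seen, the pages must remain crawlable; blocking them in robots.txt instead would prevent Google from ever reading the tag.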
Also, it's not the number of words that matters. If they decided to write 1000 word articles that were essentially repeating what could be found on other news sites this would also be thin content. If they produce content that they want to have in the Google index, then Google has to have a reason to send readers to their content rather than to the hundreds of other people who wrote about the same news story.