Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and Payday Loans 2.0/3.0?
-
Hello, I would really appreciate your feedback here.
So my site is http://www.1stwebdesigner.com
I just used http://barracuda.digital/panguin-tool/ and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and attached is a Google Analytics screenshot for that time, especially May 2014.
Can you please help me confirm that we have indeed been hit with a penalty? It has been 1.5 years already, and the traffic has never recovered since.
Before May 2014 our site received 1.5 million pageviews/month; since then, traffic has never exceeded 600,000 pageviews/month. Looking back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update on Thu, Jan 19, 2012. Before all these updates, the site had grown to 2.5 million pageviews/month.
The painful part is that I always focused on time-consuming link building: submitting links to design news sites, leaving quality comments under related articles, and writing unique, quality content on the site for years.
Can you tell, based on the screenshots, whether we indeed have a penalty, and maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. I would love to hear feedback on this - it has been painful over the years.
Update: here is one more screenshot from Barracuda showing traffic from 2012 onward - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw
-
You're very welcome.
OK, so if we can see that the drop happened May 20, 2014, then there is a high possibility that Panda has affected your site. We can't always say that with certainty, though. I remember the week following that date, I had three separate sites come to me saying that they had been horribly hit by Panda. One of the sites had done a redesign and launched May 19. The redesign changed all of the title tags and the URLs, so it was not Panda's fault that they dropped in rankings. Another site had also redesigned, and the developer forgot to put the Google Analytics tracking code on the site - so rankings didn't change, but it looked like traffic had plummeted. And the final site had accidentally noindexed 90% of their pages. I'm not kidding!
Still, a big drop on May 20 (give or take a day) means that there are on-site quality issues to address. While it's never wrong to audit your backlinks, I'd spend more time reviewing on-page factors. When Panda first came out, I spent a lot of time looking at thin and duplicate content. While that stuff, especially thin content, is still important, there are a LOT more factors that go into Panda. I firmly believe now that Panda is Google's way of figuring out what users prefer to see. As such, I focus on three things:
-
Improving technical SEO.
-
Removing or vastly improving thin content. (And yes, redirecting to an appropriate page or category page is fine.)
-
Figuring out how the site can be the best of its kind for users. This involves looking at competitors' sites with a non-biased eye and also looking at analytics data to see if there are pages in the index that users are not happy with (i.e. they consistently bounce or spend very little time on the page compared to others.)
Here is more on my approach to Panda:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
Best of luck!
Marie
-
Wow, this definitely helps, Marie! I didn't expect to receive a helping hand from such an SEO expert as you. Thank you so much!
First of all, there is no manual action showing in Webmaster Tools.
Thank you for bringing attention to on-site issues - that's actually what I am doing now. Over 8 years we published more than 2,000 articles, and right now I am cleaning them up after seeing that 20% of the articles receive 80% of the traffic.
By any chance, can you suggest the best way to deal with deleted posts? Right now I am 301-redirecting each of them to a related, similar post, or to the home page if none is suitable, trying not to leave 404s. What would you recommend?
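For context, on an Apache setup such redirects look roughly like this in .htaccess (the paths below are placeholders, not our real URLs):

```apache
# 301 a removed post to the closest related article (placeholder paths)
Redirect 301 /old-thin-article/ http://www.1stwebdesigner.com/related-similar-post/

# When no related post fits, fall back to the home page
Redirect 301 /another-removed-article/ http://www.1stwebdesigner.com/
```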
Really good points. Well, I never syndicated posts; a lot of the link building involved submitting them to related web design news websites (Pligg/Digg-style voting sites, or sites where the news appeared in a sidebar).
I did exchange blogroll links with maybe 10 friends' sites at most, but nothing massive.
And regarding May 20 - the drop indeed happened then, comparing Tuesday, May 20 to Monday, May 19.
Actually, after some research - following your disavow article on how to work with massive amounts of links in spreadsheets, and trying out the Detox tool - I found there are certainly links I would want to disavow, but it's not as significant as I thought. I really think what you are saying is right: Google simply stopped counting some links, and that's why the rankings dropped.
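For reference, the file I'm preparing follows Google's plain-text disavow format - roughly like this (the domains and URL below are placeholders, not the actual links):

```text
# Spammy directories and link farms found via Ahrefs / Detox (placeholder domains)
domain:spammy-directory.example
domain:link-farm.example

# A single bad page, rather than a whole domain
http://bad-site.example/page-linking-to-us.html
```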
At least the action plan is clear: first focus on fixing everything possible on-site before touching the Disavow tool. I really, really appreciate your helpful reply and the time you took to give such insightful feedback to a simple guy like me.
Thank you again, Marie!
-
First off, thank you, Peter, for the recommendations of my articles.
It looks like there are a number of issues with your site. I wouldn't focus so much on just one algorithm, but rather, spend time looking at the entire site and its issues.
What phrases are you trying to rank for? Your title tag on your home page says, "The Community of Web Design Professionals" which I am guessing is not your money keyword. Keywords in the title tag (without stuffing), especially at the front of the tag are very important.
Next, I would get rid of the huge full page popup. It could possibly cause the page layout algo to affect your rankings. It can also cause people to quickly bounce away and look for another site which can be a signal to Google that users don't enjoy your site.
The Google PageSpeed Insights tool gives you a very low score. Improving your pagespeed could make a significant difference.
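For example, on an Apache server two of the most common quick wins - compression and browser caching - look roughly like this (a sketch assuming mod_deflate and mod_expires are enabled; your setup may differ):

```apache
# Compress text-based responses before sending them (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static assets instead of re-downloading them (requires mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
```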
You've got 1,490 pages in Google's index. Do you really have 1,490 high-quality pages? Is it possible that some content is indexed that doesn't need to be? For example, when I take text from this article: http://www.1stwebdesigner.com/find-qualified-designer/ and search for it in quotes on Google, I see 27 results. The only pages from your site that you want in Google's index are high-quality, unique, and compelling articles and resources. With that said, if you wrote this article and other people copied it, I wouldn't be worried. But if you purposely syndicated it, then it could be an issue.
It's hard to say whether links are the issue. It doesn't seem like any of your drops coincided with a Penguin date, so I am guessing that Penguin is not suppressing you (I can't say for sure without a deep look). But you mentioned that you had created a lot of links on your own, and it is possible that you were previously getting some benefit from them. As Google gets better and better at distinguishing truly earned links from self-made ones, it may be discounting those links, which means you've lost significant PageRank that used to flow through them.
With that said, have you checked Google Search Console --> Search Traffic --> Manual actions to make sure there is no manual action there? I see a lot of blogroll links and sometimes if there has been excessive link exchanging going on a site can get a manual action.
Now, back to the algo stuff. Can you tell via Google Analytics whether the big drop in May happened on May 20, 2014? If not, it's not Panda. If it is exactly on that date, though, then I'd spend more time looking at on-site issues than at links.
Hope that helps!
Marie
-
@Peter - I read everything and was in the process of doing the audit, but Google Sheets keeps crashing. The site has around 1 million backlinks, and the formulas keep crashing even when working with just the 100,000 links from Webmaster Tools alone. Any tips?
-
Thanks, Peter - on my way then. Excellent help, really appreciate it. I'll let you know how it goes, if you're interested.
-
So you need to get all links from:
- Search Console
- Ahrefs
- Moz OSE
- Majestic - you need to verify your domain, and then you can download the fresh list for free. For historical data you need a paid account.
- WebMeUp - free with some limitations; a paid version is also available.
Once you get everything, you need to compose a mega sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
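As a sketch, if the list gets too big for a spreadsheet, you can do the same merge with a small script - for example in Python with pandas (the file names and column names below are placeholders; check the actual headers in your own exports):

```python
import pandas as pd

# One entry per tool: export file name -> name of its URL column.
# Both are placeholders - each tool labels its export columns differently.
sources = {
    "search_console.csv": "Links",
    "ahrefs.csv": "Referring Page URL",
    "moz_ose.csv": "URL",
    "majestic.csv": "SourceURL",
}

frames = []
for filename, url_column in sources.items():
    df = pd.read_csv(filename)
    df = df.rename(columns={url_column: "url"})
    df["source"] = filename  # remember which tool reported the link
    frames.append(df[["url", "source"]])

# Combine everything into one "mega sheet" and drop duplicate URLs.
mega = pd.concat(frames, ignore_index=True).drop_duplicates(subset="url")

# Pull out the linking domain so the audit can be done domain by domain.
mega["domain"] = mega["url"].str.extract(r"https?://(?:www\.)?([^/]+)", expand=False)

mega.to_csv("mega_sheet.csv", index=False)
print(f"{len(mega)} unique linking URLs from {mega['domain'].nunique()} domains")
```

This handles link lists that are far too big for spreadsheet formulas, and the resulting CSV can still be opened in Excel for the audit itself.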
But you also need to do a "Panda audit". This is a little bit weird, because you need to read almost everything from many authoritative people. Of all of them, I can recommend Glenn Gabe and Josh Bachynski - the first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html (they have also written for other sites like Moz, SEL, SEJ, etc.). Of course, this doesn't mean there aren't other authors, for example:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
So let's go back to the Panguin tool. You need a very in-depth inspection in Analytics, comparing your drop dates against the algorithm release dates:
https://moz.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO there is also a huge chance that you've been hit by both algos.
-
Thank you so much, Peter, for your helpful reply. The PSD2HTML case is new to me; I had already read about WPMU's crazy moves and the disavow guide, and I started working on it yesterday. It's just that a friend of mine who works at Ahrefs is still convinced I shouldn't go down the Disavow path.
Alright, working on the audit then - thank you for the links!
What other tools would you recommend, if you say Ahrefs alone isn't enough?
-
First, there is definitely some kind of "algo filter" on your website.
Second, you need to do an audit of the website, because there can be crossing updates:
https://moz.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why you can't be sure, from an interactive chart alone, which filter has been applied to your site. Panda and Penguin now apply their filtering much more quickly, and you need a refresh to escape the filter.
About disavowing: I strongly recommend reading Marie Haynes' articles:
https://moz.com/blog/guide-to-googles-disavow-tool
https://moz.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles on this)
https://moz.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is a continuation of the previous article.
https://moz.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But you can't get by with Ahrefs alone - you also need other tools.