Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and the Payday Loan 2.0/3.0 Updates?
-
Hello, I'd really appreciate your feedback here.
So my site is http://www.1stwebdesigner.com
I just used http://barracuda.digital/panguin-tool/ and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and attached is a Google Analytics screenshot for that period, especially May 2014.
Can you please help me confirm that we have indeed been hit with a penalty? It has been 1.5 years already, and since then the traffic has never recovered.
Before May 2014 our site received 1.5 million pageviews/mo; since then, traffic has never exceeded 600,000 pageviews/mo. If I look back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update on Thu, Jan 19, 2012. Before all these updates, the site had grown to 2.5 million pageviews/mo.
The painful part is that I always focused on time-consuming link building: submitting links to design news sites, leaving quality comments under related articles, and writing unique, quality content on the site for years.
Can you tell, based on the screenshots, whether we indeed have a penalty? And maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. Would love to hear feedback on this - it has been painful over the years.
Update: here is one more Barracuda screenshot, showing traffic from 2012 onward - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw
-
You're very welcome.
OK, so if we can see that the drop happened May 20, 2014, then there is a high possibility that Panda has affected your site. We can't always say that with certainty, though. I remember that in the week following that date, I had three separate sites come to me saying that they had been horribly hit by Panda. One of the sites had done a redesign and launched May 19. The redesign changed all of the title tags and the URLs, so it was not Panda's fault that they dropped in rankings. Another site had also redesigned, and the developer forgot to put the Google Analytics tracking code on the site - so rankings didn't change, but it looked like traffic plummeted. And the final site had accidentally noindexed 90% of their site. I'm not kidding!
Still, a big drop on May 20 (give or take one day) means that there are on-site quality issues to address. While it's never wrong to audit your backlinks, I'd spend more time reviewing on-page factors. When Panda first came out, I spent a lot of time looking at thin and duplicate content. While that stuff, especially thin content, is still important, there are a LOT more factors that go into Panda. I firmly believe now that Panda is Google's way of figuring out what users prefer seeing. As such, I focus on three things:
-
Improving technical SEO.
-
Removing or vastly improving thin content. (And yes, redirecting to an appropriate page or category page is fine.)
-
Figuring out how the site can be the best of its kind for users. This involves looking at competitors' sites with a non-biased eye and also looking at analytics data to see if there are pages in the index that users are not happy with (i.e. they consistently bounce or spend very little time on the page compared to others).
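For that last point, here is a minimal sketch (in Python/pandas) of the kind of analytics check I mean. The file name and column names ("page", "pageviews", "bounce_rate", "avg_time_on_page") are hypothetical - match them to whatever your Google Analytics export actually contains:

```python
import pandas as pd

# A minimal sketch, assuming a per-page Google Analytics export to CSV.
df = pd.read_csv("ga_pages_export.csv")

site_bounce = df["bounce_rate"].mean()
site_time = df["avg_time_on_page"].mean()

# Flag pages that bounce well above the site average AND hold visitors
# for well under half the average time on page.
weak = df[(df["bounce_rate"] > site_bounce * 1.25) &
          (df["avg_time_on_page"] < site_time * 0.5)]

# Review the worst offenders with meaningful traffic first.
print(weak.sort_values("pageviews", ascending=False).head(25))
```

The thresholds (1.25x bounce, 0.5x time) are arbitrary starting points; the goal is just to surface pages worth a human look, not to make the decision for you.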
Here is more on my approach to Panda:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
Best of luck!
Marie
-
Wow, this definitely helps, Marie! I didn't expect to receive a helping hand from such an SEO expert as you. Thank you so much!
First of all, there is no manual action in Webmaster Tools.
Thank you for bringing attention to on-site issues - that's actually what I am doing now. Over 8 years we published more than 2,000 articles, and right now I am cleaning them out after seeing that 20% of the articles receive 80% of the traffic.
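For reference, a minimal Python/pandas sketch of how such a 20/80 split can be computed from a Google Analytics pages export (the file and column names here are hypothetical):

```python
import pandas as pd

# A minimal sketch, assuming a hypothetical GA export with
# "page" and "pageviews" columns.
df = pd.read_csv("ga_pages_export.csv").sort_values("pageviews", ascending=False)

# Cumulative share of traffic, pages ordered from most to least viewed.
share = df["pageviews"].cumsum() / df["pageviews"].sum()
top = max(int(len(df) * 0.2), 1)
print(f"Top 20% of articles ({top} pages) drive {share.iloc[top - 1]:.0%} of pageviews")
```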
By any chance, can you suggest the best way to deal with deleted posts? Right now I am redirecting them all with a 301 redirect to a related, similar post, or, if none is suitable, to the home page, trying not to leave 404s. What would you recommend doing?
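As a sanity check on a batch of redirects like this, a minimal Python sketch could verify that each old URL actually returns a 301 to the intended target. The redirects.csv file (with an "old_url,new_url" header) is a hypothetical input I'm assuming here:

```python
import csv
import requests

# A minimal sketch: verify each retired post 301s to its intended target.
# Assumes a hypothetical redirects.csv with header "old_url,new_url".
with open("redirects.csv", newline="") as f:
    for row in csv.DictReader(f):
        resp = requests.get(row["old_url"], allow_redirects=False, timeout=10)
        if resp.status_code != 301:
            print(f'{row["old_url"]}: expected 301, got {resp.status_code}')
            continue
        target = resp.headers.get("Location", "")
        if target.rstrip("/") != row["new_url"].rstrip("/"):
            print(f'{row["old_url"]}: points to {target}, expected {row["new_url"]}')
```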
Really good points. Well, I never syndicated posts; a lot of the link building involved submitting them to related web design news websites (Pligg/Digg-style voting sites, or sites where the news appeared in a sidebar).
I did exchange blogroll links with maybe 10 friends' sites at most, but nothing massive.
And regarding May 20 - the drop indeed happened exactly then, comparing Tuesday, May 20 to Monday, May 19.
Actually, after research - following your disavow article on working with massive amounts of links in spreadsheets, and trying out the Detox tool - I found there are certainly links I would want to disavow, but the problem is not as significant as I thought. I really think what you are saying is right: Google simply stopped counting some links, and that's why rankings dropped.
The action plan is clear, at least: first focus on fixing everything possible on-site before touching the Disavow tool. Really, really appreciate your helpful reply and taking the time to give such insightful feedback to a simple guy like me.
Thank you again, Marie!
-
First off, thank you Peter for the recommendations of my articles.
It looks like there are a number of issues with your site. I wouldn't focus so much on just one algorithm, but rather, spend time looking at the entire site and its issues.
What phrases are you trying to rank for? The title tag on your home page says, "The Community of Web Design Professionals", which I am guessing is not your money keyword. Keywords in the title tag (without stuffing), especially at the front of the tag, are very important.
Next, I would get rid of the huge full page popup. It could possibly cause the page layout algo to affect your rankings. It can also cause people to quickly bounce away and look for another site which can be a signal to Google that users don't enjoy your site.
The Google PageSpeed Insights tool gives you a very low score. Improving page speed could make a significant difference.
You've got 1,490 pages in Google's index. Do you really have 1,490 high-quality pages? Is it possible that there is some content indexed that doesn't need to be? For example, when I take text from this article: http://www.1stwebdesigner.com/find-qualified-designer/ and search for it in quotes on Google, I see 27 results. The only pages from your site that you want in Google's index are high-quality, unique, and compelling articles and resources. With that said, if you wrote this article and other people copied it, I wouldn't be worried. But if you purposely syndicated it, then it could be an issue.
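If you want to quantify that overlap rather than eyeballing quoted searches, a rough shingle-comparison sketch like the one below can score how much two pages share. To be clear, this is not Google's method - just a quick local approximation:

```python
import re

def shingles(text, k=8):
    """All k-word shingles of normalized text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def overlap(a, b):
    """Jaccard similarity of two texts' shingle sets (0 = distinct, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0

# Usage: compare your article's body text against a suspected copy.
# score = overlap(open("my_article.txt").read(), open("copy.txt").read())
```

Anything scoring above, say, 0.5 is substantially the same article and worth investigating.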
It's hard to say whether links are the issue. It doesn't seem like any of your drops coincided with a Penguin date, so I am guessing that Penguin is not suppressing you. (I can't say for sure without having a deep look.) But it is possible that you were previously getting some benefit from self-made links that Google can now determine are self-made as opposed to truly earned, and it may be discounting those links. You mentioned that you had created a lot of links on your own. As Google gets better and better at figuring out which links are truly earned as opposed to self-made, it may be that you've lost significant PageRank that used to come through those links.
With that said, have you checked Google Search Console --> Search Traffic --> Manual Actions to make sure there is no manual action there? I see a lot of blogroll links, and sometimes, if there has been excessive link exchanging going on, a site can get a manual action.
Now, back to the algo stuff. Can you tell via Google Analytics if the big drop in May happened on May 20, 2014? If not, it's not Panda. If it was exactly on that date, though, then I'd spend more time looking at on-site issues than links.
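If it helps, here's a minimal sketch for pinpointing that date from a Google Analytics daily export. The file name and the "date"/"sessions" column names are hypothetical - adjust them to your export:

```python
import pandas as pd

# A minimal sketch, assuming a hypothetical GA daily sessions export.
daily = pd.read_csv("ga_daily_sessions.csv", parse_dates=["date"])
daily = daily.sort_values("date").set_index("date")

# Day-over-day percentage change in sessions; the minimum is the worst drop.
change = daily["sessions"].pct_change()
print(f"Biggest drop: {change.idxmin().date()} ({change.min():.1%})")
# If this prints 2014-05-20, Panda 4.0 becomes a strong suspect.
```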
Hope that helps!
Marie
-
@Peter - I read everything and was in the process of doing the audit, but Google Sheets keeps crashing. The site has around 1 million backlinks, and the formulas keep crashing even when working with just the 100,000 links from Webmaster Tools alone. Any tips?
-
Thanks Peter, on my way then - excellent help, really appreciate it. Will let you know how it goes, if you're interested.
-
So you need to get all links from:
- Search Console
- Ahrefs
- Moz OSE
- Majestic - you need to verify your domain, and then you can download the Fresh Index list for free. For the Historic Index you need a paid account.
- WebMeUp - free with some limitations; a paid version is also available.
Once you have everything, you need to compose a mega sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
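A minimal sketch of that merge step in Python/pandas - the file names and the "source_url" column are hypothetical, so rename them to match what each tool actually exports. At this scale, pandas will also cope where Google Sheets chokes:

```python
import pandas as pd

# A minimal sketch: merge link exports from several tools into one
# deduplicated "mega sheet". File and column names are hypothetical.
exports = {
    "search_console": "gsc_links.csv",
    "ahrefs": "ahrefs_links.csv",
    "moz_ose": "moz_links.csv",
    "majestic": "majestic_links.csv",
    "webmeup": "webmeup_links.csv",
}

frames = []
for tool, path in exports.items():
    df = pd.read_csv(path)
    df["tool"] = tool  # remember which tool reported each link
    frames.append(df[["source_url", "tool"]])

merged = (pd.concat(frames)
            .drop_duplicates(subset="source_url")  # one row per linking URL
            .sort_values("source_url"))
merged.to_csv("mega_sheet.csv", index=False)
print(f"{len(merged)} unique linking URLs")
```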
But you also need to do a "Panda audit". And this is a little bit weird, because you need to read almost everything from many authoritative people. Of all of them, I can recommend Glenn Gabe and Josh Bachynski. The first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html. Of course, they have also written for other sites like Moz, SEL, SEJ, etc. And of course this doesn't mean there aren't other authors, like:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
So let's go back to the Panguin tool. You need a very in-depth inspection in Analytics, comparing your drop dates against the algorithm release dates:
https://moz.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO there is also a huge chance that you were hit by both algos.
-
Thank you so much, Peter, for your helpful reply. The PSD2HTML case is new to me; I had already read about WPMU's crazy moves and the disavow guide, and started working on it yesterday. It's just that a friend of mine who works at Ahrefs is still convinced I shouldn't go down the Disavow path.
Alright, working on the audit then - thank you for the links!
What other tools would you recommend, if you say Ahrefs alone isn't enough?
-
First, there is definitely some kind of "algo filter" on your website.
Second, you need to audit the website, because there can be crossing updates:
https://moz.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why you can't be sure which filter has been applied to your site from an interactive chart alone. Panda and Penguin now filter much more quickly, and a site needs a refresh to escape the filter. About disavowing: I strongly recommend reading Marie Haynes' articles:
https://moz.com/blog/guide-to-googles-disavow-tool
https://moz.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles on this)
https://moz.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is a continuation of the previous article.
https://moz.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But you can't get by with Ahrefs alone - you also need other tools.