Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and Payday Loans 2.0/3.0?
-
Hello, I would really appreciate your feedback here.
So my site is http://www.1stwebdesigner.com
I just used http://barracuda.digital/panguin-tool/ - and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and in the attachment there is a Google Analytics screenshot for that period, especially May 2014.
Can you please help me confirm that we have indeed been hit with a penalty? It has been 1.5 years already, and since then the traffic has never recovered.
Before May 2014 our site received 1.5 million pageviews/month; since then, traffic has never been more than 600,000 pageviews/month. If I look back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update on Thursday, January 19, 2012. Before all these updates, the site had grown to 2.5 million pageviews/month.
The painful part is that I always focused on time-consuming link building: submitting links to design news sites, leaving quality comments under related articles, and writing unique, quality content on the site for years.
Can you tell from the screenshots and confirm that we do indeed have a penalty, and maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. I would love to hear your feedback on this; it has been painful over the years.
Update: here is one more screenshot from Barracuda, showing traffic from 2012 onward - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw
-
You're very welcome.
OK, so if we can see that the drop happened May 20, 2014, then there is a high possibility that Panda has affected your site. We can't always say that with certainty, though. I remember the week following that date, I had three separate sites come to me saying that they had been horribly hit by Panda. One of the sites had done a redesign and launched May 19. The redesign changed all of the title tags and the URLs, so it was not Panda's fault that they dropped in rankings. Another site had also redesigned, and the developer forgot to put the Google Analytics tracking code on the site. So, rankings didn't change, but it looked like traffic plummeted. And the final site had accidentally noindexed 90% of their site. I'm not kidding!
Still, a big drop on May 20 (give or take one day) means that there are on-site quality issues to address. While it's never wrong to audit your backlinks, I'd spend more time reviewing on-page factors. When Panda first came out I spent a lot of time looking at thin and duplicate content. While that stuff, especially thin content, is still important, there are a LOT more factors that go into Panda. I firmly believe now that Panda is Google's way of figuring out what users prefer seeing. As such, I focus on three things:
-
Improving technical SEO.
-
Removing or vastly improving thin content. (And yes, redirecting to an appropriate page or category page is fine.)
-
Figuring out how the site can be the best of its kind for users. This involves looking at competitors' sites with an unbiased eye, and also looking at analytics data to see if there are pages in the index that users are not happy with (i.e. they consistently bounce, or spend very little time on the page compared to others).
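A minimal sketch of that last check, assuming a hypothetical analytics export (analytics_export.csv) with per-page pageviews, bounce rate, and average time on page in seconds:

```python
import pandas as pd

# Hypothetical export columns: page, pageviews, bounce_rate, avg_time_on_page.
df = pd.read_csv("analytics_export.csv")

# Use site-wide medians as the baseline for "compared to others".
median_bounce = df["bounce_rate"].median()
median_time = df["avg_time_on_page"].median()

# Flag pages that bounce notably more and hold attention notably less
# than the typical page; the 1.5x / 0.5x thresholds are judgment calls.
suspects = df[
    (df["bounce_rate"] > 1.5 * median_bounce)
    & (df["avg_time_on_page"] < 0.5 * median_time)
].sort_values("pageviews", ascending=False)

# Review the highest-traffic offenders first.
print(suspects.head(20).to_string(index=False))
```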
Here is more on my approach to Panda:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
Best of luck!
Marie
-
-
Wow, this definitely helps, Marie! I didn't expect to receive a helping hand from such an SEO expert as you. Thank you so much!
First of all, in Google Webmaster Tools there is no manual action.
Thank you for bringing attention to on-site issues; that's actually what I am working on now. Over 8 years we published more than 2,000 articles, and right now I am cleaning them out after seeing that 20% of the articles receive 80% of the traffic.
By any chance, can you suggest the best way to deal with deleted posts? Right now I am redirecting them all with a 301 redirect to a related, similar post, or if none is suitable, to the home page, trying not to leave 404s. What would you recommend?
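(For anyone doing the same cleanup, a minimal sketch for generating such rules in bulk - it assumes a hypothetical redirects.csv of old-path,new-path pairs and an Apache server:)

```python
import csv

# redirects.csv is assumed to hold one "old_path,new_path" pair per line,
# e.g. "/some-deleted-post/,/a-related-post/".
with open("redirects.csv", newline="") as src, open("redirects.conf", "w") as out:
    for row in csv.reader(src):
        if len(row) != 2:
            continue  # skip blank or malformed rows
        old_path, new_path = row
        # Apache mod_alias syntax; the nginx equivalent would be
        # "rewrite ^/old-path/?$ /new-path/ permanent;".
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```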
Really good points. Well, I never syndicated posts; a lot of the link building involved submitting them to related web design news websites (Pligg- and Digg-style voting sites, or sites where the news appeared in a sidebar).
I did exchange blogroll links with maybe 10 friends' sites at most, but nothing massive.
And regarding May 20, the drop indeed happened then, comparing Tuesday, May 20 to Monday, May 19.
Actually, after research, following your disavow article on working with massive amounts of links in spreadsheets, and trying out the Detox tool, I found there are certainly links I would want to disavow, but it's not as significant as I thought. I really think what you are saying is right - that Google simply stopped counting some links, and that's why rankings dropped.
At least the action plan is clear: first focus on fixing everything possible on-site before touching the Disavow tool. I really, really appreciate your helpful reply and the time you took to give such insightful feedback to a simple guy like me.
Thank you again, Marie!
-
First off, thank you Peter for the recommendations of my articles.
It looks like there are a number of issues with your site. I wouldn't focus so much on just one algorithm, but rather, spend time looking at the entire site and its issues.
What phrases are you trying to rank for? Your title tag on your home page says, "The Community of Web Design Professionals", which I am guessing is not your money keyword. Keywords in the title tag (without stuffing), especially at the front of the tag, are very important.
Next, I would get rid of the huge full-page popup. It could possibly cause the page layout algorithm to affect your rankings. It can also cause people to quickly bounce away and look for another site, which can be a signal to Google that users don't enjoy your site.
The Google PageSpeed Insights tool gives you a very low score. Improving page speed could make a significant difference.
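If you want to track that score over time, it can be pulled from the public PageSpeed Insights API. A minimal sketch; the v5 endpoint and response shape below reflect the current API and are worth double-checking against Google's documentation:

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(API, params={
    "url": "http://www.1stwebdesigner.com",
    "strategy": "mobile",  # also try "desktop"; an API key is optional for light use
})
resp.raise_for_status()
data = resp.json()

# Lighthouse reports performance as a 0-1 score.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```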
You've got 1,490 pages in Google's index. Do you really have 1,490 high-quality pages? Is it possible that there is some content indexed that doesn't need to be? For example, when I take text from this article: http://www.1stwebdesigner.com/find-qualified-designer/ and search for it in quotes on Google, I see 27 results. The only pages from your site that you want in Google's index are high-quality, unique, and compelling articles and resources. With that said, if you wrote this article and other people copied it, I wouldn't be worried. But if you purposely syndicated it, then it could be an issue.
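The same kind of check can also be pointed inward, to catch near-duplicate pages within your own site. A rough sketch, assuming you already have each page's extracted text saved locally (e.g. from a crawl) as .txt files:

```python
import difflib
import itertools
import pathlib

pages = {p.name: p.read_text() for p in pathlib.Path("page_texts").glob("*.txt")}

# Pairwise comparison is O(n^2) and SequenceMatcher.ratio() is expensive,
# so use quick_ratio() (an upper bound on ratio) as a cheap first pass.
for (name_a, text_a), (name_b, text_b) in itertools.combinations(pages.items(), 2):
    matcher = difflib.SequenceMatcher(None, text_a, text_b)
    if matcher.quick_ratio() < 0.8:
        continue  # prunes most pairs before the costly check
    ratio = matcher.ratio()
    if ratio > 0.8:  # similarity threshold is a judgment call; tune on your data
        print(f"{name_a} <-> {name_b}: {ratio:.0%} similar - review for duplication")
```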
It's hard to say whether links are the issue. It doesn't seem like any of your drops coincided with a Penguin date, so I am guessing that Penguin is not suppressing you. (I can't say for sure without having a deep look.) But it is possible that you were previously getting some benefit from self-made links that Google can now determine are self-made, as opposed to truly earned, and they may be discounting those links. You mentioned that you had created a lot of links on your own. As Google gets better and better at figuring out which links are truly earned as opposed to self-made, it may be that you've lost significant PageRank that used to come through these links.
With that said, have you checked Google Search Console --> Search Traffic --> Manual Actions to make sure there is no manual action there? I see a lot of blogroll links, and sometimes, if there has been excessive link exchanging going on, a site can get a manual action.
Now, back to the algo stuff. Can you tell via Google Analytics if the big drop in May happened on May 20, 2014? If not, it's not Panda. If it is exactly on this date, though, then I'd be spending more time looking at on-site issues than links.
Hope that helps!
Marie
-
@Peter - I read everything and was in the process of doing the audit, but Google Sheets keeps crashing. The site has around 1 million backlinks, and the formulas keep crashing when working with even 100,000 links from Webmaster Tools alone. Any tips?
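One practical tip: spreadsheets struggle at this scale, but a short script does not. A minimal sketch that streams a hypothetical million-row Ahrefs export in chunks and reduces it to unique referring domains (the column header is an assumption - adjust it to match your export):

```python
import pandas as pd
from urllib.parse import urlparse

domains = set()
# chunksize keeps memory use flat, even for million-row files.
for chunk in pd.read_csv("ahrefs_backlinks.csv", chunksize=100_000):
    column = chunk["Referring Page URL"].dropna()  # assumed header name
    domains.update(urlparse(url).netloc for url in column)

print(f"{len(domains)} unique referring domains to review")
```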
-
Thanks, Peter. On my way then - excellent help, really appreciate it. I'll let you know how it goes, if you're interested.
-
So you need to get all links from:
- Search Console
- Ahrefs
- Moz OSE
- Majestic - you need to verify your domain, and then you can download the Fresh Index list for free. For the Historic Index you need a paid account.
- WebMeUp - free with some limitations; a paid version is also available.
Once you have everything, you need to compose a mega-sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
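A minimal sketch of composing that mega-sheet in code rather than in a spreadsheet; the file names and column headers below are assumptions to adjust against your actual exports:

```python
import pandas as pd

# Map each tool's export file to the column that holds the linking URL.
exports = {
    "search_console": ("gsc_links.csv", "Linking page"),
    "ahrefs": ("ahrefs.csv", "Referring Page URL"),
    "moz_ose": ("moz_ose.csv", "URL"),
    "majestic": ("majestic.csv", "SourceURL"),
    "webmeup": ("webmeup.csv", "Backlink URL"),
}

frames = []
for tool, (path, url_column) in exports.items():
    df = pd.read_csv(path, usecols=[url_column]).rename(columns={url_column: "link_url"})
    df["source_tool"] = tool
    frames.append(df)

# One deduplicated master list, noting which tool(s) reported each link.
master = (
    pd.concat(frames, ignore_index=True)
    .groupby("link_url")["source_tool"]
    .apply(lambda tools: ",".join(sorted(set(tools))))
    .reset_index()
)
master.to_csv("all_links_master.csv", index=False)
print(f"{len(master)} unique link URLs across {len(exports)} tools")
```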
But you also need to do a "Panda audit". This is a little bit weird, because you need to read almost everything from many authoritative people. Of all of them, I can recommend Glenn Gabe and Josh Bachynski; the first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html. Of course, they have also written for other sites like Moz, SEL, SEJ, etc. And this doesn't mean there aren't other authors, for example:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
So let's go back to the Panguin tool. You need a very in-depth inspection in Analytics, comparing your dates against the dates of algorithm releases:
https://moz.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO, there is also a huge chance that you've been hit by both algos.
-
Thank you so much, Peter, for your helpful reply. The PSD2HTML case is new to me; I've already read about WPMU's crazy moves and the disavow guide, and I started working on it yesterday. It's just that a friend of mine who works at Ahrefs is still convinced that I shouldn't go down the disavow path.
Alright, working on the audit then - thank you for the links!
What other tools would you recommend, if you say Ahrefs alone isn't enough?
-
-
First, there is definitely some kind of "algo filter" on your website.
Second, you need to audit the website, because there can be crossing updates:
https://moz.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why you can't be sure, from an interactive chart alone, which filter has been applied to your site. Panda and Penguin now filter much more quickly, and a site needs a refresh to escape the filter.
About disavowing: I strongly recommend reading Marie Haynes' articles:
https://moz.com/blog/guide-to-googles-disavow-tool
https://moz.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles on this)
https://moz.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is a continuation of the previous article.
https://moz.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But you can't get by with Ahrefs alone - you also need other tools.
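And once the audit settles which domains to cut, generating the disavow file itself is simple. A minimal sketch, assuming a hypothetical bad_domains.txt with one domain per line; the "domain:" prefix and "#" comment lines are Google's documented disavow file syntax:

```python
from datetime import date

with open("bad_domains.txt") as src, open("disavow.txt", "w") as out:
    out.write(f"# Disavow file generated {date.today()} after a manual link audit\n")
    for line in src:
        domain = line.strip()
        if domain:
            # A domain: entry covers every URL on that host.
            out.write(f"domain:{domain}\n")
```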