Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and Payday Loans 2.0/3.0?
-
Hello, I was really hoping for your feedback here.
So my site is http://www.1stwebdesigner.com
I just used http://barracuda.digital/panguin-tool/ - and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and in the attachment there is a Google Analytics screenshot for that period, especially May 2014.
Can you please help me confirm that we have indeed been hit with the penalty? It has been 1.5 years already, and the traffic has never recovered since.
Before May 2014 our site received 1.5 million pageviews/month; from then until now, traffic has never exceeded 600,000 pageviews/month. Looking back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update on Thu, Jan 19, 2012. Before all these updates, the site had grown to 2.5 million pageviews/month.
The painful part is that I always focused on time-consuming link building (submitting links to design news sites, leaving quality comments under related articles) and have written unique, quality content on the site for years.
Based on the screenshots, can you confirm that we indeed have a penalty, and maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. I would love to hear your feedback on this; it has been painful over the years.
Update: here is one more screenshot from Barracuda, showing traffic from 2012 onward - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw
-
You're very welcome.
OK, so if we can see that the drop happened May 20, 2014, then there is a high possibility that Panda has affected your site. We can't always say that with certainty, though. I remember that in the week following that date, three separate sites came to me saying that they had been horribly hit by Panda. One of the sites had done a redesign and launched May 19. The redesign changed all of the title tags and the URLs, so it was not Panda's fault that they dropped in rankings. Another site had also redesigned, and the developer forgot to put the Google Analytics tracking code on the site; rankings didn't change, but it looked like traffic plummeted. And the final site had accidentally noindexed 90% of their pages. I'm not kidding!
Still, a big drop on May 20 (give or take one day) means that there are on-site quality issues to address. While it's never wrong to audit your backlinks, I'd spend more time reviewing on-page factors. When Panda first came out, I spent a lot of time looking at thin and duplicate content. While that stuff, especially thin content, is still important, there are a LOT more factors that go into Panda. I firmly believe now that Panda is Google's way of figuring out what users prefer seeing. As such, I focus on three things:
-
Improving technical SEO.
-
Removing or vastly improving thin content. (And yes, redirecting to an appropriate page or category page is fine.)
-
Figuring out how the site can be the best of its kind for users. This involves looking at competitors' sites with a non-biased eye and also looking at analytics data to see if there are pages in the index that users are not happy with (i.e. they consistently bounce or spend very little time on the page compared to others.)
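To make that last point concrete, here is a small sketch of how you might flag pages whose engagement is far below the site's norm from an analytics export. The field names, thresholds, and numbers are all made-up assumptions; adjust them to whatever your actual export contains.

```python
# Sketch: flag potentially low-quality pages from an exported analytics report.
# Field names and thresholds below are assumptions; rename/tune to your export.
from statistics import median

def flag_weak_pages(pages, bounce_factor=1.25, time_factor=0.5):
    """Pages that bounce far more, and hold visitors far less, than the site median."""
    median_bounce = median(p["bounce_rate"] for p in pages)
    median_time = median(p["avg_time_on_page"] for p in pages)
    weak = [p for p in pages
            if p["bounce_rate"] > median_bounce * bounce_factor
            and p["avg_time_on_page"] < median_time * time_factor]
    # Look at the worst offenders with the most traffic first.
    return sorted(weak, key=lambda p: p["pageviews"], reverse=True)

# Tiny illustration with invented numbers:
pages = [
    {"page": "/good-guide/", "pageviews": 12000, "bounce_rate": 0.45, "avg_time_on_page": 180},
    {"page": "/thin-post-1/", "pageviews": 300, "bounce_rate": 0.92, "avg_time_on_page": 15},
    {"page": "/thin-post-2/", "pageviews": 150, "bounce_rate": 0.88, "avg_time_on_page": 20},
    {"page": "/solid-tutorial/", "pageviews": 8000, "bounce_rate": 0.50, "avg_time_on_page": 240},
]
weak = flag_weak_pages(pages)
```

Pages this flags are candidates for improvement or removal, not automatic deletions; always eyeball them first.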
Here is more on my approach to Panda:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
Best of luck!
Marie
-
-
Wow, this definitely helps, Marie! I didn't expect to receive a helping hand from such an SEO expert as you.
Thank you so much!
First of all, there is no manual action in Webmaster Tools.
Thank you for bringing attention to on-site issues; that's actually what I am doing now. Over 8 years we published more than 2,000 articles, and right now I am cleaning them up after seeing that 20% of the articles receive 80% of the traffic.
By any chance, can you suggest the best way to deal with deleted posts? Right now I am 301-redirecting each one to a related, similar post, or to the home page if none is suitable, trying not to leave 404s. What would you recommend doing?
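To illustrate my current approach: with this many deleted posts, I keep the old-URL to new-URL mapping in one place and generate the server rules from it, rather than adding redirects by hand. A rough sketch (the paths are made-up examples, and the .htaccess output is just one option; a WordPress redirect plugin would do the same job):

```python
# Sketch: turn a mapping of deleted posts -> replacement URLs into Apache
# "Redirect 301" lines for an .htaccess file. Every path below is a made-up
# example, not a real 1stwebdesigner URL.
redirect_map = {
    "/old-thin-roundup/": "/design-roundups/",        # closest related category
    "/2009-outdated-tutorial/": "/updated-tutorial/", # closest similar post
    "/orphaned-announcement/": "/",                   # no good match: home page
}

def htaccess_rules(mapping):
    """Render one 'Redirect 301 <from> <to>' line per deleted post."""
    return "\n".join(
        f"Redirect 301 {src} {dst}" for src, dst in sorted(mapping.items())
    )

rules = htaccess_rules(redirect_map)
```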
Really good points. Well, I never syndicated posts; a lot of my link building involved submitting them to related web design news websites (Pligg- and Digg-style voting sites, or sites where the news appeared in a sidebar).
I did exchange blogroll links with at most 10 friends' sites, but nothing massive.
And regarding May 20: it did indeed happen then, comparing Tuesday, May 20 to Monday, May 19.
Actually, after some research, following your disavow article on working with massive amounts of links in spreadsheets, and trying out the Detox tool, I found there are certainly links I would want to disavow, but the problem is not as significant as I thought. I really think what you are saying is right: Google simply stopped counting some links, and that's why rankings dropped.
At least the action plan is clear: first focus on fixing everything possible on-site before touching the Disavow tool. I really, really appreciate your helpful reply and your taking the time to give such insightful feedback to a simple guy like me.
Thank you again, Marie!
-
First off, thank you, Peter, for recommending my articles.
It looks like there are a number of issues with your site. I wouldn't focus so much on just one algorithm, but rather, spend time looking at the entire site and its issues.
What phrases are you trying to rank for? Your title tag on your home page says, "The Community of Web Design Professionals" which I am guessing is not your money keyword. Keywords in the title tag (without stuffing), especially at the front of the tag are very important.
Next, I would get rid of the huge full page popup. It could possibly cause the page layout algo to affect your rankings. It can also cause people to quickly bounce away and look for another site which can be a signal to Google that users don't enjoy your site.
The Google PageSpeed Insights tool gives you a very low score. Improving page speed could make a significant difference.
You've got 1,490 pages in Google's index. Do you really have 1,490 high-quality pages? Is it possible that there is some content indexed that doesn't need to be? For example, when I take text from this article: http://www.1stwebdesigner.com/find-qualified-designer/ and search for it in quotes on Google, I see 27 results. The only pages from your site that you want in Google's index are high-quality, unique, and compelling articles and resources. With that said, if you wrote this article and other people copied it, I wouldn't be worried. But if you purposely syndicated it, then it could be an issue.
It's hard to say whether links are the issue. It doesn't seem like any of your drops coincided with a Penguin date, so I am guessing that Penguin is not suppressing you (I can't say for sure without having a deep look). But it is possible that you were previously getting some benefit from self-made links that Google can now recognize as self-made rather than truly earned, and it may be discounting those links. You mentioned that you had created a lot of links on your own. As Google gets better and better at telling earned links from self-made ones, it may be that you've lost significant PageRank that used to come through those links.
With that said, have you checked Google Search Console --> Search Traffic --> Manual Actions to make sure there is no manual action there? I see a lot of blogroll links, and sometimes, if there has been excessive link exchanging going on, a site can get a manual action.
Now, back to the algo stuff. Can you tell via Google Analytics whether the big drop in May happened on May 20, 2014? If not, it's not Panda. If it was exactly on that date, though, then I'd spend more time looking at on-site issues than links.
Hope that helps!
Marie
-
@Peter - I read everything and was in the process of doing the audit, but Google Sheets keeps crashing. The site has around 1 million backlinks, and the formulas keep crashing even when working with just the 100,000 links from Search Console alone. Any tips?
-
Thanks, Peter. On my way then; excellent help, really appreciate it. I'll let you know how it goes, if you're interested.
-
So you need to get all links from:
- Search Console
- Ahrefs
- Moz OSE
- Majestic - you need to verify your domain, and then you can download the fresh list for free. For historical data you need a paid account.
- WebMeUp - free with some limitations. A paid version is also available.
Once you have everything, you need to compose a mega sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
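If Sheets keeps crashing at this volume, composing the mega sheet with a small script instead can help. A sketch of merging and de-duplicating the exports; the CSV file names and column headings below are assumptions, since every tool's export layout differs:

```python
# Sketch: merge backlink exports from several tools into one de-duplicated
# list, keyed by linking URL, and collapse it to referring domains.
# File names and "source URL" column names are assumptions; adjust to the
# real exports from Search Console, Ahrefs, Moz, Majestic, WebMeUp, etc.
import csv
from urllib.parse import urlparse

EXPORTS = {
    "search_console.csv": "Linking page",
    "ahrefs.csv": "Referring Page URL",
    "moz_ose.csv": "URL",
}

def merge_link_exports(exports):
    """Return {link_url: set of export files that reported it}."""
    merged = {}
    for path, column in exports.items():
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                url = row[column].strip()
                merged.setdefault(url, set()).add(path)
    return merged

def referring_domains(merged):
    """Collapse link URLs down to a sorted list of unique referring domains."""
    return sorted({urlparse(u).netloc for u in merged})
```

Working on the de-duplicated domain list (usually tens of thousands of rows, not a million) keeps the audit spreadsheet small enough to open anywhere.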
But you also need to do a "Panda audit". This is a little bit weird, because you need to read almost everything from many authoritative people. Of all of them, I can recommend Glenn Gabe and Josh Bachynski. The first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html. Of course, they have also written for other sites like Moz, SEL, SEJ, etc. And of course this doesn't mean there aren't other authors, like:
https://moz.com/blog/have-we-been-wrong-about-panda-all-along
So let's go back to the Panguin tool. You need a very in-depth inspection in Analytics, comparing your dates with the algorithm release dates:
https://moz.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO there is also a huge chance that you were hit by both algos.
-
Thank you so much, Peter, for your helpful reply. The PSD2HTML case is new to me; I have already read about WPMU's crazy moves and the disavow guide, and started working on it yesterday. It's just that a friend of mine who works at Ahrefs is still convinced that I shouldn't go down the disavow path.
Alright, working on the audit then; thank you for the links!
What other tools would you recommend, if you say Ahrefs isn't enough?
-
-
First, there is definitely some kind of "algo filter" on your website.
Second, you need to audit the website, because there can be crossing updates:
https://moz.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why you can't be sure which filter is applied to your site from an interactive chart alone. Panda and Penguin now filter much more quickly, and a refresh is needed to escape the filter.
About disavowing: I strongly recommend reading Marie Haynes' articles:
https://moz.com/blog/guide-to-googles-disavow-tool
https://moz.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles on this)
https://moz.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is the continuation of the previous article.
https://moz.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But you can't get by with Ahrefs alone; you need other tools too.
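If you do end up disavowing, the file Google expects is plain text: "#" comment lines, one "domain:" line per domain (individual URLs can also be listed as bare lines). A small sketch that builds one for review; the domains here are invented examples:

```python
# Sketch: build a disavow file in the plain-text format Google's Disavow Tool
# accepts: "#" comments plus one "domain:" line per domain. Domains invented.
bad_domains = {"spammy-directory.example", "link-farm.example"}

def build_disavow(domains, note="Generated for manual review before upload"):
    """Return the text of a disavow file covering the given domains."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    return "\n".join(lines)

disavow_txt = build_disavow(bad_domains)
```

Always review the generated file by hand before uploading; disavowing good domains can hurt more than the bad links do.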