Penguin Rescue! A lead has been hit and I need to save them!
-
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me but bad for the client. Their leads have dropped from 10 per day to 1 or 2, their analytics show a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have one month to turn it around for them.
How do you wise people think it can be done? First of all I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones?
Also, do you think Google is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Those are two very different things. Any advice is appreciated. Thanks
-
This sounds like a plan. Give it a shot and test the results.
-
Does anyone care to share their view on my last post?
I have run backlink checks and found sitewide footer links from two of their other businesses. This has created thousands of backlinks with exactly the same anchor text. Do you think this could cause a problem?
I'm thinking of reducing it to just two links from each of the two sites.
Other than that, the backlink profile looks pretty normal except for the repeated anchor text.
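For what it's worth, this kind of check is easy to automate. Below is a minimal Python sketch, assuming a backlink export as a CSV with hypothetical `source_url` and `anchor_text` columns (e.g. from OSE); the column names are assumptions, not a real export schema. It counts repeated anchor text and flags referring domains that link from an unusually high number of pages, which is the signature of a sitewide footer link:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def anchor_distribution(rows):
    """Count how often each anchor text appears across all backlinks."""
    return Counter(row["anchor_text"].strip().lower() for row in rows)

def sitewide_domains(rows, threshold=50):
    """Flag referring domains linking from many distinct pages --
    thousands of links from one domain usually means a sitewide
    (footer/sidebar) placement rather than editorial links."""
    pages_per_domain = Counter(urlparse(row["source_url"]).netloc for row in rows)
    return {dom: n for dom, n in pages_per_domain.items() if n >= threshold}

def load_backlinks(path):
    """Read the exported backlink CSV (column names are assumptions)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))
```

If one anchor text accounts for the bulk of the profile and comes from only two domains, that matches the footer-link pattern described above, and trimming it to a couple of links is a reasonable first step.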
-
Thanks for all the responses, guys. I have taken them on board.
-
I second the time frame issue. One month won't be enough time, and your work will just benefit the next person this client gets to work on the site, while you'll be left with an upset client because of mismanaged expectations.
-
"you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again."
I agree with this 100%.
These types of problems can be fixed, but then you must wait until Google re-evaluates the site and republishes it back into the SERPs. Sites hit with these types of problems escape in batches, not the moment things are fixed.
So, you could do great work, get everything fixed on day 25, and then Google might not reprocess and republish for 60 more days, and some other SEO gets credit for your hard work.
I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo.
Exactly... What are good links? Your "added" links will not be natural.
-
Well, from what everyone is writing about Penguin, it's an algorithmic update. Meaning you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again.
I think the timeline you have set is highly unrealistic, and you should aim to set expectations with the client that this process can very well take much longer. If this previous SEO company built problematic links, I think you'll have to deal with them. I don't think pointing good links at the site will get rid of the issue with the problematic links and clear you of the algorithm. I think you're going to have to go through the tedious work of cleaning things up. The good news is that a bunch of people have written about what to look for. Check Webmaster Tools for sitewide links, and check the anchor text pointing into the site. Export your external links from Open Site Explorer and then upload them to Link Detective (http://linkdetective.com/) - let it do the hard work for you and classify a lot of the links. Then you need to go through the process of trying to clean things up, doing as much as you can, and then submit a reconsideration request (it may help, it may not), hoping Google will discard the other links.
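If removal outreach stalls, Google's disavow tool (where available) accepts a plain-text file: `#` starts a comment, `domain:` lines disavow whole domains, and bare URLs disavow individual pages. A minimal sketch for assembling one from whatever the cleanup pass flags - the input lists here are hypothetical:

```python
def build_disavow_file(bad_domains, bad_urls):
    """Assemble disavow-file contents in the format Google's disavow
    tool accepts: '#' starts a comment, 'domain:example.com' disavows a
    whole domain, and a bare URL disavows a single page."""
    lines = ["# Low-quality links identified during cleanup"]
    lines += sorted({f"domain:{d}" for d in bad_domains})
    lines += sorted(set(bad_urls))
    return "\n".join(lines) + "\n"
```

Disavowing is a last resort after genuine removal attempts - document the outreach, because a reconsideration request is stronger when you can show the work.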
Good luck - really try to demonstrate to your client the complexity of the process and extend the timeframe of the project. That's my ultimate recommendation.
-
Related Questions
-
Do I need to write meta tags and descriptions in the actual language for different language versions? Flagged for duplicate content across languages
Hi, I am fairly new to SEO and this community, so pardon my questions. We recently launched a Mandarin-language version of the entire site on our Drupal site, and when I crawl the site I get duplicate content flags for the pages that are in Mandarin. Is this a problem, or can I ignore it? Should I make different page titles for the different languages? Also, for the meta tags and descriptions, would it be better to write them in the native language for Google? Thanks in advance.
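One hedged suggestion for this situation: the usual approach is to keep each language's titles and descriptions in that language and declare the alternates with rel="alternate" hreflang annotations, so the translations aren't treated as duplicates. A minimal sketch for generating the tags - the URLs and structure below are made up for illustration:

```python
def hreflang_tags(variants):
    """Render rel="alternate" hreflang link tags for the <head> of every
    language version of a page. `variants` maps a language code (e.g.
    'en', 'zh') to the URL of that language version."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    )
```

Each language version of the page should carry the full set of tags, including one pointing to itself.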
Intermediate & Advanced SEO | lynetteboss
Are the clicks reported in GSC unique? Thanks for the answers
I need to know whether clicks in GSC are counted as unique or not. Thanks
Intermediate & Advanced SEO | Binary_SEO
[Need advice!] A particular question about a subdomain to subfolder switch
Hello Moz Community! I was really hoping to get your help on an issue that has been bothering me for a while now. I know a lot has been written about this topic, but I couldn't find a good answer for my particular question. We are running several web applications that are similar but also different from each other. Right now, each one has its own subdomain (which was mainly due to technical reasons), like this: webapp1.rootdomain.com, webapp2.rootdomain.com, etc. Our root domain currently points to webapp1.rootdomain.com with a 301. Now, we are thinking about making two changes: changing to a subfolder structure like this: rootdomain.com/webapp1, rootdomain.com/webapp2, etc.; and changing our root domain to a landing page (listing all the apps) and taking out the 301 to webapp1. We want to make these changes mainly for SEO reasons. I know the advantages are not clear-cut between subdomains and subfolders, but we think it could be the right way to push the root domain and profit more from juice passing to the different apps. The problem is that we had a bad experience when we first switched our first web app from rootdomain.com to a subdomain (webapp1.rootdomain.com) to put it on equal footing with the other apps. Our traffic dropped a lot, and it took us six weeks to get back to the same level as before. Maybe it was the 301 not passing all the juice, or maybe it was the switch to the subdomain; we are not sure. So, I guess my question is: do you think it is the right thing for web apps to go with subfolders to pass more juice from root to subfolders? Will it bring huge drops in traffic again once we make the change? Is it worth taking that risk or initial drop because it will pay off in the future? Thanks a lot in advance! Your answers would help me a lot.
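One way to reduce the migration risk is to make the URL mapping mechanical. Here is a hedged sketch (the app names and root domain are placeholders from the question, not real values): a pure function from old subdomain URLs to new subfolder URLs. The server would then answer each old URL with a single 301 to the mapped value, which avoids the redirect chains that often cause migration traffic drops:

```python
from urllib.parse import urlparse, urlunparse

ROOT = "rootdomain.com"                 # placeholder root domain
APPS = {"webapp1", "webapp2"}           # hypothetical app subdomains

def subfolder_url(url):
    """Map webapp1.rootdomain.com/path to rootdomain.com/webapp1/path.
    URLs that are not app subdomains are returned unchanged."""
    parts = urlparse(url)
    sub, _, rest = parts.netloc.partition(".")
    if rest == ROOT and sub in APPS:
        return urlunparse(parts._replace(netloc=ROOT, path=f"/{sub}{parts.path}"))
    return url
```

Before launch, a crawl of the old URLs asserting each returns exactly one 301 to `subfolder_url(old_url)` catches broken mappings and chains before Google does.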
Intermediate & Advanced SEO | ummaterial
Website/SEO Audit Needed
We've been outsourcing our link building to India for the past three years, and the results were pretty good up until the beginning of this year. What they were essentially doing was putting links into directories, a few per month, and posting a few articles per month. Out of our top 10 keywords, 8 got into the top 10. Then something happened around Jan 1 last year: our rankings started dropping, falling out of the top 50 before settling around 20-30. We have disavowed most of the low-quality links since then. Also, very odd: all the top-ranking competitors fell (including me) and were replaced by less "specialized" companies that sold a broad range of products (for example: all parts of the car, rather than someone who just focused on mufflers). There are also other differences, but again I can't put a finger on them. I'd like to find someone who can do a detailed audit of our site and our competitors: what happened to cause the drop, and why the new top-position sites rank highly. And I really don't have time to do an audit myself. Our site is American Hospitality Furniture dot com. Any feedback would be appreciated. Thanks in advance.
Intermediate & Advanced SEO | AHH888
Penguin 2.0 Recovery - Penguin Update Rerun yet or not
I was hit by the Penguin 2.0 update some five months back. I believe an algorithmic penalty has been applied to my sites. While the cleanup work etc. has been done, there is certainly no recovery. I also notice a lack of recovery stories. In fact, I think anyone affected cannot recover until a recalculation happens. Does anyone think that a recalculation of the Penguin 2.0 penalties has happened? If so, why do you think that?
Intermediate & Advanced SEO | Jurnii
Need to know best practices of Search Engine Optimization 2013
I want to know the best practices for search engine optimization in 2013, and I also need the best possible sources. Thanks
Intermediate & Advanced SEO | GM007
I need help with a local tax lawyer website that just doesn't get traffic
We've been doing a little bit of linkbuilding and content development for this site on and off for the last year or so: http://www.olsonirstaxattorney.com/ We're trying to rank her for "Denver tax attorney," but in all honesty we just don't have the budget to hit the first page for that term, so it doesn't surprise me that we're invisible. However, my problem is that the site gets almost NO traffic. There are days when Google doesn't send more than 2-3 visitors (yikes). Every site in our portfolio gets at least a few hundred visits a month, so I'm thinking that I'm missing something really obvious on this site. I would expect that we'd get some type of traffic considering the amount of content the site has, (about 100 pages of unique content, give or take) and some of the basic linkbuilding work we've done (we just got an infographic published to a few decent quality sites, including a nice placement on the lawyer.com blog). However, we're still getting almost no organic traffic from Google or Bing. Any ideas as to why? GWMT doesn't show a penalty, doesn't identify any site health issues, etc. Other notes: Unbeknownst to me, the client had cut and pasted IRS newsletters as blog posts. I found out about all this duplicate content last November, and we added "noindex" tags to all of those duplicated pages. The site has never been carefully maintained by the client. She's very busy, so adding content has never been a priority, and we don't have a lot of budget to justify blogging on a regular basis AND doing some of the linkbuilding work we've done (guest posts and infographic).
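Since the fix for the duplicated IRS newsletters was a noindex tag, it's worth verifying the tag actually made it onto every affected page. A small hedged sketch using only the Python standard library (the markup it parses is illustrative):

```python
from html.parser import HTMLParser

class _NoindexCheck(HTMLParser):
    """Look for a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta"
                and (a.get("name") or "").lower() == "robots"
                and "noindex" in (a.get("content") or "").lower()):
            self.noindex = True

def has_noindex(html):
    """Return True if the page markup contains a robots noindex meta tag."""
    checker = _NoindexCheck()
    checker.feed(html)
    return checker.noindex
```

Feed it the fetched HTML of each duplicated URL; any page missing the tag is still competing in the index with the original IRS content.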
Intermediate & Advanced SEO | JasonLancaster
Need advice on local search optimization
Hi all, I've found myself in a puzzling position and am not quite sure which direction to push my current SEO project, so if anyone who's done this particular type of SEO can offer some suggestions I'd be eternally grateful. I am currently working on a project for a law firm based in New Jersey. Let's say the town they are in is Garfield. What I really want to achieve is to see them appearing in the number one spot whenever anyone within Garfield or the immediate area searches for a lawyer relating to the individual's need, e.g. searches like "personal injury lawyers" or "real estate lawyer". The problem is that I can see how to easily reach the number one position if people are specific and include Garfield in the search term, but in reality they wouldn't be doing that. An additional problem is that people's ISPs in Garfield aren't located in Garfield; in some cases they're as far away as Newark, so when someone searches for "real estate lawyer", Google brings up results for the Newark-based firms. Using tools like Market Samurai to look at traffic and competition is proving useless, as searches like the ones I'm doing for local businesses are so closely tied to the ISP location. I don't really know whether to target broad searches like "real estate lawyer", or to be really specific and include the town name in my page titles, H1 tags, etc. I hope I've put across my dilemma and someone can help me choose which direction to go in. Thanks
Intermediate & Advanced SEO | davebrown1975