Penguin Rescue! A lead has been hit and I need to save them!
-
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics show a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done? First of all, I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones? Also, do you think G is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things. Any advice is appreciated. Thanks
-
This sounds like a plan. Give it a shot and test the results
-
Does anyone care to share their view on my last?
-
Thanks for all the responses, guys. I have taken them on board. One thing I have noticed: I have run backlink checks and they have site-wide footer links from two of their other businesses. This has created thousands of backlinks with the exact same anchor text. Do you think this could cause a problem?
I'm thinking of reducing it to just two links each from the two sites.
Other than that, the backlink make-up looks pretty normal except for the repeated anchor texts.
Thanks
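This isn't from the thread, but to put a number on the site-wide-footer pattern described above, here is a minimal Python sketch. It assumes a backlink export loaded as dicts with `source_url` and `anchor_text` keys (hypothetical column names; a real export CSV would go through `csv.DictReader`):

```python
from collections import defaultdict
from urllib.parse import urlparse

def find_sitewide_links(rows, min_pages=50):
    """Count backlinks per (source domain, anchor text) pair and flag
    pairs that appear on many pages - the sitewide-footer signature."""
    counts = defaultdict(int)
    for row in rows:
        domain = urlparse(row["source_url"]).netloc
        counts[(domain, row["anchor_text"].strip().lower())] += 1
    return {pair: n for pair, n in counts.items() if n >= min_pages}

# Toy data standing in for a real export: one domain linking from
# 300 pages with the exact same anchor, plus one ordinary link.
rows = [{"source_url": f"http://other-biz.example/page{i}",
         "anchor_text": "cheap widgets"} for i in range(300)]
rows.append({"source_url": "http://blog.example/post", "anchor_text": "Acme"})

flagged = find_sitewide_links(rows)
# flagged == {("other-biz.example", "cheap widgets"): 300}
```

Pairs like that are exactly the kind of links worth pruning down to a couple per referring domain.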
-
I second the time-frame issue. One month won't be enough time, and your work will just benefit the next person this client gets to work on the site, while you'll be left with an upset client because of mismanaged expectations.
-
"you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again."
I agree with this 100%.
These types of problems can be fixed, but then you must wait until Google re-evaluates the site and republishes the results back into the SERPs. Sites hit with these types of problems escape in batches - not at the moment things are fixed.
So, you could do great work, get it fixed on day 25, and then Google might not reprocess and republish for 60 more days - and some other SEO gets credit for your hard work.
"I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo."
Exactly... What are good links? Your "added" links will not be natural.
-
Well, from what everyone is writing about Penguin, it's an algorithmic update. Meaning you need to fix whatever issues are there, wait for the algorithm to process again, and then, if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done: you don't know exactly what the issues are, and we don't know when the algo will process again.
I think the timeline you have set is highly unrealistic, and you should aim to set expectations with the client that this process can very well take much longer. If this previous SEO company built problematic links, I think you'll have to deal with them. I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo. I think you're going to have to go through the tedious work of cleaning things up. The good news is that a bunch of people have written about what to look for. Check in Webmaster Tools for sitewide links, and check your anchor text pointing into the site. Export your external links from OSE and then upload them to Link Detective - http://linkdetective.com/ - let it do the hard work for you and classify a lot of the links. Then you need to go through the process of trying to clean things up, doing as much as you can, and then submit a reinclusion request (may help, may not), hoping Google will discard the other links.
Good luck - really try to demonstrate to your client the complexity of the process and extend the timeframe of the project - that's my ultimate recommendation
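To make the "check your anchor text" step concrete, here's a minimal sketch (not from the thread; the input is a hypothetical list of anchor strings, e.g. pulled from an OSE export) that computes each anchor's share of the link profile, so heavily repeated money anchors stand out:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total link profile,
    most common first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return [(anchor, n / total) for anchor, n in counts.most_common()]

# Toy profile: a natural profile is usually dominated by brand and
# URL anchors, not a single money keyword like this one.
anchors = (["Personal Injury Lawyer"] * 70
           + ["example.com"] * 20
           + ["click here"] * 10)
for anchor, share in anchor_distribution(anchors):
    print(f"{anchor}: {share:.0%}")
# prints:
# personal injury lawyer: 70%
# example.com: 20%
# click here: 10%
```

A money anchor at 70% of the profile is the kind of skew worth cleaning up before any reinclusion request.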
Related Questions
-
Still Need to Write Title & Description Tag?
My SEO has advised me that Google has stopped using title and description tags for search results and, as such, it is no longer necessary to write specific title and description tags. I see that Yoast seems to pull text to create these tags; sometimes it looks like it reflects the best elements of the content, sometimes it does not. Should I be asking our SEO team to write dedicated title and description tags, or is it best practice to leave it to the Yoast plugin? My SEO is of the opinion that writing these tags is not a productive use of time, as Google will serve results based on the user inquiry rather than the content of his tags. It sounds logical, but it would be reassuring to receive further confirmation of this. Thoughts?
Intermediate & Advanced SEO | Kingalan1
Thanks, Alan
-
Do href lang tags need to be implemented at blogpost level?
Hey guys, Our site targets multiple territories. We use subfolders and hreflang tags on the site (built in WordPress) at a page level. We've added our hreflang tags manually in the <head> section of each page. We're just re-doing the blog and we want to know if we need to add these tags to each individual blog post and, if we do, how we would do it? Our developers have put them in at blog landing page level and told us that this will be fine. E.g.: /de/blog/ /gb/blog/ /uc/blog/ They have a slight tendency to push back on things, though, and we just want to be sure we're doing this right. Hreflang tags are sooooo complicated, so hoping you fine people can shed some light on the issue. Cheers!
Intermediate & Advanced SEO | Twetman
-
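For illustration of what per-post tags would look like (the domain and post slugs below are hypothetical; only the /de/ and /gb/ subfolders come from the question above), each translated blog post would carry its own set of hreflang tags in its <head>, self-reference included:

```html
<link rel="alternate" hreflang="de" href="https://example.com/de/blog/mein-beitrag/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/gb/blog/my-post/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/blog/my-post/" />
```

Tags on the blog landing page alone only describe the landing page; hreflang annotations apply per URL, so each post needs its own set pointing at its own translations.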
Do I need to do a 301, as well as adding re-write rules on Apache
I'm sure this has probably been asked somewhere before... We're implementing a URL re-write rule to convert non-www pages to the www. subdomain and also removing all trailing slashes as part of a basic canonicalisation exercise. The question is: as well as doing the URL rewrites within htaccess, should I also 301 those duplicate pages, or does the URL rewrite do the job on its own? Thanks mozzers.
Intermediate & Advanced SEO | Ultramod
-
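A rough .htaccess sketch of the two rules described above (example.com is a placeholder, not the asker's actual domain or rules):

```apache
RewriteEngine On

# non-www -> www
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# strip trailing slashes (skipping real directories)
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

The `R=301` flag makes the rewrite itself respond with a 301 redirect, so no separate redirect step is needed for the same URLs; without `R`, mod_rewrite would serve the content internally and leave the duplicate URLs resolving.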
Local SEO - Do I need it if I don't do business locally?
Super confused about this. Our office is located in Los Angeles, but it is not a storefront, and our clients are from all over the country... and our business involves travel to other countries. So there is nothing "local" about us. But everything I read seems to say we should be doing local SEO. How to approach this?
Intermediate & Advanced SEO | benenjerry
-
Need help understanding "Clone sites"
I just read an article about Panda and it warned against clone sites: "Clone sites are a strong panda factor (JM, Mar 10, 2014)". I don't have any clone sites, but there are dozens of sites with imitations of mine. We were the first in the area of interest, and then all these other sites that imitated us popped up. None are exact replicas, but many have spun some of our articles and used them to create their sites; the site structures are not identical, though. Google seems to know we are the original site on the topic, since we are ranked #1 for most terms. Would these be considered clone sites in their eyes?
Intermediate & Advanced SEO | bizzer
-
Hit By Penguin... Wait for recovery or do I change domains?
Hey guys, I would very much appreciate all opinions on our following situation. We have a .co.uk-based ecommerce sports nutrition site, www.cardiffsportsnutrition.co.uk. Previously we worked with an SEO that, to put it simply, did not follow webmaster guidelines (money-anchor heavy, bad links, etc.). We reached some very good ranks too quickly and subsequently, after the first Penguin, we were hit. We didn't receive any link warning or manual penalties, just what I am assuming is algorithmic... Rankings and traffic dropped significantly, but not business-ending.
Since the first Penguin we have done very little to no SEO: some unique content, re-writing of product pages, lots of social activity. We didn't really lose much traffic after that; some small ups and downs after refreshes and a slight slow decline on some keywords. Come Penguin 2.0... things we were still ranking for have now dropped even further, impressions in Webmaster Tools are now down over 50%, and we have had a weekly but not drastic drop in traffic since then. Over the last couple of months we have obtained some good-quality links, added lots of great unique content that has been shared significantly and generated some great traffic to our blog, and added more unique product and category pages. But organically things are starting to look pretty grim apart from our brand keywords; everything is still in a slow decline, with no increase in impressions in Webmaster Tools either, just small drops. We have been working to remove the poor-quality and toxic links that the previous SEO built, getting anchor text corrected and collating information on the whole process, ready to submit a file of links to the disavow tool, which we are planning to do within the next couple of weeks.
Now, I have read some successful stories and some not-so-successful ones, so I'm starting to think about how to deal with the worst-case scenario: that our domain is too damaged by the previous SEO guys. We have the same domain name on the .com, which will help us carry over our brand name directly. But my concern is, even though we have not had any manual penalty and have not 301'd the .com back to the .co.uk or added any other form of link, will the penalties be carried over to the new domain just on the basis of brand association? We wouldn't plan to redirect any of the .co.uk traffic to the .com, but rather focus on our already strong, albeit less-converting, traffic from the likes of Twitter and Facebook and run a small PPC campaign for some brand keywords to help buffer the traffic loss, while we focus on building good-quality links and putting up plenty of new quality content on the new domain, which does not have any poor-quality links pointing to it.
What I'm trying to avoid is carrying on spending time, money and effort on the .co.uk domain for the next 3-4 months, continuing to lose traffic slowly, and then having to switch domains anyway. I plan to wait and see for the next 4-6 weeks after we run the disavow, but October would be the time I would have to make a decision and go for it. Any advice or opinions would be appreciated. Marc
Intermediate & Advanced SEO | CSN
-
Confusing 301 / Canonical Redirect Issue - Wizard Needed
I had two pages on my site with identical content. What I did was 301 redirect one page to the other. I also added canonical redirect code to the page that held the 301 code. Here is what I have: www.careersinmusic.com/music-colleges.aspx - this page was a duplicate and I needed it to resolve to: www.careersinmusic.com/music-schools.aspx
Here is the code I used:
music-colleges.aspx
<%@ Page Language="VB" AutoEventWireup="false" CodeFile="music-colleges.aspx.vb" Inherits="music_colleges" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<link rel="canonical" href="http://www.careersinmusic.com/music-schools.aspx"/>
music-colleges.aspx.vb
Partial Class music_colleges
    Inherits System.Web.UI.Page
    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        Response.Status = "301 Moved Permanently"
        Response.AddHeader("Location", "http://www.careersinmusic.com/music-schools.aspx")
    End Sub
End Class
The problem:
For some reason, when the search "music colleges" is done in Google, I am #7. When the term "music schools" is done, I am around 119. I MUST be getting a penalty for some reason; I just cannot figure out the reason. Why perform well for one term and terribly for the next? All I can come up with is a duplicate-content penalty or something along those lines. Also, music-colleges.aspx seems to still be in Google's index, even though the above 301 happened months ago. Thoughts? site:www.careersinmusic.com/music-colleges.aspx Any insight into this would be GREATLY appreciated. Many Thanks!
Intermediate & Advanced SEO | 4Buck
-
Need advice on local search optimization
Hi all, I've found myself in a puzzling position and am not quite sure which direction to push my current SEO project, so if anyone who's done this particular type of SEO can offer some suggestions I'd be eternally grateful. I am currently working on a project for a law firm based in New Jersey. Let's say the town they are in is Garfield. What I really want to achieve is to see them appearing in the number-one spot whenever anyone within Garfield or the immediate area searches for a lawyer relating to the individual's need, e.g. searches like "personal injury lawyers" or "real estate lawyer". The problem is, I can see how to easily make the number-one position if people are specific and include Garfield in the search term, but in reality they wouldn't be doing that. An additional problem is that people's ISPs in Garfield aren't located in Garfield; in some cases they're as far away as Newark, so when they do a search for "real estate lawyer", Google brings up results for the Newark-based firms. Using tools like Market Samurai to look at the traffic and competition is proving useless, as searches like the ones I'm doing for local businesses are so closely tied to the ISP location. I don't really know whether to target broad searches like "real estate lawyer", or to be really specific and include the town name in my page titles, H1 tags, etc. I hope I put across my dilemma and someone can help me choose which direction to go in. Thanks
Intermediate & Advanced SEO | davebrown1975