Strategy for recovering from Penguin
-
I have a website that has been hit hard by the Penguin update. I believe the main cause of our problem has been links from low-quality blogs and article sites with overly optimized keyword anchor text. Some questions I have are:
-
I have noticed that we still have good rankings on long-tail search terms for pages that did not have unnatural links. This leads me to believe that the penalty is URL-specific, i.e. only URLs with unnatural linking patterns have been penalized. Is that correct?
-
Are URLs that have been penalized permanently tainted to the point that it is not worth adding content to them and continuing to get quality links to them?
-
Should new content go on new pages that have no history, and thus no penalty, or is the age of a previously highly ranked page still of great benefit in ranking?
-
Is it likely that the penalty will go away over time if there are no more unnatural links coming in?
-
Would non-optimized links from not-so-great sites be of any help, or do these need to be quality links?
-
It depends on what type of sites those "not so great sites" are. If they are viewed as spammy, no, they won't help.
-
I would start by adding links that are not optimized with keywords to those URLs that you feel are penalized, to improve the ratio of natural to unnatural links. I wouldn't drop the URLs, because you probably have some good links out there pointing to those pages. It'll take some work, but from what I've been reading, you can recover.
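If you want to put an actual number on that natural-vs-unnatural ratio, a quick script over a backlink export can do it. Here's a minimal sketch in Python, assuming a CSV export with an anchor_text column and a hand-picked set of over-optimized money keywords (both the column name and the keywords are placeholders; adjust them to whatever your link tool exports):

```python
import csv
from collections import Counter

# Hypothetical money-keyword list and CSV column name -- adjust both
# to match what your backlink tool actually exports.
MONEY_KEYWORDS = {"example keyword", "example keyword variant"}

def anchor_ratio(csv_path: str) -> None:
    """Bucket each backlink as 'optimized' (exact-match money keyword)
    or 'natural', then print the counts and percentages."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor_text"].strip().lower()
            counts["optimized" if anchor in MONEY_KEYWORDS else "natural"] += 1
    total = sum(counts.values()) or 1
    for bucket, n in counts.most_common():
        print(f"{bucket}: {n} links ({n / total:.0%})")

anchor_ratio("backlinks.csv")
```

Run it before and after the cleanup work so you can see whether the ratio is actually moving in the right direction.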
-
Related Questions
-
Recovered from Penalty But.....
Hello, we recovered from Google's manual penalty in January 2014. Afterwards, we changed our site design, fixed our content, disavowed bad links, increased our social presence, tried to engage customers in blog content, etc. But we still couldn't get our domain name back on the SERPs. (It won't show our site even on the first page if I search "best vpn service" on Google.com.) What should we do to bring our domain name back onto the SERP? What about the sandbox? What do we need to do to get our domain name out of the sandbox? Any comments or thoughts will be really appreciated 🙂
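For anyone tackling the same cleanup, the disavow file Google's tool accepts is just a plain UTF-8 text file: one entry per line, a domain: prefix to cover a whole domain, and # for comments. A sketch with placeholder entries only:

```text
# Spammy article directories pointing at the penalized pages
# (all entries below are placeholders, not real sites)
domain:spammy-article-directory.example
domain:low-quality-blog-network.example
# A single offending page rather than the whole domain
http://www.example.com/blog/paid-link-post.html
```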
-
Do you think this site has been hit by penguin?
Hi guys, I need some opinions on a website I am working on: www.colourbnners.co.uk. They updated their website in August, but the company they used did not take the URL structure into account, and hence there was a massive loss of links around August. They also dropped off Google for all their key terms except their brand name, 'colour banners'. Since then, they have implemented a 301 redirect. Some key points:
- They have not received any manual warnings in WMT
- I have disavowed some poor-quality links that they built over the years
- I am building high-quality links quite selectively/slowly
- There were a lot of duplicate content issues; these have been resolved now
So my question to you SEO pros is: do you think it's Penguin, or something that I am missing? If it is Penguin, what is the best form of attack to get it removed? Regards, gezzagregz
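If the 301s went in piecemeal, it's worth confirming they cover the entire old URL structure, since any old URLs still returning 404s are leaking the link equity the site lost in August. A sketch of what that mapping can look like in Apache's .htaccess, with hypothetical old and new paths standing in for the real ones:

```apache
# .htaccess on the new site -- the paths here are placeholders; map the
# real ones from a crawl of the old structure or the 404s shown in WMT.
RewriteEngine On

# One-off rename of a single page
Redirect 301 /old-banners-page.html /vinyl-banners

# A whole section moved under a new folder
RewriteRule ^old-shop/(.*)$ /products/$1 [R=301,L]
```
-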
Beating big brands for rankings on Google page 1 post Panda & Penguin
Hi all, so having followed lots of SEOmoz guidelines that we have read here, plus standard SEO ideas, we seem to no longer be able to rank for our core keywords, and certainly not rank in front of the big brands. We're a small eCommerce company and historically ranked in Google positions 1-4 for many of our keywords (a year or two ago), but now we are nowhere near this any more. Here is what we do:
- We always write unique content for our products, usually around 300-400 words per product
- We include our keywords in the title, meta description and H1 tags
- We include buyers' guides and set-up articles on the site, and generally have a reasonable amount of good-quality, always uniquely written content
- Recently we have concentrated on making page load speed above average, and Google Webmaster Tools page speed gives us around 80-90 out of 100
- We carry out link building and always have; in the most recent past this has been weighted towards 'content for links' to gain purely incoming links (although in the early days, from 2005, we did swap links with other webmasters as well as write and publish on article sites etc.)
- Product category pages have an intro piece of text that includes the key phrases for that page, placed as close to the body tag as possible
From what I understand, if you are hit by Panda or Penguin the drop-off is invariably overnight, but we have not seen this; it has been more of a gradual decline over the last year or two (although there was a bit of a downward blip on Panda update 20). Now we're lucky to be on page 2 for what were our main keywords/phrases, such as "portable DVD players" or "portable DVD player". In front of us in every position is a big national brand, and certainly on page 1 every position is purely a big brand. They don't have great info for these keywords from what we can see, and certainly don't give as much info as we do. For the phrase "portable DVD player", our portable DVD accessories page ranks better than our actual portable DVD player category page, which we also can't understand. This is our portable DVD category page: http://www.3wisemonkeys.co.uk/portable-dvd-players-car Currently we're starting to produce 2-minute product demo videos for as many of our product detail pages as we can, and we plan to host these on something such as Vimeo so that the content will be unique to our site (rather than YouTube), in order to give us a different format of unique content on many of our product detail pages and improve rankings (and conversion rates at the same time, ideally). So I am hoping that someone out there can point us in the right direction and shed some light on our declining positions. Are we doing, or have we done, something wrong? Or is it, in these post-Panda/Penguin days, extremely difficult for a small business to beat the big brands, as Google believes these are what everyone wants to see when shopping? Thanks for any comments and/or help.
-
Penguin update: Penalty caused by onsite issues or link profile?
Back in April, before the Penguin update, our website home page ranked in the #1 position for several of our keywords and on page 1 for dozens of other keywords. But immediately after the Penguin update in April, our rankings dropped to below #100 for nearly all keywords. The sharp drop was obviously a penalty of some kind. We worked on removing some bad backlinks that were questionable. Over the past 7 months many of the bad links have dropped off and our link profile is improving. Our rankings, however, have not improved at all. In Yahoo and Bing we remain strong and rank on page 1 for many of our keywords. I joined SEOmoz because I've heard about their great tools and resources for SEO. The first thing I learned is that I had a lot of errors and warnings that need to be addressed, and I'm optimistic that these items, once addressed, will get us out of that dreadful penalty box we've been in for 7 months now. So with that quick summary of our SEO problems, I have a few questions that I hope to get some direction on:
1. Crawl Diagnostics for my site in SEOmoz reports 7 errors and 19 warnings, including missing meta description tags, temporary redirects, duplicate page content, duplicate page titles, 4xx client errors, and title elements that are too long. Could these errors and warnings be what has landed my website in some kind of penalty or filter?
2. A couple of the errors were duplicate page title and duplicate page content, so there appears to be a duplicate home page. Here are the two pages: howtomaketutus.com/ and howtomaketutus.com/?p=home. They are the same page, but it looks like Google is seeing it as duplicate content. Do I need to do a 301 redirect in the .htaccess file? I'm not sure how that would work since they are the same page. If that is possible, how would I go about doing that?
3. Finally, based on what I've described above, is it more likely that the penalty we are experiencing is because of onsite issues or because of our link profile?
We would really appreciate any help or direction anyone can offer on these issues. Thanks
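On question 2: yes, a 301 in .htaccess is the usual fix, and it works even though both URLs serve the same page, because the rule matches the incoming query string before the page is served. A minimal mod_rewrite sketch, assuming the duplicate really is just the ?p=home query string (test it on a copy of the site first):

```apache
# 301 howtomaketutus.com/?p=home to the bare homepage.
# The trailing ? on the target strips the query string from the redirect.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^p=home$ [NC]
RewriteRule ^$ /? [R=301,L]
```

A rel="canonical" tag on the home page pointing at the bare URL is a sensible belt-and-braces addition alongside the redirect.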
-
Slapped by the Penguin
We had a client's website hit hard by the Penguin update, particularly on the 24th. Sitewide, each keyword lost 10-20 positions; it had been at #1 or #2 for the past couple of years. We optimize all of our websites' on-page features well and within the whitehat realm. Since this was the only website affected out of 50+ other sites, I am guessing the penalty came directly from the backlink profile, which was quite bad. The client had bought two directory link package deals about 4 years ago, and all of those incoming directory links have the exact same anchor text. I warned him this was completely unnatural, and we have only gone after "natural-looking" links since then. Keep in mind these links were from 4+ years ago and did very little for rankings once we came into the picture. Out of 143 root domain links, around 45 use the same anchor text in the link. We started with about 50 links total 2 years ago and have since built a very good quality profile, or so I thought. I was almost certain it was enough varied anchor text to dilute it down. I'm wondering if any of your websites that have been hit have a high amount of exact-match anchor text. I can't believe Google would penalize just for link building, because it seems to be an easy way to attack competitors, but all my data is pointing that way. Let me know your thoughts if any of your sites have been hit. Thanks
-
Google Places - What is the best Service Areas Strategy?
I've found a lot of useful info on this topic in these forums, but still can't seem to find the answer to my specific question. The client has one physical location and services many areas. I have seen various comments claiming that setting a service area actually has a negative effect on rankings, and the logic makes sense to me, so we don't want to do that. Using the actual physical address seems to be what Google would prefer, but the address is actually on the outskirts of the city, which would mean that competitors with addresses closer to the city center would show up before us. Our current Places listing has the actual address, but the previous SEO put the larger city, with the smaller city's zip code, on the website:
- City center: San Diego, 92101
- Actual: Street Address, El Cajon, 92020
- On website: San Diego, 92020
Is this "large city + actual zip code" strategy any good? Which of these 3 strategies should we use to standardize all of our listings? *We will not be considering a location or mailbox per service area to use multiple listings at this time.
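Whichever of the three you standardize on, it helps to mark the chosen address up on the site itself so every listing and citation can be checked against one canonical NAP. A sketch using schema.org LocalBusiness markup; the business name, street address and phone number below are placeholders, and the El Cajon city/zip from the question is used purely as an example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business Name",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "El Cajon",
    "addressRegion": "CA",
    "postalCode": "92020"
  }
}
</script>
```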
-
Best geotargeting strategy: Subdomains or subfolders or country specific domain
How have the relatively recent changes in how Google perceives subdomains changed the best route to onsite geotargeting, i.e. not building out new country-specific sites on country-specific, locally hosted domains, but instead developing subdomains or subfolders and geotargeting those via Webmaster Tools? In other words, given the recent change in Google's perception, are subdomains now a better option than subfolders, or is there not much in it? Also, if a client has a .co.uk and they want to geotarget, say, France, is the subdomain/subfolder route still an option, or is the .co.uk still too UK-specific, so that these options would only work on a .com? In other words, can sites on country-specific domains (.co.uk, .fr, .de, etc.) use subfolders or subdomains to geotarget other countries, or do they have no option other than to develop new country-specific (domain/hosting/language) websites? Any thoughts regarding current best practice in this regard are much appreciated. I have seen last Feb's WBF, which covers geotargeting in depth, but the way Google perceives subdomains has changed since then. Many thanks, Dan
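One piece of markup that complements Webmaster Tools geotargeting is rel="alternate" hreflang, which tells Google which language/country each subfolder serves. A sketch assuming a .co.uk site with a French subfolder (the URLs are hypothetical); note that every version should carry the full set of alternates, including a reference to itself:

```html
<!-- In the <head> of each page version (hypothetical URLs): -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="fr-fr" href="http://www.example.co.uk/fr/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.co.uk/" />
```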
-
Is this keyword strategy totally wrong?
I have a driving school website, www.1stclassdriving.co.uk. The site is structured geographically, with one page per area (postcode) and one page per driving instructor. There are links from each area page to the instructors working in that area. The principal search keyword that I want to optimise on is "driving lessons". The thinking was to target each individual area page for "driving lessons in xxx", where xxx is the particular geographic area, and each particular instructor page for "driving lessons in yyy", where yyy is the main town. The ideal would be that a search on "driving lessons" would pick up the root page; a search on an area, say "driving lessons in Croydon", would pick up the Croydon area page; and a search on a town, say "driving lessons in Mitcham", would pick up the page of the instructor who covers that town. However, having read Rebecca's keyword research guide, I am concerned that this strategy is wrong because of the volume of pages that use "driving lessons in xxxx". Does this fall foul of "keyword cannibalization"? And if so, what is the best way of achieving our objective?
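One way to soften the cannibalization risk is to differentiate the two page templates, so area pages and instructor pages don't compete on an identical title and heading. A sketch with hypothetical instructor details ("John Smith" is a placeholder; the towns are taken from the question):

```html
<!-- Area (postcode) page: targets the area phrase directly -->
<title>Driving Lessons in Croydon | 1st Class Driving</title>
<h1>Driving Lessons in Croydon</h1>

<!-- Instructor page: leads with the instructor so the two page types
     carry distinct titles rather than the same repeated phrase -->
<title>John Smith, Driving Instructor in Mitcham | 1st Class Driving</title>
<h1>John Smith, Your Driving Instructor for Mitcham</h1>
```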