Strategy for recovering from Penguin
-
I have a website that has been hit hard by the Penguin update. I believe that the main cause of our problem has been links from low-quality blogs and article sites with overly optimized keyword anchor text. Some questions I have are:
-
I have noticed that we still have good rankings on long-tail search terms on pages that did not have unnatural links. This leads me to believe that the penalty is URL-specific, i.e. only URLs with unnatural linking patterns have been penalized. Is that correct?
-
Are URLs that have been penalized permanently tainted to the point that it is not worth adding content to them and continuing to get quality links to them?
-
Should new content go on new pages that have no history and thus no penalty, or is the age of a previously highly ranked page still of great benefit in ranking?
-
Is it likely that the penalty will go away over time if there are no more unnatural links coming in?
-
Would non-optimized links from not-so-great sites be of any help, or do these need to be quality links?
-
It depends on what type of sites those "not so great sites" are. If they are viewed as spammy, no, they won't help.
-
I would start by adding links that are not optimized with keywords to those URLs that you feel are penalized. Level out the balance of natural vs. unnatural links. I wouldn't drop the URLs, because you probably have some good links out there pointing to those pages. It'll take some work, but you can recover, from what I've been reading.
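One way to gauge the natural vs. unnatural balance described above is to tally anchor-text types from a backlink export. A minimal sketch; the keyword list and the two-way classification are illustrative assumptions, not a Google-defined rule:

```python
from collections import Counter

# Hypothetical money-keyword list for the penalized page; swap in your own.
MONEY_KEYWORDS = {"cheap blue widgets", "buy blue widgets"}

def anchor_profile(anchors):
    """Classify each anchor as exact-match keyword ('optimized') or
    anything else ('natural') and return the counts."""
    counts = Counter()
    for anchor in anchors:
        anchor = anchor.strip().lower()
        if anchor in MONEY_KEYWORDS:
            counts["optimized"] += 1
        else:
            counts["natural"] += 1
    return counts

anchors = [
    "Acme Widgets",        # brand name
    "www.example.com",     # bare URL
    "cheap blue widgets",  # exact-match money keyword
    "click here",          # generic
    "buy blue widgets",    # exact-match money keyword
]
profile = anchor_profile(anchors)
print(profile["optimized"], profile["natural"])  # 2 3
```

If the optimized count dominates for a penalized URL, new branded and bare-URL links shift the ratio back toward natural.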
Related Questions
-
What's Moz's Strategy behind their blog main categories?
I've only just noticed that the Moz blog categories have been moved into a pull-down menu; see it underneath 'Explore Posts by Category' on any blog page. This means that the whole list of categories under that pull-down is not crawlable by bots, so no link juice flows down to those category pages. I imagine the main driver behind that move is to sculpt PageRank so that the business/money pages or areas of the website get greater link equity, rather than wasting it all by throwing it down to the many categories? It would be good to hear more from Rand or anyone on his team about how they came to engineer this and why. One of the things I wonder is: with the sheer amount of content that Moz produces, is it possible to contemplate an effective technical architecture such as that? I know they do a great job of interlinking content from one post to another, so effectively one can argue that that kind of supersedes the need for hierarchical PageRank distribution via categories... but I wonder: is it working better this way vs. having crawlable blog category links on the blog section? Have they performed tests? Some insights or further info on this from Moz would be very welcome. Thanks in advance.
Technical SEO | carralon, David
How to Identify Which Penalty: Penguin, Panda, or Other?
I'm in the process of putting together a plan to recover from an algorithmic penalty. I'm not sure if I have to focus my recovery effort on Penguin, Panda, or another algorithm. After looking at the attached screenshot (Google Analytics data vs. the Google algorithm update timeline), I'm not sure if the blog is affected by Penguin or Panda. I have the following questions:
1. Is the traffic drop because of a Penguin, Panda, or other penalty? (There is no manual penalty message.)
2. Where should I focus my time with recovery efforts? (Link removal, content, link building, etc.)
3. Any other comments or suggestions?
Thanks for your help.
Technical SEO | rsmb
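A quick way to narrow down which algorithm lines up with a drop like the one in that screenshot is to compare the drop date against known rollout dates. A sketch under the assumption that you maintain a fuller date table; the three dates below are a small sample of publicly reported 2012 rollouts, so verify them against a maintained algorithm-change history before drawing conclusions:

```python
from datetime import date, timedelta

# Illustrative subset of publicly reported rollout dates.
UPDATES = {
    "Panda 3.3": date(2012, 2, 27),
    "Penguin 1.0": date(2012, 4, 24),
    "Panda 3.6": date(2012, 4, 27),
}

def nearest_updates(drop_date, window_days=7):
    """Return updates whose rollout falls within window_days of the
    observed traffic drop."""
    window = timedelta(days=window_days)
    return sorted(name for name, rolled_out in UPDATES.items()
                  if abs(rolled_out - drop_date) <= window)

candidates = nearest_updates(date(2012, 4, 25))
print(candidates)  # ['Panda 3.6', 'Penguin 1.0']
```

An overlapping window (as here, where Panda 3.6 landed days after Penguin 1.0) is exactly the ambiguous case the question describes, and is why the drop date alone often cannot settle it.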
Recovering from Blocked Pages Debaucle
Hi, per this thread: http://www.seomoz.org/q/800-000-pages-blocked-by-robots
We had a huge number of pages blocked by robots.txt by some dynamic file that must have integrated with our CMS somehow. In just a few weeks, hundreds of thousands of pages were "blocked." This number is now going down, but instead of by the hundreds of thousands, it is going down by the hundreds, and very sloooooowwwwllly. So we really need to speed up this process. We have our sitemap we will re-submit, but I have a few questions related to it: Previously the sitemap had the <lastmod> tag set to the original date of the page, and all of these pages have been changed since then. Any harm in doing a mass change of the <lastmod> field? It would be an accurate reflection, but I don't want it to be caught by some spam catcher. The easy thing to do would be to just set that date to now, but then they would all have the same date. Any other tips on how to get these pages "unblocked" faster? Thanks! Craig
Technical SEO | TheCraig
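The mass <lastmod> update itself is mechanically simple. A sketch using the standard library; the sitemap content and the date callback are assumptions, and ideally the callback pulls each page's real modification date from the CMS rather than stamping every URL with the same day:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize without an ns0: prefix

def update_lastmod(sitemap_xml, get_date):
    """Rewrite every <lastmod> using get_date(loc) -> 'YYYY-MM-DD',
    adding the element where it is missing."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text
        lastmod = url.find(f"{{{NS}}}lastmod")
        if lastmod is None:
            lastmod = ET.SubElement(url, f"{{{NS}}}lastmod")
        lastmod.text = get_date(loc)
    return ET.tostring(root, encoding="unicode")

# Hypothetical two-URL sitemap for illustration.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/a</loc><lastmod>2011-01-01</lastmod></url>
  <url><loc>http://example.com/b</loc></url>
</urlset>"""

updated = update_lastmod(sitemap, lambda loc: "2013-03-01")
```

Swapping the lambda for a lookup into CMS modification dates addresses the "all the same date" worry raised in the question.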
Are gallery sites OK post-Penguin?
We're getting ready to re-launch a redesigned site and I was hoping to use the opportunity to get some quality links. Are some of the higher-quality web design gallery sites still OK to submit to? Did Penguin have any effect on these? Just looking for opportunities for a little boost from our re-launch.
Technical SEO | _JP_
Strategy for retiring brand
Hi guys, I have a client who bought out a fairly well-known competitor. They now want to amalgamate the brands and redirect visitors from the competitor's site to their site. There is a healthy volume of organic search queries for the competitor's name. I was thinking about the following strategy:
1. Email all subscribers to let them know about the change.
2. Use a meta refresh redirect on the competitor's site to explain the move.
3. After a set period, remove this meta refresh and replace it with a 301 redirect.
4. Use a canonical tag on the competitor site assigning value to the main brand.
5. Set up a landing page on the main site that's optimised for the competitor's brand name and explains the move.
Do you think that this is a good strategy for managing the transition? Thanks!
Technical SEO | gcdtechnologies
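For the 301 step of a brand migration like this, a per-URL redirect map usually preserves more link equity than blanket-redirecting every old page to one landing page. A sketch with made-up domains and paths; the map and fallback are illustrative, not part of the strategy above:

```python
# Hypothetical domains for illustration.
OLD_DOMAIN = "competitor.example"
NEW_DOMAIN = "mainbrand.example"

# Explicit mappings for old pages with a close equivalent on the new
# site; everything else falls back to the brand-transition landing page.
REDIRECT_MAP = {
    "/products/widgets": "/widgets",
    "/about": "/about/competitor-acquisition",
}
FALLBACK = "/about/competitor-acquisition"

def redirect_target(old_path):
    """Return the full 301 target URL for a path on the old domain."""
    new_path = REDIRECT_MAP.get(old_path, FALLBACK)
    return f"https://{NEW_DOMAIN}{new_path}"

print(redirect_target("/products/widgets"))
print(redirect_target("/blog/some-old-post"))
```

Mapping the competitor's strongest linked pages one-to-one keeps their inbound links pointing at topically matching destinations, while the fallback catches the long tail.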
A huge drop in rankings since last 10 days, and not recovered yet.
Hi Mozzers, I have a serious topic to discuss and want help from the experts here. Our website has PR 6 and we have consistently stayed at the top for very competitive terms in the niche. Since last Friday (24th February, 2012) we have been facing massive fluctuation in the rankings for most of the keywords we are focusing on. After this fall, we checked the following details but didn't find any serious/critical issue that might be contributing to these fluctuations:
1. We analyzed Google Webmaster Tools; there's no update/warning from Google regarding any negative activity, and other things seem to be normal.
2. We checked our website through a site search (site:www.domain.com) and found that we haven't lost any indexed pages, and things appear as they used to. So we are sure that we haven't been banned or penalized.
3. We also cross-verified our link building and other promotional activities, and we didn't find anything suspicious that could lead to such a big fluctuation.
The drop is really big: some keywords went to the 5th or 6th page from a top-3 position, and some keywords that usually stayed between 5th and 10th position are not in the top 200 or 300 spots. We have analyzed a lot but haven't found the reason why we are facing this fluctuation. Our website is 4 years old and this kind of fluctuation has happened for the first time. Has anyone faced this kind of issue before? I'm looking forward to your support in identifying this trouble. Thanks
Technical SEO | ValSmith
Google Places - What is the best Service Areas Strategy?
I've found a lot of useful info on this topic in these forums, but still can't seem to find the answer to my specific question. The client has one physical location and services many areas. I have seen various comments claiming that setting a service area actually has a negative effect on rankings, and the logic makes sense to me, so we don't want to do that. Using the actual physical address seems to be what Google would prefer, but the address is actually on the outskirts of the city, which would mean that competitors with addresses closer to the city center would show up before us. Our current Places listing has the actual address, but the previous SEO put the larger city, with the smaller city's zip code, on the website:
City center: San Diego, 92101
Actual: Street Address, El Cajon, 92020
On website: San Diego, 92020
Is this large-city-plus-actual-zip-code strategy any good? Which of these 3 strategies should we use to standardize all of our listings? *We will not be considering a location or mailbox per service area in order to use multiple listings at this time.
Technical SEO | vernonmack
Best geotargeting strategy: Subdomains or subfolders or country specific domain
How have the relatively recent changes in how Google perceives subdomains changed the best route to on-site geotargeting? I.e., not building out new country-specific sites on country-specific, locally hosted domains, but instead developing subdomains or subfolders and geotargeting those via Webmaster Tools. In other words, given the recent change in Google's perception, are subdomains now a better option than subfolders, or is there not much in it? Also, if a client has a .co.uk and they want to geotarget, say, France, is the subdomain/subfolder route still an option, or is the .co.uk still too UK-specific, so these options would only work with a .com? In other words, can sites on country-specific domains (.co.uk, .fr, .de, etc.) use subfolders or subdomains to geotarget other countries, or do they have no option other than to develop new country-specific (domain/hosting/language) websites? Any thoughts regarding current best practice in this regard would be much appreciated. I have seen last Feb's WBF, which covers geotargeting in depth, but the way Google perceives subdomains has changed since then. Many thanks, Dan
Technical SEO | Dan-Lawrence
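Whichever route is chosen for the subfolder approach, it is usually paired with hreflang annotations so each country variant is mapped to its audience. A minimal sketch generating the link tags; the domain, subfolders, and locale codes are made up for illustration:

```python
# Hypothetical gTLD site with country subfolders targeted via hreflang;
# each page variant should list every alternate, including itself.
BASE = "https://example.com"
VARIANTS = {
    "en-GB": "/uk/",
    "fr-FR": "/fr/",
    "de-DE": "/de/",
}

def hreflang_tags(page_path):
    """Build the <link rel="alternate"> tags for one page that exists
    under every country subfolder."""
    tags = []
    for lang, folder in sorted(VARIANTS.items()):
        href = f"{BASE}{folder}{page_path}"
        tags.append(f'<link rel="alternate" hreflang="{lang}" href="{href}" />')
    return tags

tags = hreflang_tags("pricing/")
```

Note the country-targeting caveat in the question still applies: a ccTLD like .co.uk signals the UK site-wide, which is why this subfolder pattern is normally built on a gTLD such as a .com.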