Strategy for recovering from Penguin
-
I have a web site that has been hit hard by the Penguin update. I believe the main cause of our problem has been links from low-quality blogs and article sites with overly optimized keyword anchor text. Some questions I have are:
-
I have noticed that we still have good rankings on long-tail search terms on pages that did not have unnatural links. This leads me to believe that the penalty is URL specific, i.e. only URLs with unnatural linking patterns have been penalized. Is that correct?
-
Are URLs that have been penalized permanently tainted to the point that it is not worth adding content to them and continuing to get quality links to them?
-
Should new content go on new pages that have no history and thus no penalty, or is the age of a previously highly ranked page still of great benefit in ranking?
-
Is it likely that the penalty will go away over time if there are no more unnatural links coming in?
-
Would non-optimized links from not so great sites be of any help, or do these need to be quality links?
-
It depends on what type of sites those "not so great" sites are. If they are viewed as spammy, no, they won't help.
-
I would start by pointing links that are not optimized with keywords at the URLs you feel are penalized, to improve the ratio of natural to unnatural links. I wouldn't drop the URLs, because you probably have some good links out there pointing to those pages. It will take some work, but from what I've been reading, you can recover.
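The ratio the answer above describes can be checked mechanically. A minimal sketch, assuming you have a backlink export as (target URL, anchor text) pairs; the 50% threshold is illustrative, not an official cutoff:

```python
from collections import Counter, defaultdict

def anchor_profile(backlinks):
    """Group backlink anchors by target URL and measure how concentrated
    each page's profile is in its single most common anchor text."""
    by_url = defaultdict(Counter)
    for url, anchor in backlinks:
        by_url[url][anchor.strip().lower()] += 1
    profile = {}
    for url, counts in by_url.items():
        total = sum(counts.values())
        top_anchor, top_count = counts.most_common(1)[0]
        profile[url] = (top_anchor, top_count / total)
    return profile

def flag_over_optimized(backlinks, threshold=0.5):
    """Return URLs where one exact anchor accounts for more than
    `threshold` of all inbound links -- a common over-optimization signal."""
    return {url: p for url, p in anchor_profile(backlinks).items()
            if p[1] > threshold}
```

Running this over an export from your link tool of choice gives a per-URL list of where the unnatural-anchor concentration is worst, which is where the non-optimized links suggested above would help first.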
Related Questions
-
Correct Internal Linking Strategy
Hello. So my website currently has 8 pages in total (Homepage, 5 Service Pages, Contact, About). I currently have about 80 quality RD and my Homepage already ranks #8 for my main keyword, while all Service Pages (P1, P2, P3, P4, P5) are stuck somewhere at #30-60 positions for their target keywords. My internal linking scheme looks like this https://i.imgur.com/2cA529v.png. The Homepage has a sidebar with links to all Service Pages, and each Service Page has the same sidebar that links to each Service Page, but doesn't link back to my Homepage. Contact and About pages can be accessed only via the links in the menu. I don't have any contextual links on my website, so all pages that are important for SEO are linked only via this same sidebar. All these Service Pages are equally valuable to me, but they don't seem to grow much in Google. The Onpage Score of these pages is better than that of the top-10 competitors, and my content provides more value (I used the Skyscraper Technique). Taking all that into consideration, can you please tell me what might be wrong? Should I build more quality backlinks to these service pages instead of the homepage? Should I add contextual links to all my service pages from the homepage? Does my internal linking strategy look good to you? If not, what should I change? Can I hit the top #10 with my internal pages for their target keywords if I mainly build links only to my homepage? All keywords that I'm after have low to medium competition. My website has 90 RD in total, and my website's DA is 27. Thank you.
Technical SEO | | NathalieBr3 -
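The internal-linking question above is, at its core, about how link equity flows through that shared sidebar. A rough simulation (plain iterative PageRank over a toy graph using the question's Home/P1-P5/Contact/About setup; this models internal links only, not the external backlinks) shows the structural consequence of the sidebar never linking back to the homepage:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Iterative PageRank over an adjacency dict {page: [outlinks]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in graph.items():
            if not outs:  # dangling page: spread its rank everywhere
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

services = [f"P{i}" for i in range(1, 6)]
graph = {"Home": services + ["Contact", "About"]}
for s in services:
    # the sidebar links to every other service page but never back to Home
    graph[s] = [t for t in services if t != s] + ["Contact", "About"]
graph["Contact"] = []
graph["About"] = []

ranks = pagerank(graph)
```

In this toy model the homepage receives no internal equity at all (nothing links to it), and the five service pages end up identical, so internal links alone can't differentiate them; equity arriving at the homepage from external links is passed out once but never recirculated. That is one argument for adding contextual links between Home and the service pages.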
Need 301 Advice with a Recovered URL from a Domain Typosquatter
I am new to an SMB, and someone bought the plural version of our domain back in 2005 and has yet to let it expire. The domain was just renewed for another year, so we finally decided to contact a lawyer and go through the domain name dispute process. This seems like a pretty cut-and-dried case, and the lawyer is very confident that we'll have the domain within 30-40 days. Currently the plural version domain 303s to spammy web pages, shows shady ads, and is just a malicious-looking page in general. I am not savvy enough to know the exact complexities of what's happening on the backend, but it's spammy. Knowing the history of the plural version domain, how would you treat it after we acquire it? Obviously, I wouldn't want to put our site in jeopardy by 301ing the plural version of our URL to our current healthy site, but at the same time many customers might go to that domain by accident, so eventually I'd like to 301 it. If it's any help, the plural version has a robots.txt that prevents Google from crawling it. Thank you in advance for your guidance!
Technical SEO | | ssimarketing0 -
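Once the disputed domain is recovered and cleaned up, the usual mechanism is a path-preserving 301 so customers who type the plural land on the matching page of the real site. A minimal sketch of just the mapping logic (the domain names are placeholders, and this says nothing about whether the domain's spammy history makes an immediate 301 wise):

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "example.com"  # placeholder: the singular, healthy domain
ALIAS_HOSTS = {"examples.com", "www.examples.com"}  # placeholder plural variants

def redirect_target(url):
    """Return the 301 Location for a request to an alias host, preserving
    path and query string; None means the request needs no redirect."""
    parts = urlsplit(url)
    if parts.hostname not in ALIAS_HOSTS:
        return None
    return urlunsplit(("https", CANONICAL_HOST, parts.path, parts.query, ""))
```

The same mapping is normally expressed as a server-level rewrite rule; the point of preserving path and query is that any legitimate deep links or typed URLs resolve to their equivalent page rather than all piling onto the homepage.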
How to recover search volume after domain name change?
On the 3rd of November we changed our company name and domain. The new site was not changed at all, so the 301 process was quite straightforward. The changeover was successful: no downtime, and all pages redirected correctly (with a few minor exceptions). However, after a few days we started to see more and more links into the new site from the old site. They now stand at over 3 million, with links from the new site to the old site at over 200K. The links from the new site back to the old were due to us having left a lot of links tucked away on various pages, which were possibly causing loops with the 301 redirects on the old site. We fixed these and now there are no remaining links back to the old site, though we are still showing just over 200K links back to the old site. We are also seeing a LOT more backlinks on the new site from old junk sites, which are not showing for the old site. A couple of years ago we went through about a year of trying to track down and remove thousands of spam backlinks. We did what we could, got a lot removed, and showed Google the evidence; Google then lifted the penalty and said they had made some changes that meant the links were no longer causing the penalty. I added the old disavow file to the new site, but it covers only a fraction of the sites which are being displayed as providing backlinks... many of which are clearly spammy. Is it possible that Google made some manual actions to lift the penalties but failed to associate these changes with the new domain? Changes that were not included in the disavow file? All help appreciated.
Technical SEO | | Exotissimo0 -
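Hunting down the leftover links to the old domain described above can be scripted rather than done by hand. A sketch using Python's standard-library HTML parser (the old-domain hostname is a placeholder; a real crawl would feed it every page of the new site):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class OldDomainLinkFinder(HTMLParser):
    """Collect href/src values that still point at the old domain."""
    def __init__(self, old_host):
        super().__init__()
        self.old_host = old_host
        self.found = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                host = urlsplit(value).hostname or ""
                if host == self.old_host or host.endswith("." + self.old_host):
                    self.found.append(value)

def find_old_links(html, old_host="old-domain.com"):
    finder = OldDomainLinkFinder(old_host)
    finder.feed(html)
    return finder.found
```

Relative URLs and links to the new domain are ignored; only absolute references to the old host (or its subdomains) are reported, which is exactly the set that can form redirect loops with the old site's 301s.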
Can Silos and Exact Anchor Text In Links Hurt a Site Post Penguin?
Just got a client whose site dropped from a PR of 3 to zero. This happened shortly after the Penguin release, June, 2012. Examining the site, I couldn't find any significant duplicate content, and where I did find duplicate content (9%), a closer look revealed that the duplication was totally coincidental (common expressions). Looking deeper, I found no sign of purchased links or linking patterns that would hint at link schemes, no changes to site structure, no change of hosting environment or IP address. I also looked at other factors, too many to mention here, and found no evidence of black hat tactics or techniques. The site is structured in silos, "services", "about" and "blog". All page titles that fall under services are categorized (silo) under "services", all blog entries are categorized under "blogs", and all pages with company related information are categorized under "about". When exploring the site's links in Site Explorer (SE), I noticed that SE is identifying the "silo" section of links (i.e. services, about, blog, etc.) and labeling it as an anchor text. For example, domain.com/(services)/page-title, where the page title prefix (silo), "/services/", is labeled as an anchor text. The same is true for "blog" and "about". BTW, each silo has its own navigational menu appearing specifically for the content type it represents. Overall, though there's plenty of room for improvement, the site is structured logically. My question is, if Site Explorer is picking up the silo (services) and identifying it as an anchor text, is Google doing the same? That would mean that out of the 15 types of service offerings, all 15 links would show as having the same exact anchor text (services). Can this type of site structure (silo) hurt a website post Penguin?
Technical SEO | | UplinkSpyder0 -
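On the Site Explorer point above: "services" in domain.com/services/page-title is a URL path segment, not anchor text, so a tool reporting it as an anchor is most likely flattening the URL rather than reading the link's clickable text. A tiny sketch of separating the silo segment from the page slug (hypothetical URLs), useful when auditing whether a report is conflating the two:

```python
from urllib.parse import urlsplit

def silo_and_slug(url):
    """Split a siloed URL like domain.com/services/page-title into its
    silo segment and page slug; silo is None for top-level pages."""
    segments = [s for s in urlsplit(url).path.split("/") if s]
    if len(segments) >= 2:
        return segments[0], segments[-1]
    return None, segments[0] if segments else ""
```

If every one of the 15 service URLs shares the same first segment, that is expected silo structure, not 15 links with identical anchor text.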
Multiple domain SEO strategy
Hi Mozzers. I'm an AM at a web dev. We're building a new site for a client who sells paint to different markets: Paint for boats Paint for the construction industry Paint for, well, you get the idea! Would we be better off setting up separate domains - boatpaintxxx.com, housepaintxxx.com, etc. - and treating each as a separate microsite for standalone SEO activity, or having them as individual pages/subdomains of a single domain - paints4all.com or something? From what I've read today, including the excellent Beginners Guide, I'm guessing there's no definitive answer! Feedback appreciated! Thanks.
Technical SEO | | rikmon0 -
Google Reconsideration Request (Penguin) - Will Google give links to remove?
When Penguin v1 hit, our site took a hit for a single phrase (i.e. "widgets") due to the techniques our SEO company was using (network). We've since had those links cleaned up, and our rankings have not recovered. Our SEO company said they submitted a reconsideration request on our behalf, and that Google denied it and didn't provide which links we needed removed. Does Google list links that need removing if they are still not happy with your link profile?
Technical SEO | | crucialx0 -
Anyone else seeing increased duplication of domains since Penguin?
Hi Is it just me, or are the Google SERPs showing more duplication of domains since the Penguin update? As an example, if I search for "business Christmas cards" on google.co.uk, then results 2, 3 and 17 are from the same domain. Similarly, results 4, 20, 21 and 22 are the same domain. All results are "reasonable" in that they are designed to catch traffic for variations on this term, BUT I'm sure Google used to filter this duplication pre-Penguin. Am I imagining this increased duplication of domains? Gary
Technical SEO | | gtrotter6660 -
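The host crowding described above is easy to quantify from a scraped list of result URLs. A sketch (host normalization here is deliberately simplistic: it only strips a leading "www.", so other subdomains count as separate hosts):

```python
from collections import Counter
from urllib.parse import urlsplit

def domain_crowding(result_urls):
    """Count how many results each host (www-stripped) contributes,
    returning only hosts that occupy more than one slot."""
    hosts = Counter()
    for url in result_urls:
        host = urlsplit(url).hostname or ""
        if host.startswith("www."):
            host = host[4:]
        hosts[host] += 1
    return {h: c for h, c in hosts.items() if c > 1}
```

Run over the top 20-30 results for a query on different dates, this gives a concrete before/after measure instead of an impression.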
Best strategy for category filtering links eg by colour
Hi All, I hope you can help with some basic on-page SEO questions! I have an ecommerce site which allows users to filter/restrict the view of a category by one or more colours. This is done by appending a querystring value to the URL, i.e. to view blue, green and purple widgets the link might be: www.example.com/my-widgets-category/?colors=123,92,64 On each category page is a group of coloured boxes with links to filter by that colour (only if there are available coloured widgets in that category). Each category has rel=canonical set to be the appropriate unfiltered category URL, i.e.: www.example.com/my-widgets-category/ I used to have these colour filter links all nofollowed, but am not sure that this is a good idea. So my questions are: 1/ What are the implications of these colour links that can generate a lot of different URLs (as you can keep on adding colours to the filter), and how can I ensure that I am not shooting myself in the foot? My customers love it! 2/ I also have page=1 etc. appended for paging through results; the canonical URL is set in all instances to be the plain category page as above. Do I need to add rel=prev and rel=next? 3/ All of these links can really bump up my total page link count; at the moment I have colour filtering boxes in my main menu drop-downs so that users can filter all the products that exist in all of the nested child categories of top-level categories by colour. Should I remove these to reduce my total link count, nofollow them, or leave as is? It's a great site feature for users; I just don't want to be shooting myself in the foot unnecessarily. Thanks!
Technical SEO | | blessig0
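The canonical logic described in the question above - filtered and paginated views all pointing back at the plain category URL - can be expressed as a small normalizer. A sketch, assuming `colors` and `page` are the only non-canonical parameters on this hypothetical site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

FILTER_PARAMS = {"colors", "page"}  # assumption: the only non-canonical params

def canonical_url(url):
    """Strip filter/pagination parameters so every filtered or paginated
    view canonicalizes to the plain category URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))
```

Parameters outside the filter set survive the normalization, so a genuinely distinct view (say, a different sort order the site wants indexed) would keep its own canonical; whether that is desirable is a per-parameter decision.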