Does Google throttle back the search performance of a penalised website/page after the penalty has been removed?
-
Hi Mozzers.
Back in 2013 my website www.octopus-hr.co.uk was hit by a Penguin 2.0 penalty owing to a harmful backlink profile built by a dodgy SEO consultant (now fired). The penalty seemed to apply to the homepage of the site but other pages were unaffected.
We got what links we could removed, disavowed the rest, and were informed in September 2013 that the penalty had been removed and our re-inclusion request had been successful. However, our website homepage still ranks poorly for the search terms we're targeting in the UK: "HR Software" and "HR Systems".
On-page factors are, in my opinion, pretty well optimised for these search terms. In terms of link building post-penalty, we've focused on high-authority and relevant sites. I believe that compared to most of our search competitors the backlink profile to our homepage is in pretty good shape, yet it still ranks badly.
Has anyone had any experience of a penalty hangover from Google in the past? Are there other things I should consider?
Thanks
David
-
Remember that a lot of the links you had are now gone, removed or disavowed, so you can't expect to rank the same.
Also, how long has it been since the penalty was lifted? It may take a while for things to even out again.
-
If reconsideration worked and you got a reply from Google, it's likely that you were facing a manual penalty (either instead of or in addition to Penguin). So it may be that Penguin or some other algorithmic penalty is still in play (echoing what Andy said).
Once a penalty expires or is lifted, I'm unaware of any kind of dampening on the site (like, 50% penalty for 3 months and then 25%, etc.). This is much more likely to be a situation where you have multiple layers of problems (some could be technical, etc., and not penalties) and you've removed just the top layer.
-
If you drop me a quick mail, David (address above), I can give you a little more detail. It wouldn't be something I could do here.
-Andy
-
Thanks, Andy, for your kind offer. If you're happy to have a quick look at our link profile, your feedback would be very much appreciated.
-
Hi David,
I can run a quick scan for you and tell you what sort of shape your link profile is in, but what 'could' have happened is that since the penalty, another algorithm has come along and hit you for something else. It's a little awkward to guess exactly, but I've seen this happen on a number of occasions.
Edit: OK, your link profile isn't too healthy, I'm afraid, David. One link is even listed as malicious... If you want to drop me a mail, I will give you a little more info: info@inetseo.co.uk
-Andy
Related Questions
-
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but while the disallow is in place they won't be crawled, so Google won't see the tag. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt rule and add the noindex? Or just add the noindex to what I already have?
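The usual resolution is the one hinted at in the question: lift the crawl block so Googlebot can see a noindex tag on the pages. A minimal sketch, assuming the disallow rule looks like the one below:
```
# robots.txt: remove (or comment out) the disallow so Googlebot
# can re-crawl the review pages and see the noindex tag
User-agent: *
# Disallow: /review/
```
```
<!-- served in the <head> of each review page -->
<meta name="robots" content="noindex">
```
Once the pages have been re-crawled and dropped out of the index, the Disallow rule can be reinstated if crawl budget is a concern.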
Intermediate & Advanced SEO | Tylerj
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner
-
"Null" appearing as top keyword in "Content Keywords" under Google index in Google Search Console
Hi, "Null" is appearing as top keyword in Google search console > Google Index > Content Keywords for our site http://goo.gl/cKaQ4K . We do not use "null" as keyword on site. We are not able to find why Google is treating "null" as a keyword for our site. Is anyone facing such issue. Thanks & Regards
Intermediate & Advanced SEO | vivekrathore
-
Google is not indexing an updated website
We just relaunched a website that is 5 years old; we maintained all the old URLs and articles, but for some reason Google is not picking up the new website https://www.navisyachts.com. In Google Webmaster Tools we can see the sitemap with over 1000 pages submitted, but it shows nothing as indexed. The site is losing traffic and positions rapidly; from the SEO side all looks fine to me. What can be wrong? I'd appreciate any help. The new website is built on Joomla 3.4, we have it here at Moz, and other than some minor details nothing shows that something could be wrong with the website. Thank you.
Intermediate & Advanced SEO | FWC_SEO
-
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development environment). I forgot to block dev.domain.com in my robots file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots.txt, and then proceeded to just delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed on Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for them to recognize that I deleted it already?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Are links that are disavowed with Google Webmaster Tools removed from the Google Webmaster Profile for the domain?
Hi, Two-part question. First, are links that you disavow using Google Webmaster Tools ever removed from the Webmaster Tools account profile? Second, when you upload a file to disavow links, they ask if you'd like to replace the previously uploaded file. Does that mean that if you don't include the previously uploaded URLs in the new file, those URLs are no longer considered disavowed? So, should we download the previous disavow file first, then append the new disavow URLs to it before uploading, or should we just upload a new file that contains only the new disavow URLs? Thanks
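For what it's worth, Google does treat each upload as a complete replacement of the previous file, so the append workflow described in the question is the safe one. A minimal sketch of the disavow file format (the domains are hypothetical):
```
# Disavow file: each upload REPLACES the previous one entirely,
# so earlier entries must be carried over. Lines starting with # are ignored.

# Carried over from the previous upload:
domain:spammy-directory-example.com
http://link-farm-example.net/page-with-bad-link.html

# New entries appended for this upload:
domain:bad-neighborhood-example.org
```
Note that disavowed links still appear in the Webmaster Tools link reports; the disavow file only tells Google to ignore them when evaluating the site.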
Intermediate & Advanced SEO | bgs
-
New Website Look/Structure - Should I Redirect or Update Pages w/ Quality Inbound Links
This question is regarding an ecommerce website that I hand-wrote (HTML) in 1997, one of the first click-and-buy websites, with a cart/admin system that I also developed. After all this time, the old plain HTML look just doesn't cut it. I just updated to XHTML with a very modern look, and believe the structured data will index better. All products and current category pages will keep the identical URLs from the old version. I decided to go with the switch after a manual penalty, which has since been removed... I figured now is the time to update. My big question is that over the years, a lot of my backlinks came from products/news pages that are either no longer relevant or just not available. The pages do exist, but can only be found from the external links pointing to them. For SEO purposes, I have thought of a few things I can do but can't decide which is the best choice. Any insight or suggestions would be awesome! 1. Redirect the old link to the most relevant page in my current catalog. 2. Add my new header/footer to the old pages (this will add a navigation bar with brands/cats/etc.). 3. Simply add a nice new image to the top of these pages linking home & update any broken/irrelevant links. I was also considering adding just the very top 2 inches of my header (logo, search box, phone, address). *Note: some of these pages do receive some traffic. Nothing huge, but considering the 50+ pages, it adds up.
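For option 1, a 301 (permanent) redirect is what passes most of the link equity from the old backlinked pages to their replacements. A minimal Apache sketch, assuming an .htaccess-style setup (not specified in the question) and hypothetical paths:
```
# .htaccess: 301 each retired page to the most relevant current page.
# Paths are hypothetical placeholders.
Redirect 301 /news/1999-product-launch.html /category/current-products/
Redirect 301 /products/discontinued-widget.html /products/modern-widget/
```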
Intermediate & Advanced SEO | Southbay_Carnivorous_Plants
-
Google is ranking the wrong page for the targeted keyword
I have two examples below where we want Google to rank the targeted page, but Google picked another page to rank instead. This is happening a lot on this site I just recently started to work on.
Example 1. Google's choice for the keyword "motorcycle tires": http://www.rockymountainatvmc.com/cl/50/Tires-and-Wheels What we want Google to choose for "motorcycle tires": http://www.rockymountainatvmc.com/c/49/-/181/Motorcycle-Tires Other pages about motorcycle tires: http://www.rockymountainatvmc.com/d/12/Motorcycle-Tires We even used rel="canonical" on this URL to point at our target page: http://www.rockymountainatvmc.com/c/50/-/181/Motorcycle-Tires
Example 2. ATV tires. We want this page to rank: http://www.rockymountainatvmc.com/c/43/81/165/ATV-Tires However, Google has decided to rank http://www.rockymountainatvmc.com/t/43/81/165/723/ATV-Tires-All which is actually one folder under where we want it.
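One thing worth noting about the rel="canonical" mentioned above: it is a hint to Google, not a directive, which is one reason the other URL can still be chosen. Using the URLs from the question, the tag would look like this:
```
<!-- in the <head> of .../c/50/-/181/Motorcycle-Tires,
     pointing Google at the preferred page -->
<link rel="canonical" href="http://www.rockymountainatvmc.com/c/49/-/181/Motorcycle-Tires" />
```
Conflicting signals, such as internal links, sitemap entries, or stronger external links pointing at the other URL, commonly override the hint.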
Intermediate & Advanced SEO | DoRM