Site Has Not Recovered (Still!) from Penguin
-
Hello,
I have a site that just keeps getting hit with a ton of bad, unnatural backlinks due to the sins of previous SEO companies the client hired. About every quarter, I still have to add more bad links to their disavow file. Is it time to move them to a new domain? Perhaps a .net? If so, do we just completely trash the old domain & not redirect it? I've never had a client like this before, but they still want to maintain their branded name.
Thanks for your feedback!
-
hey guys,
Any updates on the hreflang tests?
I'm in a similar boat - one site got a manual hit in Feb 2014... sitewide penalty, and at one point the brand name was even deindexed.
Got the penalty lifted in 5 months, but traffic has not recovered one bit since then.
-
Please keep me posted on what you decide to do. As awful as this is, it is nice to know we are not alone. We may just rebrand and start from scratch, since Google has not provided any indication of when they will release the next update. Also, I came across this a couple of weeks ago, so it could be several months: http://searchengineland.com/google-we-are-working-on-making-the-penguin-updated-continuously-222247.
-
No need - all hreflang sites are linked in the eyes of Google. Also, you still want people going to your original URL, which is where all your brand building and everything else was done.
At some point Penguin will refresh and your original site will regain the ability to rank. At that point you can decide what you want to do.
I had set mine up so that my co.uk was set to "EN"; this captured all English enquiries. Then in October 2014 my site regained its ability to rank, so I switched it so that my co.uk was set to "en-gb". The co.uk extension is an additional ranking signal in google.co.uk, and we are a truly global company with plans to expand into dedicated sites for certain countries, so it was a welcome find.
However there is no harm in setting up the second site to target your main audience.
Let's say you are a company in the United States with a .com that is waiting for a Penguin refresh. Get a second domain and point it to the same directory on your server. Do some clever coding to manage it so that you only have one set of code. (Also think of a plan for the future to deliver separate content to both sites, maybe two database tables serving up content based on TLD.)
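The poster did this in PHP; as a hedged sketch of the same idea (table names, domains, and the function itself are invented for illustration), the "one codebase, content by TLD" routing boils down to inspecting the request's hostname and picking the matching content source:

```python
# Illustrative sketch only: one codebase serving two domains,
# choosing a content table based on which TLD the request hit.
# The table names and domains here are hypothetical.

CONTENT_TABLES = {
    "example.com": "content_com",
    "example.net": "content_net",
}

def table_for_host(host: str) -> str:
    """Pick the content table for the requesting hostname.

    Falls back to the .com table for unknown hosts (e.g. staging).
    """
    host = host.lower().split(":")[0]  # strip any port suffix
    if host.startswith("www."):
        host = host[4:]
    return CONTENT_TABLES.get(host, "content_com")
```

In practice you would call something like `table_for_host(request_host)` at the top of the shared codebase and read all page content from the returned table, so both domains stay in lockstep until you deliberately diverge them.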
Let's say you get a .net. Then apply hreflang annotations across both domains. Do this for each page, making sure to keep the same URL structure. (John Mueller just stated the importance of maintaining the same URL structure.)
Once your original site regains the ability to rank again after an algo refresh, go into Webmaster Tools and set the region of the new site to your desired specific location, such as the United States.
Then change the new site's hreflang to "en-us". This way you are telling Google to send only English searches from the US to your site, and it will be a localized domain with a higher chance of ranking than the original site.
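As a hedged illustration of the annotations being described (the domains and path are invented; these tags would go in the head of each page on both sites):

```html
<!-- Phase 1: while waiting on the refresh -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en" href="https://www.example.net/page/" />

<!-- Phase 2: after the original site regains the ability to rank -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.net/page/" />
```

Note the annotations must be reciprocal: every page on the .com references its .net counterpart and vice versa, with the paths matching exactly.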
Or you could at that point remove the hreflang and drop the new .net site.
This is the only way I was able to keep my original site up, maintain my brand and all the people that go direct to my main brand website, and still have a way of ranking again.
It's complicated to explain why it works, but at some point I will write a big article for Moz on this subject.
Hope that makes things more clear.
-
Yes the domain does not matter.
You can even test it out with just one single page. I spoke to John Mueller about this a while ago, he said when using hreflang you can use it in the same way you use a canonical tag. So maybe you could test it on an internal page that you know should be ranking better than it currently does.
Set up the new domain, create a blank index.php page, and just replicate the internal page and URL structure.
FYI, John actually talked about URL structure for hreflang just 2 days ago.
https://www.youtube.com/watch?feature=player_embedded&v=1sewHcbKTJw#t=2171
In an ideal world you want to create a clever bit of PHP so that your code is being pulled from one directory; otherwise you will have two versions of your site to maintain, and that would just be a royal pain in the a$$.
Before you do anything I suggest you read the page below and watch the video by Maile Ohye on there too.
https://support.google.com/webmasters/answer/189077?hl=en
Feel free to ask me any questions - I have had to do this on a few sites and have been doing it for a long time now.
Set up your current site as the x-default and the new one as "EN" so that all English inquiries are served up by Google to your new site.
I look forward to hearing how it goes. FYI, it's possible you may see your results drop for a day or two, then pop up with the old URL again and then the new one. It can take a few days for it to recalibrate. It can also happen right away, with random adjustments over the next few weeks.
Also, make sure to use Fetch as Google in WMT on both sites' pages to get Googlebot looking at your pages ASAP. I have seen results in 30 seconds before.
-
@Gary, Would this work between a .com & .net website? I'm willing to test anything at this point. Thanks!
-
@ruth
I urge you to try the hreflang solution sometime as a test.
"but it's not something I would test unless you actually do have different English-speaking audiences."
You can always set the old domain to "x-default" and the new one to "EN" so that all English search results switch to the new site. This is great for sites not willing to wait up to a year for a Penguin refresh, or affected by other SERP suppression. Both sites can be identical and will not cause duplication issues. hreflang is an amazing tool for testing.
-
Thanks Ruth for your feedback. I just wanted to address your 2 points above:
-
we actually have been adding all new links to our original disavow file, so we are all set there.
-
yes, we do understand that the loss of links did cause a drop in rankings. Because of this, we've actually started building natural links through outreach & PR, redesigned the entire site, and updated all content as well, along with ongoing content creation on site and off.
With those things in mind, is there anything else left? I'm just wondering if we're completely missing something - and I'm getting desperate - is this domain just dead?
And thanks for the heads up on the hreflang, that was a little over my head.
Thanks for your feedback
-
A couple of things to keep in mind:
- When disavowing new inbound links, make sure that you're adding them to your existing disavow file - if you just submit the new sites, that new disavow file will overwrite the previous one and un-disavow links.
- A manual penalty is only part of the traffic/ranking loss you'll see with Penguin. Don't forget that you also lost the link value of a bunch of spammy links that previously were providing value. The penalty may be gone, but so are your links! To regain traffic levels, you'll need to build new quality inbound links, so make sure that's a big part of your strategy going forward.
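For reference on the first point above, Google's disavow file is a plain text file of `domain:` lines and individual URLs, with `#` comments allowed, and each upload replaces the previous one entirely. A cumulative file that keeps old rounds and appends new ones might look like this (domains invented for illustration):

```text
# Cleanup round 1 - submitted with original reconsideration request
domain:spammy-directory-example.com
domain:paid-links-example.net

# Cleanup round 2 - appended later; resubmit the WHOLE file, not just new entries
domain:new-spam-example.org
http://another-example.com/specific/bad-page.html
```

Keeping one master file like this and always re-uploading it in full is what prevents the overwrite problem described above.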
I haven't tried the hreflang solution, so I can't comment on its effectiveness, but it's not something I would test unless you actually do have different English-speaking audiences.
Good luck!
-
Thank you for the lengthy response! I am not sure how to use hreflang but will look into it more this week. As for your final question, our manual penalty was revoked in Oct. 2013 and the site just hasn't performed worth a damn since. We updated all content and are working on a design refresh now to support responsive design - hoping that would help, as we'd hate to change the domain, but we're just at a loss and getting desperate. This is our only client who came to us for this type of cleanup service that has not yet recovered.
-
I have the ultimate answer for you and you will not find this elsewhere online.
I have been through this process and it was a huge pain in the a$$.
I spoke with John Mueller at Google for years trying to resolve our issues, until one day we spoke about hreflang. At that point John said to me that it would be OK to use it. So we played with it, and it turned out that we recovered IMMEDIATELY from Penguin for a new domain that was hreflang-linked from our original site.
So basically this allowed us to keep our original brand-name site up while traffic went to our new co.uk site (in your case, .net).
Let's say, for instance, your client is US based. Take the .com site and set it as x-default, then set the new .net site as "en" or, if you want to be more specific, "en-US".
All Google traffic in English, or from google.com in the US, will now start flowing to your non-penalised site. The hreflang simply does a swap in those search engines, but it happens before the penalty is enforced. So your rankings will return right away, depending on where you would now rank after the penalty is lifted, and your new content will also be back in its rightful ranking positions. Basically, all the suppression is gone.
Don't worry about duplicate content, as hreflang handles all that. It's very common for sites to have just a few small changes, such as $ to £, or a few spelling changes like color and colour. There is no downside.
If you are clever with the way you code your site, you can make it a seamless transition without having to maintain code on two domains. It only took me a few hours to code something up.
The best thing is it opens the door to targeting multiple regions of the world with other languages. (When you are ready).
Hope that answers your question.
Also, FYI: when did your manual penalty get revoked? It can take up to a year from revocation before a refresh will consider your site OK to lift the suppression. Based on that, you may not have been ready for the October 2014 refresh and may be waiting for the next one, which could be a long way away.