Site Has Not Recovered (Still!) from Penguin
-
Hello,
I have a site that just keeps getting hit with a ton of bad, unnatural backlinks due to the sins of previous SEO companies they hired. About every quarter, I still have to add more bad links to their disavow file. Is it time to move them to a new domain? Perhaps a .net? If so, do we just completely trash the old domain & not redirect it? I've never had a client like this in the past, but they still want to maintain their branded name.
Thanks for your feedback!
-
hey guys,
Any updates on the hreflang tests?
I'm in a similar boat - one site got a manual hit in Feb 2014: a sitewide penalty, and at one point the brand name was even deindexed.
Got the penalty lifted in 5 months, but traffic has not recovered one bit since then.
-
Please keep me posted on what you decide to do. As awful as this is, it is nice to know we are not alone. We may just be rebranding and starting from scratch, since Google has not provided any indication of when they will release the next update. Also, I came across this a couple of weeks ago, so it could be several months: http://searchengineland.com/google-we-are-working-on-making-the-penguin-updated-continuously-222247
-
No need; all hreflang sites are linked in the eyes of Google. Also, you still want people going to your original URL, which is where all your brand building and everything else was done.
At some point Penguin will refresh and your original site will regain the ability to rank. At that point you can decide what you want to do.
I had set mine up so that my co.uk was set to "en"; this captured all English enquiries. Then in October 2014 my site regained its ability to rank, so I switched it so that my co.uk was set to "en-gb". The co.uk extension is an additional ranking signal in google.co.uk, and we are a truly global company with plans to expand into dedicated sites for certain countries, so it was a welcome find.
However, there is no harm in setting up the second site to target your main audience.
Let's say you are a company in the United States with a .com that is waiting for a Penguin refresh. Get a second domain and point it to the same directory on your server. Do some clever coding to manage it so that you only have one set of code; see the sketch below for one way to wire this up. (Also think ahead to delivering separate content to both sites in future, maybe two database tables serving up content based on TLD.)
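To illustrate, here is a minimal sketch of that kind of routing, assuming PHP; the domain check and table names are made up for illustration and not from the original post:

<?php
// front-controller.php: a sketch of serving two domains from one codebase.
// The .net/.com check and the table names below are hypothetical.
$host = strtolower($_SERVER['HTTP_HOST'] ?? '');

if (substr($host, -4) === '.net') {
    define('SITE_TLD', 'net');
    define('CONTENT_TABLE', 'content_net'); // hypothetical future table
} else {
    define('SITE_TLD', 'com');
    define('CONTENT_TABLE', 'content_com'); // hypothetical future table
}

// Both domains render from the same shared templates,
// so there is only one set of code to maintain.
require __DIR__ . '/templates/page.php';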
Let's say you get a .net. Then apply hreflang tags along the lines of the sketch below.
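As a sketch, with example.com standing in for the original site and example.net for the new one (swap in your real URLs), the same pair of tags goes in the <head> of the matching page on both domains:

<link rel="alternate" hreflang="x-default" href="https://example.com/some-page/" />
<link rel="alternate" hreflang="en" href="https://example.net/some-page/" />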
Do this for each page, making sure to keep the same URL structure. (John Mueller just stated the importance of maintaining the same URL structure.)
Once your original site regains the ability to rank again after an algo refresh, go into Webmaster Tools and set the region of the new site to your desired location, such as the United States.
Then change the hreflang so that the new site targets that specific locale (sketch below). This way you are telling Google to send only English searches from the US to the new site, and it becomes a localized domain with a higher chance of ranking than the original site.
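That change is just a matter of swapping the language code, something like this (again with placeholder domains):

<link rel="alternate" hreflang="x-default" href="https://example.com/some-page/" />
<link rel="alternate" hreflang="en-us" href="https://example.net/some-page/" />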
OR you could at that point remove the hreflang and drop the new .net site.
This is the only way I was able to KEEP my original site up, maintain my brand and all the people who go direct to my main brand website, and still have a way of ranking again.
It's complicated to explain WHY it works, but at some point I will write a big article for Moz on this subject.
Hope that makes things more clear.
-
Yes, the domain does not matter.
You can even test it out with just one single page. I spoke to John Mueller about this a while ago; he said that you can use hreflang in the same way you use a canonical tag. So maybe you could test it on an internal page that you know should be ranking better than it currently does.
Set up the new domain, create a bare index.php page, and just replicate the internal page and URL structure (see the sketch below).
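As a sketch of what that single-page test could look like on the new domain; the domains, the URL path, and the include file are placeholders, not from the original post:

<?php
// index.php on the new test domain: a one-page hreflang test.
$path = '/blue-widgets/'; // keep the same URL structure on both domains
?>
<!DOCTYPE html>
<html>
<head>
  <link rel="alternate" hreflang="x-default" href="https://example.com<?= $path ?>" />
  <link rel="alternate" hreflang="en" href="https://example.net<?= $path ?>" />
</head>
<body>
  <?php include __DIR__ . '/shared/blue-widget-content.php'; // identical page content ?>
</body>
</html>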
FYI, John actually talked about URL structure for hreflang just 2 days ago.
https://www.youtube.com/watch?feature=player_embedded&v=1sewHcbKTJw#t=2171
In an ideal world you want to create a clever bit of PHP so that your code is being pulled from one directory; otherwise you will have two versions of your site to maintain, and that would just be a royal pain in the a$$.
Before you do anything I suggest you read the page below and watch the video by Maile Ohye on there too.
https://support.google.com/webmasters/answer/189077?hl=en
Feel free to ask me any questions; I have had to do this on a few sites and have been doing it for a long time now.
Set up your current site as the x-default and the new one as "en" so that all English inquiries are served up by Google to your new site.
I look forward to hearing how it goes. FYI, it's possible you may see your results drop for a day or two, then pop up with the old URL again, and then with the new one; it can take a few days for things to recalibrate. It can also happen right away, with random adjustments over the next few weeks.
Also, make sure to use Fetch as Google in WMT on both sites' pages to get Googlebot looking at your pages ASAP. I have seen results in 30 seconds before.
-
@Gary, would this work between a .com & .net website? I'm willing to test anything at this point. Thanks!
-
@ruth
I urge you to try the hreflang solution sometime as a test.
"but it's not something I would test unless you actually do have different English-speaking audiences."
You can always set the old domain to "x-default" and the new one to "en" so that all English search results switch to the new site. This is great for sites not willing to wait up to a year for a Penguin refresh, or for sites affected by other SERP suppression. Both sites can be identical and will not cause duplication issues. hreflang is an amazing tool for testing.
-
Thanks Ruth for your feedback. I just wanted to address your 2 points above:
-
We actually have been adding all new links to our original disavow file, so we are all set there.
-
Yes, we do understand that the loss of links did cause a drop in rankings. Because of this, we've actually started building natural links through outreach & PR, redesigned the entire site, and updated all content, along with ongoing content creation on site and off.
With those things in mind, is there anything else left? I'm just wondering if we're completely missing something, and I'm getting desperate - is this domain just dead?
And thanks for the heads up on the hreflang, that was a little over my head.
Thanks for your feedback
-
A couple of things to keep in mind:
- When disavowing new inbound links, make sure that you're adding them to your existing disavow file; if you just submit the new sites, that new disavow file will overwrite the previous one and un-disavow the earlier links (see the example file after this list).
- A manual penalty is only part of the traffic/ranking loss you'll see with Penguin. Don't forget that you also lost the link value of a bunch of spammy links that previously were providing value. The penalty may be gone, but so are your links! To regain traffic levels, you'll need to build new quality inbound links, so make sure that's a big part of your strategy going forward.
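For reference, a resubmitted disavow file needs to contain both the old and the new entries, something like this (all domains here are made up for illustration):

# Entries from the original disavow file - keep these in every resubmission
domain:spammy-directory.example
domain:article-farm.example
http://blog.spun-articles.example/post-123.html

# New entries added this quarter
domain:fresh-link-farm.example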
I haven't tried the hreflang solution, so I can't comment on its effectiveness, but it's not something I would test unless you actually do have different English-speaking audiences.
Good luck!
-
Thank you for the lengthy response! I am not sure how to use hreflang but will look into it more this week. As for your final question, our manual penalty was revoked in Oct. 2013 & it just hasn't performed worth a damn since. We updated all content and are working on a design refresh now in order to support responsive design, hoping that would help, as we'd hate to change the domain, but we're just at a loss & getting desperate. This is our only client who came to us for this type of cleanup service that has not yet recovered.
-
I have the ultimate answer for you and you will not find this elsewhere online.
I have been through this process and it was a huge pain in the a$$.
I spoke with John Mueller at Google for years trying to resolve our issues, until one day we spoke about hreflang, at which point John said to me that it would be OK to use it. So we played with it, and it turned out that we recovered IMMEDIATELY from Penguin on a new domain that was hreflang-linked from our original site.
So basically this allowed us to keep our original brand-name site up while traffic was going to our new co.uk site (in your case, a .net).
Let's say, for instance, your client is US-based. Take the .com site and set it as x-default, then set the new .net site as "en", or if you want to be more specific, "en-us".
All English Google traffic (or, with "en-us", English traffic from google.com in the US) will now start flowing to your non-penalised site. The hreflang simply does a swap in those search results, but it happens before the penalty is enforced. So your rankings will return right away, depending on where you would now rank with the penalty out of the picture, and your new content will also be back in its rightful ranking positions. Basically, all the suppression is gone.
Don't worry about duplicate content, as hreflang handles all that. It's very common for hreflang'd sites to differ by just a few small changes, such as $ to £, or spelling changes like color and colour. There is no downside.
If you are clever with the way you code your site you can make it a seamless transition without having to maintain code on two domains. Only took me a few hours to code something up.
The best thing is it opens the door to targeting multiple regions of the world with other languages. (When you are ready).
Hope that answers your question.
Also, FYI: when did your manual penalty get revoked? It can take up to a year before a refresh will consider your site OK and lift the suppression. Based on that, you may not have been ready for the October 2014 refresh and may be waiting for the next one, which could be a long way away.