Help - Lost Ranking - What did I screw up?!
-
Hi,
We're working with a local service provider on a location-specific keyword (not the real keyword, but along the lines of "orlando plumbing contractors").
Background:
In recent history the client launched a new site design and upgraded from Joomla 1.5 to Joomla 2.5. Of course there were duplicate content issues, which have been resolved with the help of AceSEF. Duplicate content, title tags, and other content issues are handled as soon as they appear in GWT or Moz. Additionally, a high number of backlinks were lost when the latest Google update hit. Many of those links came from sites that no longer exist or were spammy and got flushed out. Some were lost because the previous SEO firm literally removed the backlinks and switched them to their new client (the firm was putting all of the work the client paid for under their own name so they could control everything).
Current Situation: The backlink loss seems to have stopped (hopefully) because we are using a new strategy that relies solely on the quality of the links, surrounding text, varied anchor text, relevancy, etc. However, we tried an experiment on just one of the client's keywords, and that experiment seems to have blown up in our faces.
The landing page for the location-specific keyword has dropped from the index completely (it seems), but only when searching broad. When using an exact match search with quotes, like the example above ("orlando plumbing contractors"), the landing page appears, but several positions lower. We were ranked yesterday (6/23/13), but as of today (6/24/13) we are no longer ranked.
On broad matches, non-relevant sites, and even a site that shows only a broken server configuration, are outranking the client (they appear for the broad search, but the client does not).
What Was Done
We recently created a press release and posted it on a press release site, with a link back to the landing page (exact match anchor text). We posted the PR article to several social sites (Google Plus, Folkd, Delicious, Twitter, StumbleUpon, Diigo). We also created an on-site blog article on the same topic, with links back to the landing page (all using exact match anchor text). We posted that blog article to social news sites (Facebook, StumbleUpon, Delicious) and included a ping. The PR article was manually rewritten before being posted to the PR site (we made two versions of the PR: one for the blog and one for the PR site).
The Result
The client ended up dropping off the broad search rankings, but only slipped a few positions for the exact match (with quotes). The PR article that was created is now ranked on page 3 for the broad keyword and is still beaten by non-working sites. We suspect that the exact match anchor text could be causing this problem. Anyone else have an idea? (We're scratching our heads and trying not to freak out at the same time.)
-
I also wanted to add a little update on something we've noticed in this niche in particular. When we follow the basic outline of the strategy above, the on-site article gets ranked quickly while the target landing page drops out of the results for about half a day. By the next day, the landing page is back in place and the on-site blog article has dropped out of the rankings.
I suppose this is just the Google shuffle or something similar happening. Fortunately we've removed all but 8 errors (out of roughly 3,000) and we're just waiting for GWT to update to reflect that.
-
I just saw that as well - good eye!
-
Todd,
I really think you're right on every point there.
1. We have been looking at the backlink profile pretty closely. It appears the first SEO company was just starting out with this site, so I see quite a few of the paid link directories that were popular a few years back. Fortunately, a big chunk of them have been shut down for violations and the links are slowly being de-indexed altogether. I imagine that both the quality of those links and the loss of them is hurting us while things sort out and readjust. Once all of the links are removed from the index, I believe we can cross this portion off the list (for now - must remain ever vigilant).
2. In progress - this takes a bit of time. We're also trying to make sure that the citations are built on local IPs.
3. I agree - fortunately the anchor text is varied, so I believe we will be OK here as well. We are also updating any egregious usage of exact match anchor text.
4. This was one of the few things that the last SEO company did correctly. The internal link structure is definitely varied.
Thanks for your help with these suggestions. Our client has a pretty decent branding scheme, so that gives us a nice way to build links with branded anchor text and variations.
-
Google just updated their link schemes document:
https://support.google.com/webmasters/answer/66356?hl=en
Links with optimized anchor text in articles or press releases distributed on other sites. For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
-
I get suspicious anytime I hear about exact match press release links. In my experience they can do little to help you, and everything to hurt.
Whenever I submit a press release I always use naked URL links (example.com) or generic anchors, never a "money" term. This is because Google views press releases as non-editorial, and tends to devalue most links it sees coming from these sources.
If that exact match link that you embedded in your press release is widely distributed across multiple sites and distribution platforms, then you're in even deeper.
There's been some talk from Google of late that the newer class of penalties and algorithmic filters can act more at the page level (whereas before they tended to be applied broadly to the entire domain). This means that if you're trying to clean things up, you can start by addressing the one page with the over-optimized links and work your way out from there.
I'd start by trying to remove the links, then possibly use the disavow tool if you aren't successful. Finally, if the page doesn't have a very strong backlink profile, you can simply remove it and serve a 404 or 410 status code, and then remove it from the index using the Remove URL tool in Google Webmaster Tools.
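If you go the removal route, it's worth confirming the status codes before using the Remove URL tool. Below is a minimal sketch of that check in Python; the URL is just a placeholder, not anything from your site:

```python
# Minimal sketch: confirm that a page slated for removal actually returns
# 404 or 410 before requesting removal in Google Webmaster Tools.
# The URL below is a placeholder -- swap in your own landing page.
import requests

urls_to_check = [
    "http://www.example.com/orlando-plumbing-contractors/",
]

for url in urls_to_check:
    # GET rather than HEAD, since some servers answer HEAD incorrectly.
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    if status in (404, 410):
        print(f"{url} -> {status} (safe to request removal)")
    elif status in (301, 302):
        print(f"{url} -> {status} redirects to {response.headers.get('Location')}")
    else:
        print(f"{url} -> {status} (still serving content)")
```

A 301 or 302 showing up here would explain why the page keeps getting picked up, which is why the script flags redirects separately.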
Not sure if this relates to what is happening on your site or not - it's pure speculation on my part. Regardless, I hope it helps in some small way.
Cheers,
Cyrus
-
Hi Imedia, thanks for explaining your situation. Hopefully I can be of help:
Inbound link profile
1. In terms of inbound links, my concern would be the type of links that can be 'pulled and re-plugged'. Generally, it is difficult to do this with links that are earned or naturally given, as they usually relate only to one client and the assets and resources that client has (great web design, awesome guides, white papers, blog posts).
I would focus first on ensuring that the backlink profile is natural and clean. Here is a great guide, in my opinion, to removing poor links and disavowing them. Remember to always contact webmasters first and request that irrelevant and low quality links be removed; Google wants to see that you have acted on the links. (If outreach fails and you do end up filing a disavow, see the sketch after this list.)
2. If you want to rank in Google Local: Build local citations. Darren Shaw from Whitespark has plenty of fantastic knowledge published on the Internet and on his blog. Mike Blumenthal is also highly qualified in local.
3. I would recommend staying away from anchor text in press releases altogether. My take on press releases is that the value of the link is directly proportional to the quality and newsworthiness of the story itself. Great stories are shared and linked to, improving the social signals, engagement signals, and inbound links to that piece. Using exact match anchor text in press releases is probably not necessary and might be working against you, because it makes it more difficult to diagnose over-optimization issues if they happen.
Try to ensure any exact match links are in natural, editorial content, so that you can rule out which particular pieces might be counting for or against you when diagnosing over-optimization and position drops.
4. Linking internally with exact match anchors can also do double damage to rankings if particular thresholds are reached. Because Google updates its complex algorithm a lot, it is difficult to know what those thresholds are. You might consider using different internal anchor text in this situation and testing it over a long period. You might also consider changing the press release link if possible. However, it is important to understand that changing links can itself send signals to Google that cause ranking flux (a commercialized anchor has an effect in the SERPs, and changing that anchor is another net change).
A safe bet for internal links is to rarely use exact match internal anchor text, and instead opt for natural text, mixed brand and mixed commercial text, and anchors that aren't commercial at all (click here, buy it here, learn more here). Focus on building internal links from content within the website that is itself well linked to.
In my opinion this should also apply to inbound link anchor text profiles, but you will want to study the top 10-20 competitors in this niche (with a close eye on positions 1-3) in order to better understand both individual and aggregate anchor text usage. The sketches below show one rough way to approach the disavow file and that kind of anchor text tally.
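On point 1, if webmasters ignore your removal requests and you fall back to the disavow tool, the file it accepts is plain text: # comments, individual URLs, and domain: lines that cover every link from a domain. Here is a minimal sketch of generating one in Python; the domains and URLs are made up for illustration:

```python
# Minimal sketch: build a disavow file for Google Webmaster Tools from
# link sources that could not be removed. Domains and URLs are made up.
bad_domains = ["spammy-directory-example.com", "paid-links-example.net"]
bad_urls = ["http://low-quality-example.org/some-page.html"]

lines = ["# Disavow file - generated after removal outreach failed"]
# "domain:" entries disavow every link from that domain;
# bare URLs disavow only that specific page.
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print(f"Wrote {len(lines) - 1} entries to disavow.txt")
```

Keep the outreach records alongside the file so you can show Google you tried removal before disavowing.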
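On point 4 and the competitor research above, here is a rough sketch of tallying anchor text distribution from a backlink export CSV (Open Site Explorer, Ahrefs, or similar). The column name and the bucketing rules are assumptions; adjust them to whatever your export and keyword set actually look like:

```python
# Rough sketch: aggregate anchor text distribution from a backlink export CSV.
# The column name ("anchor_text") is an assumption -- adjust to match your
# tool's export. Buckets anchors into exact match, branded, naked URL, other.
import csv
from collections import Counter

MONEY_TERM = "orlando plumbing contractors"  # placeholder keyword
BRAND = "example brand"                      # placeholder brand name

counts = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = (row.get("anchor_text") or "").strip().lower()
        if not anchor or anchor.startswith("http") or anchor.startswith("www."):
            counts["naked URL / empty"] += 1
        elif anchor == MONEY_TERM:
            counts["exact match"] += 1
        elif BRAND in anchor:
            counts["branded"] += 1
        else:
            counts["other / generic"] += 1

total = sum(counts.values()) or 1  # avoid division by zero on an empty export
for bucket, n in counts.most_common():
    print(f"{bucket}: {n} ({n / total:.1%})")
```

Running the same script against exports for the top-ranking competitors gives you a baseline exact match ratio to compare against before deciding how far to dial yours back.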
Hope this helps!
-Todd