I have lost rankings in Google search over the past two years
-
Hi,
First, please excuse my English. I am from France.
Over the past two years I have gradually lost rankings for my most important keywords in Google search. I have not done any link building for more than two years, and I do not know what to do. I just read your article on link building. (http://moz.com/blog/beginners-guide-to-link-building)
If possible, I would like someone to take a look at my website to see what is wrong and what I have to do. For reasons of discretion I do not want to post the site name here, but you can see the domain name by following this link: http://riador.com/lien.html
Thank you all. -
Hi there,
There certainly looks to be an issue with links. Here are a few resources on cleaning them up:
- http://searchengineland.com/five-steps-to-clean-up-your-links-like-a-techie-166888
- http://moz.com/blog/google-webmaster-tools-just-got-a-lot-more-important-for-link-discovery-and-cleanup
- http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/
- http://savvypanda.com/blog/guide-how-to-use-google-disavow-tool.html
Essentially you need to remove and/or disavow as many bad links as you can.
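If you do go the disavow route, the file you upload is just a plain-text list: lines starting with # are comments, domain: lines disavow every link from that domain, and bare URLs disavow single pages. A minimal sketch (the domains below are made up for illustration):

```text
# Disavow file sketch - example domains only
# Sitewide links from a spammy directory: disavow the whole domain
domain:spammy-directory.example
domain:link-farm.example
# A single bad page rather than the whole site
http://old-blog.example/paid-links-page.html
```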
I'm also going to guess that you could use a design upgrade. I don't have any data to back this up; it's just my opinion from looking at the site. A poor design could also bring Panda into play, and I've seen some sites get a bump after a design upgrade.
-
Thanks for the reply.
I have Webmaster Tools and there are no manual actions against my website.
In Moz Open Site Explorer I only see 1,844 links. LJ Digital, can you tell me which tools you use? You found 62,000 links from only 235 domains.
I forgot to mention that my keywords used to rank on the first page of Google (at the top), but now every keyword is on page 2, 3, or 4.
Thanks
-
LJ Digital is spot on.
First, create a Webmaster Tools account if you don't have one already.
Here is how: https://support.google.com/webmasters/answer/34592?hl=fr
After that, look in Webmaster Tools -> Search Traffic -> Manual Actions and see if anything is listed there.
That would be a good start.
-
Have you ever purchased links, participated in link schemes, or used an SEO company? You have over 62,000 links from only 235 domains; that looks suspect to me, and it probably does to Google too.
If I were you, I'd track down all of the bad links pointing to your website - Moz's Open Site Explorer is a good place to start - and use Google's disavow tool to get rid of them. When you log into Webmaster Tools, do you see any warnings or penalties there?
You may be looking at a few months before your site starts to recover, but these are the steps I'd take. I'd be happy to discuss this in further detail if you'd like to message me or contact me on social media.
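A ratio of 62,000 links from 235 domains (about 264 links per domain) usually means sitewide links, which is exactly the pattern to audit first. One way to spot the worst offenders is to export your backlinks (from Open Site Explorer or Webmaster Tools) and group them by referring domain. A rough sketch, assuming you have a list of linking URLs (the sample URLs here are made up):

```python
# Group a backlink export by referring domain so that domains sending
# thousands of sitewide links stand out for manual review.
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(urls):
    """Count backlinks per referring domain from a list of linking URLs."""
    counts = Counter()
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain:
            counts[domain] += 1
    return counts

def suspicious(counts, threshold=500):
    """Domains sending more links than the threshold; review these first."""
    return [(d, n) for d, n in counts.most_common() if n >= threshold]

# Tiny demo with fabricated data:
sample = ["http://dir.example/page1", "http://dir.example/page2",
          "http://blog.example/post"]
print(links_per_domain(sample))
```

Anything that tops the list and isn't a legitimate sitewide link (like a partner footer you actually want) is a candidate for removal or disavow.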