Google Indexing & Caching Some Other Domain in Place of Mine - Lost Ranking; Sucuri.net Found Nothing
-
I am again facing the same problem, this time with another WordPress blog.
Google has suddenly started caching a different domain in place of mine, and caching my domain in place of that one.
Here is an example page of my site that is wrongly cached on Google; the same thing is happening with many other pages as well - http://goo.gl/57uluq
That duplicate site (protestage.xyz) appears to be a full copy of my client's site. All of its pages now return 404, but Google's cache for them still shows my site's pages.
A site:protestage.xyz search shows only pages from my site, but opening any of those pages returns a 404 error.
My site has been scanned for malware by Sucuri.net Senior Support - they checked all files, the database, etc. - and no malware was found.
As per Sucuri.net Senior Support:
It's a known Google bug. Sometimes they incorrectly identify the original and the duplicate URLs, which results in messed-up rankings and query results.
As you can see, the "protestage.xyz" site was hacked, not yours, and the hackers created "copies" of your pages on that hacked site. This is why they do it: the "copy" (doorway) redirects web searchers to a third-party site - http://www.unmaskparasites.com/security-report/?page=protestage.xyz

It was not the only site they hacked, so they placed many links to that "copy" from other sites. As a result, Google decided that the copy might actually be the original, not the duplicate. So they basically hijacked some of your pages in search results for some queries that don't include your site's domain. Nonetheless, your site still does quite well and outperforms the spammers, for example in this query: https://www.google.com/search?q=%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22

But overall, I think both the Google bug and the spammy duplicates have a negative effect on your site. We see such hacks every now and then (on both sides: the hacked sites and the copied sites), and here's what you can do in this situation. It's not a hack of your site, so you should focus on preventing the copying of your pages:

1. Contact the protestage.xyz site, tell them that their site is hacked, and show them the hacked pages. Hopefully they will clean their site up and your site will have unique content again. Here's their email: flang.juliette@yandex.com

2. You might want to send one more complaint to their hosting provider (OVH.NET) at abuse@ovh.net, explaining that the site they host stole content from your site (show the evidence) and that you suspect the site is hacked.

3. Try blocking the IPs of their hosting provider on your site (real visitors don't use server IPs). This will prevent that site from copying your content if they do it via a script on the same server. I currently see that site using this IP address: 149.202.120.102. I think it would be safe to block anything that begins with 149.202. This .htaccess snippet should help (you might want to test it):

#--------------
Order Deny,Allow
Deny from 149.202.120.102
#--------------

4. Use rel=canonical to tell Google that your pages are the original ones: https://support.google.com/webmasters/answer/139066?hl=en It won't help much if the hackers keep copying your pages, because they usually replace your rel=canonical with their own, so Google can't decide which one is real. But without rel=canonical, hackers have more chances of hijacking your search results, especially if they use rel=canonical and you don't.

I should admit that this process may be quite long. Google will not return your previous ranking overnight, even if you manage to shut down the malicious copies of your pages on the hacked site. Their indexes will still carry some mixed signals (side effects of the black-hat SEO campaign), and it may take weeks before things normalize. The same is true of the opposite situation: the traffic wasn't lost right after the hackers created the duplicates on other sites. The effect built up over time as Google collected more and more signals. Plus, they sometimes run scheduled spam/duplicate cleanups of their index. It's really hard to tell what the last drop was, since we don't have access to Google's internals. In practice, however, if you see a significant change in Google search results, it's usually not because of something you just did; in most cases, it's because of something Google has observed over a period of time.
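For reference, the rel=canonical mentioned in point 4 is a single line in each page's `<head>`; the URL below is a placeholder for the page's own preferred address, and many WordPress SEO plugins can add it for you:

```html
<!-- Hypothetical example: the href is a placeholder for this page's preferred URL -->
<link rel="canonical" href="https://www.example.com/original-page/" />
```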
Kindly help me figure out whether we can actually do anything to get the site indexed properly again. PS: this happened to this site earlier as well; that time, after months of not finding any solution, I had to change the domain to get rid of the problem - and now it has happened again.
Looking forward to a possible solution.
Ankit
-
Hi Ankit,
I already mentioned the files that were changed in our case: header.php, functions.php and some entries in DB.
Here are some links for step-by-step malware removal (don't depend entirely on plugins for malware removal):
http://securepress.org/tutorial-how-to-remove-malware.php
https://www.optimizesmart.com/malware-removal-checklist-for-wordpress-diy-security-guide/
http://hasibul.info/blog/2015/11/25/how-to-remove-malware-from-wordpress-sites/
I hope this helps. If the problem persists, contact me again.
Regards,
Vijay
-
Hello Vijay,
Did you face the exact same issue? Also, is it fixed now? If so, which particular files had issues, and what kind of code was added, so I can look for similar code?
-
Hi Ankit,
We faced a similar problem with one of our clients. It's almost always malware code, hidden in functions.php, header.php, the DB, or cache files, that creates the problem.
Sucuri and most security plugins will report the files as clean, but the code remains hidden somewhere in the website and re-emerges when spiders/crawlers hit the site (and for some users as well).
Please get the website code and DB reviewed properly.
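As a starting point for that review, a quick pattern search sometimes surfaces what automated scanners miss. This is only a minimal sketch, assuming shell access to the server; the patterns and default path are illustrative, and every match needs manual review, since these functions also appear in legitimate code:

```shell
#!/bin/sh
# Hypothetical sketch: search PHP files under a WordPress install for
# obfuscation functions commonly used by injected malware. Matches are
# leads, not proof -- legitimate plugins use these functions too.
WP_ROOT="${1:-.}"   # path to the WordPress root (defaults to cwd)
grep -rnE 'base64_decode|gzinflate|str_rot13|eval[[:space:]]*\(' \
  --include='*.php' "$WP_ROOT" \
  || echo "No suspicious patterns found under $WP_ROOT"
```

Also compare functions.php and header.php against fresh copies of the same theme version, and check the DB (e.g. wp_options) for unfamiliar script tags.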
That should help. I hope this works out for you.
Best Regards,
Vijay
Related Questions
-
Should client sub domains appear in Google or not?
I have a client who has created a number of login pages for their clients (eg. clientA.domain.com, clientB.domain.com). They are all password protected. Just wondering if this has any impact on SEO, good or bad?
Intermediate & Advanced SEO | muzzmoz1
-
Domain Name Change-Negative Ranking Effect?
I am considering redirecting my domain name from www.nyc-officespace-leader.com to www.metro-manhattan.com. My company name is Metro Manhattan Office Space, Inc., so the new domain will be more consistent with our identity. The Metro domain was registered with GoDaddy five years ago but has only been used for email and for forwarding (entering www.metro-manhattan.com forwards visitors to www.nyc-officespace-leader.com). What is the likelihood that redirecting to the metro-manhattan.com domain will result in a drop in traffic and ranking? I asked this question a year ago and the results were mixed, but one year is an eternity for Google. I am hoping that redirects work better now and that, if this is implemented correctly, there will be no ranking/traffic/domain-authority loss. Thoughts?? Thanks, Alan
Intermediate & Advanced SEO | Kingalan10
-
Implications of extending browser caching for Google?
I have been asked to leverage browser caching on a few scripts in our code (the time beside each link is the cache expiration set by its owner):
http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds)
http://www.google.com/jsapi (1 hour)
https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
https://www.google-analytics.com/analytics.js (2 hours)
https://www.youtube.com/iframe_api (expiration not specified)
https://ssl.google-analytics.com/ga.js (2 hours)
I'm being asked to extend the time to 24 hours. Part of this task is to make sure doing this is a good idea; it would not be in our best interest to do something that disrupts data collection. Some of what I'm seeing recommends keeping a local copy, which would mean missing updates from GA/GTM, or calls for creating a cron job to download any updates on a daily basis. Another concern is whether caching these would delay or disrupt data collection - that's an unknown to me, though it may not be to you. There is also the concern that Google recommends not caching these beyond their own settings. Any help on this is much appreciated. Do you see any issues/risks/benefits/etc. to doing this from your perspective?
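For the local-copy option, the cron job mentioned in the question can be as small as a single crontab entry. This is only a sketch: the schedule and destination path are assumptions, and your pages would need to load the script from the local copy instead of Google's URL:

```shell
# Hypothetical crontab entry (paths assumed): refresh the local copy of
# analytics.js every night at 04:00 so upstream updates arrive within a day.
0 4 * * * curl -fsS https://www.google-analytics.com/analytics.js -o /var/www/html/js/analytics.js
```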
Intermediate & Advanced SEO | chrisvogel0
-
Redirecting Domain / Maintaining Keyword Ranking
Right now we have two sites: "our-company.com" and "cool-widgets.com." We rank high for "cool widgets" searches due to our keyword-friendly URL, but we're merging everything into our newly-redesigned company site. Should we redirect the old "cool-widgets.com" homepage to "our-company.com" (to directly transfer the old PR and links), or would it be more prudent to redirect the old homepage to "our-company.com/cool-widgets" to keep the "cool widgets" keyword in the URL? This option seems like it would be good for maintaining organic search results, but it wouldn't pass the strong link backbone to the new site's homepage.
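For what it's worth, either option is a one-rule server-level redirect. A hedged .htaccess sketch for the cool-widgets.com side (mod_rewrite assumed; the path mapping is illustrative and should be tested before going live):

```apache
# Hypothetical .htaccess on cool-widgets.com
RewriteEngine On
# Option A: send everything to the new homepage (strongest link transfer)
# RewriteRule ^(.*)$ https://our-company.com/ [R=301,L]
# Option B: preserve the keyword in the URL path
RewriteRule ^(.*)$ https://our-company.com/cool-widgets/$1 [R=301,L]
```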
Intermediate & Advanced SEO | versare0
-
When does Google index a fetched page?
I have seen Google index one of my pages within 5 minutes of fetching, but have also read that it can take a day. I'm on day #2 and it appears that it has still not re-indexed 15 pages that I fetched. I changed the meta description in all of them, and added content to nearly all of them, but none of those changes show when I do a site:www.site/page search. I'm trying to test changes in this manner, so it is important for me to know WHEN a fetched page has been indexed, or at least IF it has. How can I tell what is going on?
Intermediate & Advanced SEO | friendoffood0
-
Keywords Ranking Varies When Search changes Location/City (Not Google Places)
We have a client that ranks well in most Australian cities for competitive keywords, except in Google Sydney. If you toggle the city in the search field when searching for a keyword, their positions are almost exactly the same everywhere except Sydney, where they can't be found at all in the top 100 results. The keywords are not city-specific; they are commonly searched general keywords about health. This is not a Google Places issue. The search results show the right landing pages of the site for their respective keywords. Any ideas or experience with this kind of situation? Much appreciated. Louie
Intermediate & Advanced SEO | louieramos0
-
Domain change - slow & easy, or rip off the bandaid?
We are laying the foundation for a domain change. I'm gathering all of the requirements listed by Google (301s, signing up the new domain with WMT, etc.), customer communications, email system changes, social updates, etc. But through everything I've read, I'm not quite clear on one thing. We have the option of keeping our current domain and the new domain running off the same eCommerce database at the same time, which means we could run two exact duplicates simultaneously. The thought is that we would slowly, quietly turn on the new domain, start the link-building and link-changing processes, and generally give the new domain time to make sure it's not going to croak for some reason. Then, after a week or so, flip on a full 301 rewrite for the old domain. There are no concerns regarding order databases, as both domains would run off the same system. The only user-experience concern is making sure internal links are all relative, so visitors to the new domain aren't flipped over and freaked out by an absolute URL. I'm not confident that this co-existing strategy is the best approach, though. I'm wondering if it would be better from an SEO (and customer) perspective to: 1. Have the new domain active, performing a 302 redirect from the new domain to the corresponding page on the old domain. 2. When we're ready to flip the switch, implement the 301 redirect from old to new (removing the 302, of course). Any thoughts or suggestions?
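The two-phase plan described in the question might be sketched in .htaccess roughly as follows (domains are placeholders and mod_rewrite is assumed; this is an illustration of the idea, not a tested configuration):

```apache
# Phase 1, on the NEW domain: temporary redirect back to the old site
RewriteEngine On
RewriteRule ^(.*)$ https://www.old-domain.com/$1 [R=302,L]

# Phase 2, on the OLD domain, after removing the rule above:
# permanent redirect to the new site
# RewriteEngine On
# RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```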
Intermediate & Advanced SEO | Goedekers0
-
Could ranking problem be caused by Parked Domain?
I've been investigating a serious Google ranking drop for a small website in the UK. They used to rank in the top 5 for about 10 main keywords, and overnight on 24/3/12 they lost their rankings. They have not ranked in the top 100 since. Their pages are still indexed and they can still be found for their brand/domain name, so they have not been removed completely. I've covered all the normal issues you would expect to look for, and no serious errors exist that would lead to what in effect looks like a penalty. The investigation has led to an issue with their domain registration setup. The whois record (at DomainTools) shows the status as "Registered and Parked or Redirected", which seems a bit unusual. Checking the registration details, they had DNS settings pointing correctly to the web host but also had web forwarding to the domain registrar's standard parked-domain page. The domain registrar has suggested that this duplication could have caused ranking problems. What do you think? Is this a realistic reason for their ranking loss? Thanks
Intermediate & Advanced SEO | bjalc20110