Hello Vijay,
Did you face the exact same issue? Is it fixed now? If so, which particular files had issues and what kind of code was added, so I can look for similar code?
I am facing the same problem again with another WordPress blog.
Google has suddenly started to cache a different domain in place of mine, and to cache my domain in place of that domain.
Here is an example page of my site that is wrongly cached on Google; the same thing is happening with many other pages as well - http://goo.gl/57uluq
That duplicate site (protestage.xyz) appears to be a full copy of my client's site. All of its pages now return 404, but in the Google cache it is showing my site's pages.
A site:protestage.xyz search shows only pages from my site, but when we try to open any of them, they return a 404 error.
My site has been scanned by Sucuri.net Senior Support for malware; they scanned all files, the database, etc., and found none.
As per Sucuri.net Senior Support:
It's a known Google bug. Sometimes they incorrectly identify the original and the duplicate URLs, which results in messed-up rankings and query results.
As you can see, the "protestage.xyz" site was hacked, not yours, and the hackers created "copies" of your pages on that hacked site. This is why they do it: the "copy" (doorway) redirects web searchers to a third-party site.
[http://www.unmaskparasites.com/security-report/?page=protestage.xyz](http://www.unmaskparasites.com/security-report/?page=protestage.xyz)
It was not the only site they hacked, so they placed many links to that "copy" from other sites. As a result, Google decided that the copy might actually be the original, not the duplicate. So they basically hijacked some of your pages in search results for some queries that don't include your site's domain. Nonetheless, your site still does quite well and outperforms the spammers. For example, in this query:
https://www.google.com/search?q=%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22
But overall, I think both the Google bug and the spammy duplicates have a negative effect on your site.
We see such hacks every now and then (on both sides: the hacked sites and the copied sites), and here's what you can do in this situation:
It's not a hack of your site, so you should focus on preventing your pages from being copied:
1. Contact the protestage.xyz site, tell them that their site is hacked, and show them the hacked pages: https://www.google.com/search?q=%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22
Hopefully they will clean their site up and your site will have unique content again. Here's their email:
flang.juliette@yandex.com
2. You might also want to send a complaint to their hosting provider (OVH.NET) at abuse@ovh.net, explaining that the site they host stole content from your site (show the evidence) and that you suspect the site is hacked.
3. Try blocking the IPs of their hosting provider on your site (real visitors don't use server IPs). This will prevent that site from copying your site's content (if they do it via a script on the same server). I currently see that site using the IP address 149.202.120.102. I think it would be safe to block anything that begins with 149.202.
This .htaccess snippet should help (you might want to test it):
#--------------
# Deny the server IP that is copying the content
Order Deny,Allow
Deny from 149.202.120.102
# Optionally block the whole 149.202.x.x range mentioned above
# Deny from 149.202
#--------------
4. Use rel=canonical to tell Google that your pages are the original ones.
[https://support.google.com/webmasters/answer/139066?hl=en](https://support.google.com/webmasters/answer/139066?hl=en)
It won't help much if the hackers keep copying your pages, because they usually replace your rel=canonical with their own, so Google can't decide which one is real. But without rel=canonical, hackers have a better chance of hijacking your search results, especially if they use rel=canonical and you don't.
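For reference, the canonical tag is just one line in a page's head section. The sketch below uses a placeholder URL, not your real markup (Yoast SEO can output this tag for you automatically):
<!-- illustrative only: replace the placeholder with the page's real address -->
<link rel="canonical" href="https://www.example.com/original-post/" />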
I should admit that this process may be quite long. Google will not return your previous ranking overnight even if you manage to shut down the malicious copies of your pages on the hacked site. Their indexes would still have some mixed signals (side effects of the black hat SEO campaign) and it may take weeks before things normalize.
The same is true for the opposite situation. The traffic wasn't lost right after the hackers created the duplicates on other sites. The effect builds up over time as Google collects more and more signals. Plus, they sometimes run scheduled spam/duplicate cleanups of their index. It's really hard to tell what the last straw was, since we don't have access to Google's internals. However, in practice, if you see significant changes in Google search results, it's usually not because of something you just did. In most cases, it's because of something that Google has observed over a period of time.
Kindly help me if we can actually do anything to get the site indexed properly again. P.S. It happened with this site earlier as well, and that time I had to change the domain to get rid of the problem after I could not find any solution for months; now it has happened again.
Looking forward to a possible solution.
Ankit
The .htaccess file has nothing but:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
Installed Plugins
Yoast SEO, Google XML Sitemaps, Akismet, Udinra All Image Sitemap, Social Share Bar (Digg Digg Alternative), Jetpack by WordPress.com, AuthorH Review.
Apart from Yoast, it seems nothing could be blocking the site, and the Yoast settings are fine; I have just disabled indexing of tags and subpages, along with the author archive.
The problem is something else, I guess.
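For context, when tags, subpages, and the author archive are set not to be indexed in Yoast, the plugin simply outputs a robots meta tag on those pages, roughly like this (shown only as an illustration, not copied from my site):
<!-- roughly what a noindexed archive page carries in its head section -->
<meta name="robots" content="noindex, follow" />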
Hey Alan,
Do let me know if you find some solution or identify the problem.
I don't think that disallowing index.php was the issue. I took the suggestion and removed it, but many sites disallow index.php via robots.txt to avoid the duplicate content issue between site.com and site.com/index.php.
Here is an example - http://www.shoutmeloud.com/robots.txt
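The rule in question is just a couple of lines in robots.txt, something like the sketch below (whether to keep such a rule depends on the setup):
# example only: block the /index.php duplicate of the homepage
User-agent: *
Disallow: /index.php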
Still, I did it about 10-12 days ago, fetched and submitted the pages for indexing, and also put in a rendering request.
Attaching a current screenshot of the last rendering request.
I think it is some other issue. What's your view on info:site.com showing some other hacked sites? How is this happening, and why do the sites keep changing? It's different in India and different in the US.
I will try to do so; thanks for your tip about keeping the posts private.
However, with 1,200 posts, it's a big task.
Can anyone recall something similar with positive results?
Yes, almost 90% of the posts are not getting traffic. Some are event posts, so they get some traffic during the event and nothing before or after it.
What's the best way to remove posts: delete them and ask Webmaster Tools to deindex them and remove the cached version of the site, or something else?
Andy, you are asking questions and I am looking for answers.
I am OK with removing all the useless posts, which means almost clearing the entire blog. I am also willing to contribute more to this site, but the thing is, is it worth it? Will Google really start picking up my blog?
What if I remove almost 90% of the posts and just leave the 10% with meaningful content?
Also, should I do some link building, etc.?
Andy, the problem is that most of the articles are short news pieces, and I don't know what can be done about that. Alternatively, I may deindex all those posts; that would be approximately 1,000 posts (almost 85% of the total).
Only a few posts drive traffic; however, I have been posting very little, not even 10 posts in the last year.
As said before, I have not built backlinks for this site at all; all links are natural. I was never hit by Penguin; it was Panda all the time.
Because both of your pages have exactly the same description, they are likely to trigger a duplicate content issue.
It will work against your SEO and may impact your search engine rankings as well. You should write a fresh description for every page rather than just changing the country and keeping the rest of the description the same.
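In practice that means each page gets its own description tag. The wording below is purely illustrative, not taken from your pages:
<!-- illustrative only: US version of the page -->
<meta name="description" content="Custom printing services for US customers, with bulk discounts and fast turnaround.">
<!-- illustrative only: Canada version, rewritten rather than just swapping the country name -->
<meta name="description" content="Design and order custom prints in Canada, with free design proofs and local delivery.">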