Best Google Practice for Hacked Site: Switch Servers/IP or Disavow?
-
Hi -
Over the past few months, I've identified multiple sites which are linking into my site and creating fake pages (below is an example, and there are 500K+ similar links from various sites). I've attempted to contact the hosting companies, etc., with little success. I was wondering what my best course of action might be at this point: A) switch servers (or IP address), B) use the Google Disavow tool, or C) both?
Example: http://aryafar.com/crossings/200-krsn-team-part19.html
Thanks!!
-
A few things... make sure you have a sitemap that is always up to date and submitted to search engines; this will encourage them to view your content first and recognise it as belonging to your domain.
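If you need to roll a sitemap yourself rather than use a plugin, here's a minimal sketch of generating one; the URLs are placeholders, so substitute your own site's pages:

```python
# Sketch: generate a minimal XML sitemap from a list of page URLs.
# The URLs below are placeholders -- substitute your own site's pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
print(sitemap)
```

Save the output as sitemap.xml at the site root and submit it in Search Console.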
In addition to this, put links in your content to other parts of your site. If it gets scraped, it will probably be scraped with those links in it, so anyone actually wanting real content can get through to you.
If there are thousands of links from the same domain coming into your site, disavow the base URL and also report that URL for spam (it's your copyright). In fact, if you notice a small site scraping you, do that after you've tried to contact them.
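For reference, Google's disavow file is plain text with one entry per line; a `domain:` line disavows a whole domain, a bare URL disavows just that page, and `#` lines are comments. A sketch with made-up domains:

```text
# Disavow file sketch -- the domains below are placeholders.
# Scraper network, no response to contact attempts.
domain:spam-scraper-1.example
domain:spam-scraper-2.example
# A single bad URL can also be listed on its own:
http://spam-scraper-3.example/fake-page.html
```

Upload the file through the Disavow Links tool in Search Console for the affected property.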
If this still doesn't stop them, look at your logs, see where their crawlers are coming from, and block their IPs.
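As a starting point for digging through the logs, here's a small sketch (the log lines are made-up samples in Apache/Nginx combined format) that tallies requests per client IP so the heaviest scrapers stand out:

```python
# Sketch: tally request counts per client IP from combined-format log lines.
# The sample lines below are made up; feed in your real access log instead.
from collections import Counter

def top_ips(log_lines, n=3):
    # The client IP is the first whitespace-delimited field in combined logs.
    counts = Counter(line.split(" ", 1)[0] for line in log_lines if line.strip())
    return counts.most_common(n)

sample = [
    '203.0.113.7 - - [10/Mar/2015:06:25:24 +0000] "GET /page HTTP/1.1" 200 512',
    '203.0.113.7 - - [10/Mar/2015:06:25:25 +0000] "GET /page2 HTTP/1.1" 200 743',
    '198.51.100.2 - - [10/Mar/2015:06:26:01 +0000] "GET / HTTP/1.1" 200 1024',
]
print(top_ips(sample))
```

IPs that dominate the tally are candidates for a firewall or .htaccess block.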
On one of my old sites I blocked the whole of China at one point, because it was constantly being barraged by scrapers and people trying to guess account passwords.
Hope that helps
-
OK, so they're scraping much of your site, and then adding in their own garbage, etc.
I wouldn't worry about the occasional instance of this, unless you do see a penalty. For the more egregious ones, where they're building a ton of links, I'd throw their domain in your disavow list.
-
Hi Michael -
Sorry for the confusion... My site is HHisland.com, and sites like the example below are linking in and creating false pages. Most are adult sites, etc.
Thanks again -
Billy
-
Hi Billy,
I'm not sure exactly what's going on here. Is it YOUR site that's getting hacked, or is it other sites getting hacked and linking to you, and you're worried that the "bad neighborhood" links will hurt you?
Michael.
Related Questions
-
Sub domain? Micro site? What's the best solution?
My client currently has two websites to promote their art galleries in different parts of the country. They have bought a new domain (let's call it buyart.com) which they would eventually like to use as an e-commerce platform. They are wondering whether to keep their existing two gallery websites (non-e-commerce) separate, as they always have been, or to somehow combine them into the new domain and have one overarching brand (buyart.com). I've read a bit on subdomains and microsites but am unsure at this stage what the best option would be, and what the pros and cons are. My feeling is to bring it all together under buyart.com so everything is in one place, creating a better user journey for anyone who would like to visit. Thoughts?
Technical SEO | WhitewallGlasgow -
Website blog is hacked. What's the best practice to remove bad URLs?
Hello. Our site was hacked, which created a few thousand spam URLs on our domain. We fixed the issue, and all the spam URLs now return 404. The Google index still shows a couple of thousand bad URLs. My question is: what's the fastest way to remove the URLs from the Google index? I created a sitemap with some of the bad URLs and submitted it to Google; I am hoping Google will crawl them as they are in the sitemap and remove them from the index, since they return 404. Any tools to get a full list of the Google index? (Search Console downloads are limited to 1,000 URLs.) A Moz site crawl gives a larger list, which includes URLs not in the Google index too. I'm looking for a tool that can download results from a site: search. Any way to remove the URLs from the index in bulk? Removing them one by one will take forever. Any help or insight would be very appreciated.
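One way to assemble that removal sitemap: if you can export crawl results as URL/status pairs (most crawlers, Moz included, will give you both), filter for the 404s first. A sketch with made-up data:

```python
# Sketch: given crawl results as (url, http_status) pairs exported from a
# crawler, collect the URLs that now return 404 so they can be fed into a
# removal sitemap. The data below is made up for illustration.
def gone_urls(crawl_results):
    return [url for url, status in crawl_results if status == 404]

results = [
    ("https://www.example.com/real-page", 200),
    ("https://www.example.com/spam-xyz", 404),
    ("https://www.example.com/spam-abc", 404),
]
print(gone_urls(results))
```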
Technical SEO | ajiabs -
Adding /es version to google search console
I have a WordPress site and we are using WPML to make it bilingual. The domain is https://www.designerfreelance.net and the Spanish version is https://www.designerfreelance.net/es. Do I have to add the /es version to Google Search Console? And the non-www versions:

https://www.designerfreelance.net
https://www.designerfreelance.net/es
https://designerfreelance.net
https://designerfreelance.net/es

And do I have to add the non-SSL versions?

http://www.designerfreelance.net
http://www.designerfreelance.net/es
http://designerfreelance.net
http://designerfreelance.net/es

Thanks
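For reference, alongside the Search Console properties, hreflang annotations tell Google which language version of a page to show. A sketch using the URLs from the question (the trailing-slash form is an assumption about how the pages resolve):

```html
<!-- In the <head> of every page, pointing at both language versions. -->
<link rel="alternate" hreflang="en" href="https://www.designerfreelance.net/" />
<link rel="alternate" hreflang="es" href="https://www.designerfreelance.net/es/" />
<link rel="alternate" hreflang="x-default" href="https://www.designerfreelance.net/" />
```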
Technical SEO | Trazo -
Hosting Sites under same IP / subdomain usage?
Hello everyone! The company I am working for plans to sell website templates to clients in the near future. In terms of SEO, would it be detrimental for our clients if we hosted all of these sites under the same server/IP? Also, in the past we've sold sites under a domain we own, adding them on as a subdomain. For example, we would own yourflowers.com, and if Mark's Flowers wanted a site, we would give him marksflowers.yourflowers.com. These sites are going to be in the same niche, as we are industry-specific (for example, we sell website templates specifically designed for flower shops around the United States). I want the best possible SEO experience for our clients, and I believe using subdomains and hosting under the same server IP can be detrimental, but I wanted to see what the Moz community thinks of this. Any feedback is appreciated! Thanks
Technical SEO | KathleenDC -
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards and was disturbed to see that, for one client, the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.

It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub-sitemaps are:

http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml

There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:

"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."

Also, the warning "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" appears for these sitemap URLs:

http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml

I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC Sitemaps section the pages are in fact indexed (as per the main 'Index Status' graph), and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted or removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?

I also see that the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so; fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?

Many thanks, Dan
Technical SEO | Dan-Lawrence -
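If the stale http:// sub-sitemaps are still being fetched, one cleanup option is a server-side 301 from every http URL (sitemaps included) to its https equivalent. A sketch, assuming Apache with mod_rewrite and using the placeholder domain from the question:

```apache
# Force every http:// request (including the old sitemap URLs) to https://.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
```

With that in place, resubmitting only the HTTPS sitemap index should let the http references drop out over time.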
Does my "spam" site affect my other sites on the same IP?
I have a link directory called Liberty Resource Directory. It's the main site on my dedicated IP; all my other sites are addon domains on top of it. While exploring the new Moz spam ranking I saw that LRD (Liberty Resource Directory) has a spam score of 9/17, and that Google penalizes 71% of sites with a similar score. Fair enough: thin content, a bunch of followed links (there are over 2,000 links by now), no problem. That site isn't for Google, it's for me. Question: does that site (and linking to my own sites from it) negatively affect my other sites on the same IP? If so, by how much? Does a simple noindex fix those potential issues? Bonus: how does one go about going through hundreds of pages with thousands of links, built in raw, plain-text HTML, to change things to nofollow? =/
Technical SEO | eglove -
Image Height/Width attributes, how important are they and should a best practice site include this as std
Hi. How important are the image height/width attributes, and would you expect a best-practice site to have them included? I hear not having them can slow down page load time; is that correct? Any other issues from not having them? I know some relate to social sharing (I know Buffer prefers images with height/width attributes when drawing in its selection of image options when you post). Most importantly, though, would you expect them to be intrinsic to sites that have been designed according to best-practice guidelines? Thanks
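For reference, the attributes in question are just explicit dimensions on the img tag (the file name and sizes below are placeholders). Their main benefit is that the browser can reserve layout space before the image downloads, avoiding reflow as images arrive, rather than speeding up the download itself:

```html
<img src="/images/example-photo.jpg" alt="Example photo" width="640" height="480" />
```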
Technical SEO | Dan-Lawrence -
InSite Linking Best Practices
When creating links within your website, is it bad to have an anchor text link pointing back to the same page? Say the homepage is optimized for "credit cards". If I have a "credit cards" anchor text link on the page the link points to, is that bad practice? Secondly, if it's better to put that link on a different page, wouldn't I be placing a keyword that's optimized for a different page on the wrong page? (Hopefully I'm making sense.) Any guidance would be greatly appreciated!
Technical SEO | MichaelWeisbaum