How do I stop someone from hijacking my site?
-
I had lost a lot of search engine rankings and it almost put me out of business. I worked a lot on the site in April, and in May my sales rebounded 37%+, almost back to where they typically are. I had blamed Panda, and after another Panda update in May my traffic is declining again. Today I happened to check which pages were being indexed, and I also noticed that when I published an update this morning it said the page "www.cheaptubes.com/index2.asp" exists on the destination server but not in the current site, and asked if I wanted to delete it. I said no at the time, but I am going to delete it right after posting this question. I went to Google and searched "site:cheaptubes.com". My website has about 50 pages, but Google reports "About 2,160 results". The first four pages of results are mostly my pages, but even on the first page the third result was for Nike shoes, and it isn't my content. Someone has hacked my site and put up thousands of spam pages! That must be the real reason behind my website problems, right? How do I stop this from happening again? Is this having the negative effect that I fear it is?
-
OK, thanks for that, I will try it. I know which pages need the 301 as opposed to the others; all the bad pages have index2.asp in the names. I have changed my FTP and login passwords.
Good to know the 404s don't count against me.
-
I would definitely make sure not to 301 redirect any of the bad pages. You might get penalized for that.
They won't count the 404s against you, but I would change those pages to return a 410 (Gone) instead. Then they will drop out of GWT and the Google index more quickly.
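If it helps, here is a minimal sketch of how that could be configured, assuming your host runs Apache with mod_rewrite and lets you use a .htaccess file (on an IIS/Windows host, which is common for classic .asp sites, the equivalent would be a web.config rewrite rule instead). The index2.asp pattern comes from what you described; the 301 line uses hypothetical paths you would replace with your real ones:

# .htaccess sketch -- serve 410 Gone for every URL containing the spam pattern
RewriteEngine On
RewriteRule index2\.asp - [G,L]

# Example 301 for one of the legitimate moved pages (hypothetical paths)
RewriteRule ^old-page\.asp$ http://www.cheaptubes.com/new-page.asp [R=301,L]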
-
I went to GWT today, and now that I have deleted the 2 malicious pages my site shows 1,950 404 errors, of which only 15 are legitimate pages that I need to 301. I'm not sure how to handle this now. I read that Google doesn't count 404s or 410s against you, but 1,950 pages is a lot and I would like to deal with it. I thought about wiping out the site and uploading a new version of it, but I think it would have the same errors. Any advice? Here is the notice from GWT:
http://www.cheaptubes.com/: Increase in not found errors June 20, 2014
Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site's pages.
Recommended action
- Check the Crawl Errors page in Webmaster Tools.
- Check that pages on your site don't link to non-existent pages. (If another site lists a broken link to your site, Google may list that URL but you may not be able to fix the error.)
-
Thank you for your help
-
Interesting, it sounds like they had a long-term plan. Good thing you found out about it in time.
I think the plan sounds good. I would definitely get off FrontPage; it has been out of development for a while, so there are vulnerabilities that have not been patched in years.
You might look into a CMS like WordPress or concrete5 to make transitioning the site easier for you. Then you would only have to learn minimal HTML/CSS and could focus more on the content.
Good luck to ya.
-
Thank you so much. I changed my main login & FTP passwords today. I found that they had changed the email address on my account to info@chatfieldstatepark.org and the last name on my account to Puraty. I suspect it might be Network Solutions who got hacked, but I don't know for sure. I updated and scanned my computer last night and it is virus/malware free.
I am on shared hosting. I need to get my site updated and off FrontPage, and then off Network Solutions. You helped me back in April when I was trying to get my rankings back, thanks. I have an interim plan:
1- I will use FileZilla for FTP; it shouldn't have the same vulnerabilities as FrontPage since it can be kept updated.
2- My theme designer said the theme doesn't rely on FrontPage extensions. With this in mind I plan to download an HTML editor and hopefully do future editing from there.
I think this is my shortest path to getting off FrontPage. Does that make sense? I do realize I need to get from HTML 2 to HTML5 or something more modern, but I'm trying to handle this while juggling my other responsibilities in my 80-hour work week.
-
Good that the site is clean. What Sucuri and programs like that do is analyze the site (your real site, not the made-up pages) for malicious code, so all of the public-facing files should be intact. Also, GWT is Google Webmaster Tools; if you passed the Sucuri check, you should be fine there.
Since it sounds like you are on shared or managed hosting, I would send a support email to your host and let them know about the issue. They might be able to see where someone got in and when it happened; it is worth a shot at least.
What platform are you running on your site? Is it a CMS or a custom platform?
More than likely, the reason Moz never detected the pages is that it is a crawler. It starts with your home page and follows every link on your site; if the pages were "orphaned," as it sounds like they were, the crawler would never have picked them up.
-
Sucuri says my site is clean.
-
Thank you for your response. I had thought of changing my FTP password, but the hosting company's servers were down for maintenance. They blamed it all on me. I did delete the index2.asp page and now all the links to it are broken. I had deleted pages in April when cleaning up the site, and I think I deleted this one then without knowing it. I do have a restorable version and will pursue this tomorrow as well. I will check about the malware, thanks, as I wouldn't have known where to do that. I have been having email problems for some time. I have Moz scan my site every week (started in April) and it never detected the thousands of extra pages. I suspect I deleted the file in April and they got back in. Maybe I should start selling Nike? : ) Is GWT Google Webmaster Tools?
-
Changing all of your passwords is the first thing you should do. Then you need to examine the server logs to see how they got in. I would check the FTP access logs first; hopefully you have logging turned on. In those logs, search for lines that are not from your IP address. If you are on a static IP and have had the same IP for a while, it should be a lot easier, since you are simply looking for the other IP address. If you cannot find evidence that the server was accessed from another IP through FTP, the next option is to look at the code, because there might be an exploit in your site that allowed it. One thing I would do is look at the files that were added, since they carry a timestamp, and cross-reference that time and day with the FTP log. If there is activity at those times (remember your server might not be set to your time zone), then start looking through the site for a "connector" file. It would have been the first file they created; it is basically a bot file that can create other files on your server. If you can find it, check that timestamp against the log too. A rough sketch of this log and timestamp check is below.
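This is just a hypothetical Python sketch of that check, assuming a plain-text FTP log in which each line contains the client IP; the log file name, site path, and IP address are placeholders you would replace with your own values:

import os
import datetime

MY_IP = "203.0.113.10"        # placeholder: your own (static) IP address
LOG_FILE = "ftp_access.log"   # placeholder: whatever your host names the FTP log
SITE_ROOT = "."               # placeholder: a local copy of the site's files

# 1) Print any log lines that do not mention your IP -- candidate intruder activity
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if MY_IP not in line:
            print("foreign:", line.rstrip())

# 2) List files newest-first so you can cross-reference the injected files'
#    timestamps (e.g. the index2.asp pages) against the FTP log entries
stamped = []
for folder, _subdirs, files in os.walk(SITE_ROOT):
    for name in files:
        path = os.path.join(folder, name)
        mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
        stamped.append((mtime, path))

for mtime, path in sorted(stamped, reverse=True)[:50]:  # 50 most recently changed
    print(mtime.isoformat(sep=" "), path)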
If you have a restorable version of the site, I would consider restoring it. I would also check whether your site is labeled as having malware on it; you can use GWT and Sucuri to do that.
As for the possibility of it hurting your rankings: yes, definitely. I would get the issue cleaned up, see if there is a pattern in the injected file names so that you can serve a 404 for them and block them in robots.txt, and also check GWT to see whether you have a penalty. But I would do all of this ASAP if I were you. A hypothetical robots.txt line is shown below.
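For example, assuming every injected URL starts with /index2.asp as described above, a robots.txt entry could look like the following. This is only a sketch; if you want Google to actually see 404/410 responses for these URLs, you may prefer to leave them crawlable rather than blocking them:

User-agent: *
Disallow: /index2.asp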