How do I stop someone from hijacking my site?
-
I lost a lot of search engine rankings and it almost put me out of business. I worked a lot on the site in April, and in May my sales rebounded 37%+, almost back to where they typically are. I had blamed Panda, and after another Panda update in May my traffic is declining again. Today I happened to check which pages were being indexed, and I also noticed that when I published an update this morning it said the page "www.cheaptubes.com/index2.asp" exists on the destination server but not in the current site, and asked if I wanted to delete it. I said no at the time, but I am going to delete it right after this question. I went to Google and searched "site:cheaptubes.com". My website has about 50 pages, but Google shows "About 2,160 results". The first four pages of results are mostly my pages, but even on the first page, the 3rd result was for Nike shoes and it isn't mine. Someone has hacked my site and put up over 2,100 pages! That must be the real reason behind my website problems, right? How do I stop this from happening again? Is this having the negative effect that I fear it is?
-
OK, thanks for that. I will try it. I can tell the pages that need the 301 apart from the others: all the bad pages have index2.asp in their names. I have changed my FTP and login passwords.
Good to know the 404s don't count against me.
-
I would definitely make sure not to 301 redirect any of the bad pages. You might get penalized for that.
Google won't count the 404s against you, but I would change those pages to return a 410 (Gone) instead. They will then drop out of GWT and the Google index more quickly.
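Since all of the injected URLs share a recognizable pattern (index2.asp, per the reply above), the simplest way to handle nearly two thousand of them is to match by pattern rather than listing each URL individually. A minimal Python sketch; the regex and sample URLs here are illustrative, so adjust them to whatever pattern your bad URLs actually follow:

```python
import re

# Hypothetical pattern: in this thread the injected URLs all contain
# "index2.asp"; change the regex to match your own bad URLs.
BAD_URL_RE = re.compile(r"index2\.asp", re.IGNORECASE)

def should_return_410(url: str) -> bool:
    """Return True if the URL looks like one of the injected spam pages."""
    return bool(BAD_URL_RE.search(url))

# Example: sort a crawl-error export into "serve 410" vs "fix/301" buckets.
urls = [
    "http://www.cheaptubes.com/index2.asp?p=nike-shoes",
    "http://www.cheaptubes.com/carbon-nanotubes.asp",
]
gone = [u for u in urls if should_return_410(u)]
keep = [u for u in urls if not should_return_410(u)]
```

The same predicate can then drive whatever mechanism your server uses to send the 410 status for matching requests.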
-
I went to GWT today, and now that I have deleted the 2 malicious pages, my site has 1,950 404 errors, of which only 15 are legitimate pages that I need to 301. I'm not sure how to handle this now. I read that Google doesn't count 404s or 410s against you, but 1,950 pages is a lot and I would like to deal with it. I thought about wiping out the site and uploading a new version, but I think it would have the same errors. Any advice? Here is the message GWT sent me:
http://www.cheaptubes.com/: Increase in not found errors June 20, 2014
Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site's pages.
Recommended action
- Check the Crawl Errors page in Webmaster Tools.
- Check that pages on your site don't link to non-existent pages. (If another site lists a broken link to your site, Google may list that URL but you may not be able to fix the error.)
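GWT's second recommendation, checking that your pages don't link to non-existent pages, can be automated. A minimal sketch, with the site modeled as an in-memory map from each page to the internal links it contains; all page names are hypothetical, and a real check would fetch live pages over HTTP instead:

```python
# Toy model of a site: each page maps to the internal links found on it.
site = {
    "/": ["/products.asp", "/contact.asp"],
    "/products.asp": ["/", "/order.asp"],  # /order.asp does not exist
    "/contact.asp": ["/"],
}

def find_broken_links(site):
    """Return (page, target) pairs where target is not a known page."""
    known = set(site)
    return [(page, target)
            for page, links in site.items()
            for target in links
            if target not in known]

broken = find_broken_links(site)
```

Running this over a real crawl of your 50 or so pages would surface the 15 legitimate broken links worth 301ing, separate from the injected ones.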
-
Thank you for your help
-
Interesting, it sounds like they had a long-term plan. Good thing you found out about it in time.
I think the plan sounds good. I would definitely get off FrontPage; it has been out of development for a while, so there are vulnerabilities that have not been patched in years.
You might look into a CMS like WordPress or concrete5 to make transitioning the site easier. Then you would only have to learn minimal HTML/CSS and could focus more on the content.
Good luck to ya.
-
Thank you so much. I changed my main login and FTP passwords today. I found that they had changed my email address to info@chatfieldstatepark.org and the last name on my account to Puraty. I suspect it might be Network Solutions that got hacked, but I don't know for sure. I updated and scanned my computer last night and it is virus/malware free.
I am on shared hosting. I need to get my site updated and off FrontPage, and then off Network Solutions. You helped me back in April when I was trying to get my rankings back, thanks. I have an interim plan:
1- I will use FileZilla for FTP; it shouldn't have the same vulnerabilities as FrontPage, since it can be kept up to date.
2- My theme designer said the theme doesn't rely on FrontPage extensions. With that in mind, I plan to download an HTML editor and hopefully do future editing from there.
I think this is my shortest path to getting off FrontPage. Does that make sense? I do realize I need to get from HTML2 to HTML5 or something more modern, but I'm trying to handle this while juggling my other responsibilities in my 80-hour work week.
-
Good that the site is clean. What Sucuri and programs like it do is analyze the site (your real site, not the made-up pages) for malicious code, so all the public-facing files should be intact. Also, GWT is Google Webmaster Tools; if you passed the Sucuri check you should be fine there.
Since it sounds like you are on shared or managed hosting, I would send a support email to your host and let them know about the issue. They might be able to see where someone got in and when it happened; it is worth a shot at least.
What platform is your site running on? Is it a CMS or a custom platform?
More than likely, the reason Moz never detected the pages is that it is a crawler. It starts with your home page and follows every link on your site; if the pages were "orphaned," as it sounds like they were, the crawler would never have picked them up.
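The orphaned-page point can be illustrated with a toy breadth-first crawl: pages that nothing on the real site links to are simply never reached. All page names below are made up for illustration:

```python
from collections import deque

# Toy link graph. The injected pages link only to each other, so a
# crawler starting at the home page never discovers them.
links = {
    "/": ["/about.asp", "/products.asp"],
    "/about.asp": ["/"],
    "/products.asp": ["/"],
    "/index2.asp-nike-1": ["/index2.asp-nike-2"],  # injected, unlinked
    "/index2.asp-nike-2": [],
}

def crawl(start, links):
    """Breadth-first crawl: return every page reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

reachable = crawl("/", links)
orphaned = set(links) - reachable
```

Google still found the injected pages because the hacker presumably linked to them from elsewhere, which a site-scoped crawler like Moz's never sees.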
-
Sucuri says my site is clean.
-
Thank you for your response. I had thought of changing my FTP password, but the hosting company's servers were down for maintenance. They blamed it all on me. I did delete the index2.asp page, so all the links to those pages are now broken. I had deleted pages in April when cleaning up the site, and I think I deleted this one then without knowing it. I do have a restorable version and will pursue that tomorrow as well. I will check about the malware, thanks, as I wouldn't have known where to do that. I have been having email problems for some time. I have had Moz scan my site every week (started in April) and it never detected the thousands of extra pages. I suspect I deleted the file in April and they got back in. Maybe I should start selling Nike? :) Is GWT Google Webmaster Tools?
-
Changing all of your passwords is the first thing you should do. Then you need to examine the server logs to see how they got in. I would check the FTP access logs first (hopefully you have logging turned on) and search them for lines that are not from your IP address. If you are on a static IP and have had the same IP for a while, this should be a lot easier: you will be looking for the other IP address.
If you cannot find that the server was accessed from another IP address through FTP, the next option is to look at the code; there might be an exploit in your site that allowed it. One thing I would do is look at the files that were added via FTP: they will hold a timestamp, and you can try to cross-reference that time and day with the FTP log. If there is activity at those times (remember your server might not be set to your time zone), start looking through the site for a "connector" file. It would have been the first file they created; it is basically a bot file that can create other files on your server. If you can find it, check its timestamp against the log.
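The log check described above can be sketched in a few lines of Python: flag every FTP log line whose client IP is not one of yours. The log format and addresses below are hypothetical; real FTP server logs vary, so adjust the parsing to match your host's format:

```python
# Your own static IP(s) -- a made-up documentation address here.
MY_IPS = {"203.0.113.10"}

# Hypothetical FTP access log: client IP, timestamp, command, path.
log = """\
203.0.113.10 2014-06-18 09:12:01 STOR /products.asp
198.51.100.7 2014-06-19 03:44:52 STOR /index2.asp
198.51.100.7 2014-06-19 03:45:10 STOR /index2a.asp
"""

def suspicious_lines(log, my_ips):
    """Return log lines whose leading field (client IP) is not in my_ips."""
    return [line for line in log.splitlines()
            if line.split()[0] not in my_ips]

for line in suspicious_lines(log, MY_IPS):
    print(line)
```

The timestamps on the flagged lines are what you would then cross-reference against the creation times of the injected files.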
If you have a restorable version of the site, I would consider restoring it. I would also see if your site is labeled as having malware on it; you can use GWT and Sucuri to check that.
As for the possibility of this hurting your rankings: yes, definitely. I would get the issue cleaned up, see if there is a pattern in the bad URLs so you can return a 404 page for all of them, and update the robots.txt file as well. I would also check GWT to see if you have a penalty. I would do all of this ASAP if I were you.