My site got hacked and now I have thousands of 404 pages and backlinks. Should I transfer my site to a new domain name and start again?
-
My site was hacked: thousands of pages that should not exist were created, and thousands of backlinks were pointed at them. Now those same pages and backlinks resolve to 404 pages. Is this why my site crashed out of Google, and why my SEO fixes since then have made no progress on the problem?
-
It's not a big problem, and there is no need to start on a new domain. In fact, this is common with WordPress websites that are not updated regularly.
I would advise the following:
-
Try to identify how the hack took place. If it's WordPress, you can install the free Sucuri Security plugin, which will help identify any compromised files.
-
Clean up the website by deleting all the spam sub-folders and pages.
-
Update your WordPress core, theme, and all plugins (if this is a WordPress website).
-
A quick fix for all the 404s is to return a 410 (Gone) response code for those URLs, which tells search engines the pages have been removed permanently. This can be done through a number of free redirection plugins on WordPress. If it's not a WordPress site, there are a number of other options to get this done too.
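For instance, if you're not on WordPress and have direct control over the server, here is a minimal sketch of serving 410 for the spam paths at the application level with plain Node. The /cheap-pills/ and /spam- patterns are made-up examples standing in for whatever URL patterns the hack actually created; in practice you might do the same thing in your Apache or Nginx configuration instead.

```typescript
// Minimal sketch: return "410 Gone" for hacked/spam URL patterns on a plain
// Node.js server. The patterns below are hypothetical examples; adjust them
// to match the spam URLs the hack actually created on your site.
import { createServer } from "node:http";

const SPAM_PATTERNS = [/^\/cheap-pills\//, /^\/spam-/];

const server = createServer((req, res) => {
  const url = req.url ?? "/";
  if (SPAM_PATTERNS.some((pattern) => pattern.test(url))) {
    // 410 tells search engines the page is gone for good, not just missing.
    res.writeHead(410, { "Content-Type": "text/plain" });
    res.end("Gone");
    return;
  }
  // ...normal handling for legitimate pages would go here...
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("OK");
});

server.listen(8080);
```

Either way, the point is that the spam URLs answer with a 410 rather than a 404 or a redirect.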
Hope that helps.
-
-
Are you unable to retrieve your site from your server? Was it your domain or your site that was hacked?
-
It's not a problem to have 404 pages when they are not linked internally. A high number of spammy pages linking to your site sounds like you may have to disavow them (a sample disavow file is sketched after this answer). If your site has been known to search engines long enough, a new domain with the same content will simply carry the problems over to the new domain (you can arguably see that happen with dejanseo switching, without redirects, to dejanmarketing).
Did Google detect the hack and ask you to fix the problem?
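If disavowing does turn out to be necessary, the file you upload through Google's disavow tool is just a plain UTF-8 text list. A minimal sketch, with made-up domains standing in for the actual spam sources:

```text
# Comments start with a hash.
# Disavow every link from an entire spammy domain:
domain:spam-link-network.example
# Or disavow one specific linking URL:
http://another-spam-site.example/spam-page.html
```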
Related Questions
-
Which new domain extensions matter
Hello, which new domain extensions matter for preserving future rankings? For example: .shop, .inc, .blog, .web, etc.
Industry News | BobGW
-
What's an example of an SEO firm's site that lists their SEO packages excellently
Hello, I'd like to make a packages page on the site I'm still building, bobweikelseo.com. What's an example of a firm or SEO that lists their packages in a way that is appealing, useful, and all around excellent? I'm thinking of having a first package for analysis at $440, a second package for analysis and on-site SEO at $880, and a third package for analysis, on-site SEO, and link building at $2000-$4000/month depending on how fast they want the link building to go. Looking for some input on how to create a packages page, whether my prices are appropriate (I want to be affordable), and an example of a website that does an excellent job of presenting their packages.
Industry News | BobGW
-
We noticed that goods offered in our email newsletters tend to disappear from the first Google search results page!?
We noticed that goods offered in our email newsletters tend to disappear from the first Google search results page. The goods were in the top 5 positions or even higher, but after the email newsletters went out we couldn't find them even in the top 100. We suspect the service provider that sends our email is on a blacklist. Could that be the reason? If yes, how could we check that?
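One rough way to check whether a sending IP is blacklisted, assuming you can find that IP in the headers of one of your newsletter emails, is to query a DNS blacklist (DNSBL) such as Spamhaus ZEN for it. A minimal sketch; the IP shown is a placeholder:

```typescript
// Rough check of whether a sending IP appears on a DNS blacklist (DNSBL).
// zen.spamhaus.org is used as an example list. Note that some DNSBLs refuse
// queries routed through large public resolvers, so results via a shared DNS
// service may be unreliable.
import { promises as dns } from "node:dns";

async function isListed(ip: string, dnsbl = "zen.spamhaus.org"): Promise<boolean> {
  // DNSBLs are queried by reversing the IPv4 octets and appending the list's zone.
  const reversed = ip.split(".").reverse().join(".");
  try {
    await dns.resolve4(`${reversed}.${dnsbl}`); // resolves (to 127.0.0.x) if listed
    return true;
  } catch {
    return false; // NXDOMAIN means the IP is not on this list
  }
}

// Example: replace with the real sending IP taken from your newsletter's headers.
isListed("203.0.113.45").then((listed) =>
  console.log(listed ? "IP is listed on this DNSBL" : "IP is not listed on this DNSBL")
);
```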
Industry News | Patogupirkti
-
SEO Conferences - Which to start with?
My SEO / Internet marketing business (I also have contractors who handle web design and development) is going well and growing, and I am interested in attending one of the many conferences. (SEO is my passion, but I am more of a marketing guy than super technical.) I was hoping for a little bit of advice from somebody who has been to some of them about where would be a good place to start. I am in Wichita, KS, which is in the middle of the US (bring on the yellow brick road and Dorothy jokes), and I don't plan to leave the country for one. PUBCON, SES, SMX, SearchFest, MozCon... Thank you very much for any advice. Super appreciate it! Matthew
Industry News | Mrupp44
-
Google Will Penalize Sites Repeatedly Accused Of Copyright Infringement
Has someone filed a large number of DMCA "takedown" requests against your site? If so, look out. That's the latest penalty that may cause you to rank lower in Google's search results. It joins other penalties such as "Panda" and "Penguin." We're dubbing it the "Emanuel Update" in honor of Hollywood mogul Ari Emanuel, who helped prompt it. Read more here: http://searchengineland.com/dmca-requests-now-used-in-googles-ranking-algorithm-130118 What do you guys think, Mozzers?
Industry News | Chenzo
-
"Links To Your Site" In Webmaster Tools
How often does Google update the "links to your site" data? It seems that it has been static for about a month now, even though we have made a lot of changes. Does anyone have any idea? If you have made changes to your links (i.e., removed links, updated anchor text, etc.), do you have to wait for this information to be updated to measure the impact? Or is it that whenever Google crawls those pages/sites and sees changes, there is an adjustment? Thanks
Industry News | inhouseseo
-
Site search
My Google Site Search is now costing us $750/year. That is an outrageous fee for providing a search feature on our website. Does anyone know of an effective search box we can put on our site that is less expensive than Google's?
Industry News | StreetwiseReports
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. To make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request for an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
See more in the Getting Started Guide. Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects. This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
These are the best resources I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ (step-by-step instructions), http://www.google.com/support/webmasters/bin/answer.py?answer=81766, and http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html, http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content, and http://www.google.com/support/webmasters/bin/answer.py?answer=357690
Industry News | webbroi
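To make Option 2 above concrete, here is a minimal sketch, assuming a plain Node server, of how the crawler's "ugly URL" request (carrying the _escaped_fragment_ parameter) might be detected and answered with an HTML snapshot. The renderSnapshot function is a hypothetical placeholder for however the post-JavaScript HTML actually gets generated (for example, with a headless browser).

```typescript
// Sketch of the AJAX crawling scheme: a pretty URL like
//   http://example.com/page#!key=value
// is requested by the crawler as the ugly URL
//   http://example.com/page?_escaped_fragment_=key=value
// and the server answers that request with an HTML snapshot.
import { createServer } from "node:http";

// Hypothetical placeholder: return the fully rendered HTML for a given state,
// e.g. generated ahead of time or by a headless browser.
async function renderSnapshot(fragment: string): Promise<string> {
  return `<html><body><h1>Snapshot for ${fragment}</h1></body></html>`;
}

const server = createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const fragment = url.searchParams.get("_escaped_fragment_");

  if (fragment !== null) {
    // Crawler asking for the snapshot of the corresponding #! URL.
    const html = await renderSnapshot(fragment);
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(html);
    return;
  }

  // Normal visitors get the regular AJAX-driven page.
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end('<html><body><div id="app">Loading…</div><script src="/app.js"></script></body></html>');
});

server.listen(8080);
```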