URL Index Removal for Hacked Website - Will this help?
-
My main question is: how do we remove URLs from Google's index, along with the thousands of 404 errors they now generate, after a website was hacked (and has since been fixed)?
The story: A customer came to us for a new website and some SEO. Their existing website had been hacked, and their previous vendor was unresponsive about fixing it for months. This created THOUSANDS of URLs on their site that linked out to pornographic and prescription-drug SPAM sites. Google still has 1,205 of those pages indexed, and each one now returns a 404 on the new site. I am confident these URLs are hurting the site's organic rankings.
Additional information:
- Entirely new website
- Wordpress site
- New host
Should we use Google's "Remove URLs" tool to submit all 1,205 of these pages? Do you think it will make a difference? This is down from the 22,500 URLs that existed when we started a few months back. Thank you in advance for any tips or suggestions!
-
Yes.
A disavow file is needed for each site version (http and https).
-
Thanks for clearing this up.
If I have spammy links pointing at the http version, but my site is now on https, should I upload the same disavow list to both the http and https properties? (I saw an answer of yours in another thread saying exactly that, and I think it's important, because many of us are missing this detail.)
-
If they are not yours, it's better to disavow them. If they are spammy, disavow them.
Those links may hurt your rankings.
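For reference, Google's disavow file is just a plain-text list uploaded through Search Console, one entry per line. A minimal sketch with hypothetical spam domains:
# spam domains discovered after the hack (hypothetical examples)
domain:spam-example-one.xyz
domain:spam-example-two.xyz
# individual URLs can also be listed
http://spam-example-three.xyz/bad-page.html
You would upload the same file to both the http and https properties, since Google treats them as separate sites.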
-
Hi Pete, something in your answer got my attention.
About a month ago, I noticed some (as was later proven) spammy links pointing to one specific page of my site. Those links (from 20+ domains) were coming from some German domain names with the .xyz TLD.
The links no longer exist, but those referring pages now return 410 Gone (nginx server).
Is that bad for that specific page of mine?
I had never seen this HTTP status before.
-
If your "bad" link is like http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html then your .htaccess should be:
Redirect 410 /flibzy/foto-bugil-di-kelas.html
That's all. Yes, you should do this for ALL 1,205 URLs. Don't do it on legitimate pages (the ones that existed before the hack), just on the hacked pages. I say "gone" with a 410 redirect, and it's amazing: in your case they're gone for good. Say the time to identify those 1,205 URLs and paste them into .htaccess is X hours, and the time to identify them and submit temporary removals is Y hours. Since a "temporary removal" only lasts up to 30 days, you'd redo that job every month. Over one year that's X hours in the first case and 12*Y hours in the second. You can see the difference, right?
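A side note: if the hacked URLs share common folders, a single pattern-based rule can replace hundreds of individual Redirect lines. A minimal sketch, assuming (hypothetically) that all the spam pages live under /flibzy/:
# send 410 Gone for everything under the hacked folder ("gone" is the status keyword for 410)
RedirectMatch gone ^/flibzy/
Double-check that such a pattern cannot match any legitimate URL before using this shortcut, because every matching page will be dropped from the index.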
Also, Barry Adams released a story today about hacking:
http://www.stateofdigital.com/website-hacked-manual-penalty-google/
and it's amazing: that site was hacked for just 4 hours, but Google noticed. You can see the traffic drop and the removal from the SERPs there. OK, I'm not trying to sell fear, but keeping bad pages as 404s will take a long time to clear. In Jan-Feb 2012 I had a temporary site within a /us/ folder on my own site, and even today, in Jan 2016, I still get bots crawling that folder. That's why I nuked it with 410. That saved the day! In your case it's the same: the bot wastes time and resources crawling the 404 pages over and over while crawling your important pages less. That's why it's good to nuke them, and ONLY them. This saves the bot's crawl budget on your website, so it can focus on your real pages.
-
Hi Peter,
Thank you for your response! I saw you answered a similar question about a week ago, so thank you for weighing in on my options. So, to clarify, I must do this for all 1,205 of the URLs?
One SPAM link is pointing here: http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html so in your above example, this would look like:
Redirect 410 /dir/http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html/ (?) and do this for each page that Google has indexed?
I saw your example with the iPhone in the other post. How did you get that page to say, GONE - The requested resource...
-
The safest option is to leave them as 404, but the fast option is to 410 them.
All you need is to place this near the top of your .htaccess:
Redirect 410 /dir/url1/
Redirect 410 /dir/url2/
Redirect 410 /dir1/url3/
Redirect 410 /dir1/url4/
But this won't help if your URLs have query parameters, like index.php?spamword1-blah-blah. For those you need an extended version like this:
RewriteEngine on
#RewriteBase /
# Each Cond/Rule pair sends 410 Gone when the query string contains the pattern.
# With a 4xx status in [R=...], Apache ignores the substitution (/404.html?) and just returns the status.
RewriteCond %{QUERY_STRING} spamword
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword1
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword2
RewriteRule ^(.*)$ /404.html? [R=410,L]
So why 410? 410 acts much faster than 404, but it's DANGEROUS! If you send a 410 for a normal URL, you effectively nuke it. I found that with 410 the bot visits the URL 1-2-3 times, but with 404 the bot keeps visiting it over and over, eating your crawl budget. URL removal in Search Console is OK and fast, but it only works for 30 days and will eat almost as much time as building the list for the 404/410s. Hint: you can speed up crawling if you do "Fetch and Render" and then "Submit to Index".
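As an aside, mod_rewrite can express the same rules more compactly: the [OR] flag chains several conditions onto one rule, and the [G] flag is shorthand for returning 410 Gone. A minimal sketch using the same hypothetical spamword placeholders:
RewriteEngine on
# Any one of these query-string patterns triggers the rule below ([OR] chains the conditions)
RewriteCond %{QUERY_STRING} spamword [OR]
RewriteCond %{QUERY_STRING} spamword1 [OR]
RewriteCond %{QUERY_STRING} spamword2
# "-" means no substitution is needed; [G] returns 410 Gone and stops processing
RewriteRule ^ - [G]
One rule instead of three keeps the .htaccess shorter and easier to audit as you add patterns.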