URL Index Removal for Hacked Website - Will this help?
-
My main question is: how do we remove URLs from Google's index, along with the thousands of 404 errors they generate, after a website was hacked (and has now been fixed)?
The story: A customer came to us for a new website and some SEO. They had an existing website that had been hacked, and their previous vendor was unresponsive about the issue for months. The hack created THOUSANDS of URLs on their website that linked out to pornographic and prescription-med spam sites. Google still has 1,205 of those pages indexed, and they now return 404 errors on the new site. I am confident these links are preventing the site from ranking well organically.
Additional information:
- Entirely new website
- Wordpress site
- New host
Should we be using the "Remove URLs" tool from Google to submit all 1205 of these pages? Do you think it will make a difference? This is down from the 22,500 URLs that existed when we started a few months back. Thank you in advance for any tips or suggestions!
-
Yes.
You need to submit a disavow file for each version of the site (http and https).
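For reference, the disavow file is just a plain-text list you upload through Search Console's Disavow Tool; a minimal sketch (every domain below is a placeholder):
# spammy domains found after the hack (placeholder names)
domain:spam-example-one.xyz
domain:spam-example-two.xyz
# individual URLs can be disavowed too
http://spam-example-three.com/bad-page.html
Upload the same file once for the http property and once for the https property.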
-
Thanks for clearing this up.
If I have spammy links pointing at the http version, but my site is now https, should I upload the same disavow list to both the http and https properties? (I saw an answer of yours in another thread saying just that, and I think it's important because many of us miss this detail.)
-
If they are not yours, it's better to disavow them. If they are spammy, disavow them.
Those links may hurt your ranking.
-
Hi Pete, something in your answer got my attention.
About a month ago, I noticed some (as was later proven) spammy links pointing to one specific page of my site. Those links (from 20+ domains) were coming from some German domain names on the .xyz TLD.
The links no longer exist, but the referring pages now return 410 Gone (nginx server).
Is that bad for that specific page of mine?
I had never seen this HTTP status before.
-
If your "bad" link is like http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html then your .htaccess should be:
Redirect 410 /flibzy/foto-bugil-di-kelas.html
That's all. Yes, you should do this for ALL 1,205 URLs. Don't do it on legitimate pages (the ones that existed before the hack), just on hacked pages. I say "gone" with a 410 redirect, and it's amazing: in your case, gone for good.
The time to identify those 1,205 URLs and paste them into .htaccess is, say, X hours. The time to identify those 1,205 URLs and temporarily remove them is Y hours. Since a "temporary removal" only lasts up to 30 days, you repeat the same job every month. Over one year that's X hours in the first case and 12×Y hours in the second. You can see the difference, right?
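If building that list by hand is tedious, here is a minimal Python sketch (the file names are hypothetical; it assumes you have exported the hacked URLs one per line, e.g. from Search Console's crawl errors report):
from urllib.parse import urlparse

# hacked-urls.txt: one full URL per line (hypothetical file name)
with open("hacked-urls.txt") as src, open("410-rules.txt", "w") as out:
    for line in src:
        url = line.strip()
        if not url:
            continue
        # Redirect matches on the URL path only, never the full URL
        out.write("Redirect 410 %s\n" % urlparse(url).path)
Then paste the contents of 410-rules.txt near the top of your .htaccess.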
Also, today Barry Adams released a story about hacking:
http://www.stateofdigital.com/website-hacked-manual-penalty-google/
It's striking that the site was hacked for just 4 hours, yet Google noticed; you can see the traffic drop and the removal from the SERPs there. OK, I'm not trying to sell fear, but keeping bad pages as 404s will take a long time to clear. In Jan-Feb 2012 I had a temporary site in a /us/ folder on my own site, and even today, in Jan 2016, I still get bots crawling that folder. That's why I nuked it with 410, and that saved the day! In your case it's the same: the bot wastes time and resources crawling 404 pages over and over while crawling your important pages less. That's why it's good to nuke them, and ONLY them. This saves the bot's crawl budget on your website, so it can focus on your real pages.
-
Hi Peter,
Thank you for your response! I saw you answered a similar question about a week ago, so thank you for weighing in on my options. So, to clarify, must I do this for all 1,205 of the URLs?
One spam link is pointing here: http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html so, following your example above, would it look like this:
Redirect 410 /dir/http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html/ (?) and do this for each page that Google has indexed?
I saw your example with the iPhone in the other post. How did you get that page to say "GONE - The requested resource..."?
-
Leaving them as 404s is fine, but 410 is the fast way.
All you need is to place this near the top of your .htaccess:
Redirect 410 /dir/url1/
Redirect 410 /dir/url2/
Redirect 410 /dir1/url3/
Redirect 410 /dir1/url4/
But this won't help if your URLs carry query parameters, like index.php?spamword1-blah-blah. For that you need an extended version like this:
RewriteEngine on
#RewriteBase /
# each RewriteCond applies only to the RewriteRule directly below it;
# any request whose query string contains the pattern gets a 410
RewriteCond %{QUERY_STRING} spamword
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword1
RewriteRule ^(.*)$ /404.html? [R=410,L]
RewriteCond %{QUERY_STRING} spamword2
RewriteRule ^(.*)$ /404.html? [R=410,L]
So why 410? 410 acts much faster than 404, but it's DANGEROUS: if you send a 410 for a normal URL, you effectively nuke it. I found that with 410 the bot visits the URL 1-3 times, but with 404 it keeps coming back over and over, eating your crawl budget. URL removal in Search Console is OK and fast, but it only works for 30 days and will eat almost the same time as building the list for the 404s/410s. Hint: you can speed up crawling if you do "Fetch and Render" and then "Submit to Index".
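To check that a nuked URL really answers 410, a quick Python sketch (the URL is the placeholder from above; substitute one of your own spam paths):
import urllib.request
import urllib.error

# HEAD request against one of the nuked URLs (placeholder domain)
req = urllib.request.Request("http://OURDOMAIN/flibzy/foto-bugil-di-kelas.html", method="HEAD")
try:
    urllib.request.urlopen(req)
    print("still reachable, rule not matching")
except urllib.error.HTTPError as e:
    print(e.code)  # expect 410 Gone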