Using 410 To Remove URLs Starting With Same Word
-
We had a spam injection a few months ago. We successfully cleaned up the site and resubmitted it to Google. I recently received a notification showing a spike in 404 errors.
All of the URLs injected by the spam share a common word at the beginning:
sitename.com/mono
sitename.com/mono.php?buy-good-essays
sitename.com/mono.php?professional-paper-writer
There are about 100 URLs in total with the same syntax, all containing the word "mono". Based on my research, it seems it would be best to serve a 410. What would the line of .htaccess code be to do that in bulk for any URL that has the word "mono" immediately after sitename.com/?
-
Martijn -
Thanks for your reply. I tried the code you provided, but it still returned a 404 error. I was able to get the following to work properly. Are there any drawbacks to doing it this way?
RewriteRule ^mono(.*)$ - [NC,R=410,L]
The browser now shows the following any time the word "mono" appears immediately after "sitename.com/":
The requested resource /mono.php is no longer available on this server and there is no forwarding address. Please remove all references to this resource.
Additionally, a 410 Gone error was encountered while trying to use an ErrorDocument to handle the request.
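For anyone reading along, here is a minimal sketch of the context this rule needs in the root .htaccess file (assuming Apache with mod_rewrite enabled):

RewriteEngine On

# Return 410 Gone for any path beginning with "mono", case-insensitively.
# In per-directory .htaccess context mod_rewrite tests the path without
# its leading slash, so a pattern written with the slash (e.g. ^/mono)
# never matches here, which would explain the 404 reported above; a
# pattern anchored at ^mono does match.
RewriteRule ^mono(.*)$ - [NC,R=410,L]

The "Additionally, a 410 Gone error was encountered while trying to use an ErrorDocument..." notice usually just means that the error document the server tried to use to render the 410 itself ran into an error; the 410 status is still returned, so Google still sees it.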
-
Thanks for the detailed response. Yes, there are some negative-SEO backlinks to some of the URLs created during the spam injection. I've seen a few backlinks from other forum sites pointing to one of the spam-created URLs, which has hurt our rankings. For example, this URL was created on our site:
sitename.com/mono.php?best-resume-writing-service-for-it-professionals
I was confused by the following in your response: "If you can serve the 410s on a custom 410 page which also gives the Meta no-index directive, that will be a very strong signal to Google indeed that those aren't proper pages or fit for indexation"
- Is that all done via the .htaccess file? Code? Or is the meta no-index directive done in the robots.txt?
- Custom 410 page? I've seen some 404 pages, but not custom 410 pages. Would that be similar to a new 404 page?
Thanks for your response.
-
There are so many ways to deal with this. If these were indeed spam URLs, someone may have attached negative-SEO links to them (to water down your site's ranking power). As such, redirecting these URLs back to their parents could pull spam metrics 'onto' your site, which would be really bad. I can see why you are thinking about using 410 (Gone).
Using canonical tags to stop Google from indexing those bad parameter-based URLs could also be helpful. If you 'canonicalled' those addresses to their non-parameter-based parents, Google would usually drop them from its index and crawl them less often over time. When a URL 'canonicals' to another, different page, it cites itself as non-canonical and thus tends to get de-indexed (usually, since the canonical is a hint rather than a binding directive). Again though, canonical tags interrelate pages. If those spam URLs were backed by negative-SEO attacks, using canonical tags would (again) be highly inadvisable, leaving your 410 suggestion as the better method.
Google listens for wildcard rules in your robots.txt file, though it runs very simplified regex (in fact I think only the "*" wildcard is supported). In your robots.txt you could do something like:
User-agent: *
Disallow: /mono.php?*
That would cull Google's crawling of most of those URLs, but not necessarily the indexation. This would be something to do after Google has swallowed most of the 410s and 'got the message'. You shouldn't start out with this: if Google can't crawl those URLs, it won't see your 410s! Just remember it, so that once the issue is resolved you can put this rule in place and stop the attack from occurring again (or at least have it pre-emptively nullified).
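Since the question's URL list also includes the bare sitename.com/mono, it is worth noting that robots.txt rules are prefix matches, so a rule without the trailing wildcard would cover /mono, /mono.php and all of the parameterised variants in one go (a sketch, and again only something to add at that later stage):

User-agent: *
Disallow: /mono

The trade-off is that this blocks crawling of anything whose path starts with "mono", so it only makes sense if no legitimate URLs begin that way.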
Finally you have Meta "No-Index" tags. They don't stop Google from crawling a URL, but they will remove those URLs from Google's index. If you can serve the 410s on a custom 410 page which also gives the Meta no-index directive, that will be a very strong signal to Google indeed that those aren't proper pages or fit for indexation.
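A minimal sketch of how that can be wired up alongside a rewrite rule like the one shown earlier in the thread (assuming Apache; the /gone.html filename is purely illustrative):

# Keep the RewriteRule producing the 410s, and point Apache at a custom
# page body for those responses. /gone.html is a hypothetical filename;
# the no-index directive sits inside that page's HTML as
# <meta name="robots" content="noindex"> in its <head>, not in robots.txt.
ErrorDocument 410 /gone.html

# Note: the custom page's path must not itself match the ^mono rule,
# otherwise serving it fails and triggers the "Additionally, a 410 Gone
# error..." notice mentioned earlier in the thread.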
So now we have a bit of an action plan:
- 410 the bad URLs alongside a Meta no-index directive served from the same URL
- Once Google has swallowed all that (may be some weeks or just over 1 month), back-plate it with robots.txt wildcards
With regards to your original question (sorry I took so long to get here), I'd use something like:
Redirect 410 /mono\.php\?*
I think .htaccess swallows proper regex. The backslashes say "whatever character follows me, treat that character as a literal value and do not apply its usual regex function"; the backslash is the regex escape character. This would go in the .htaccess file at the root of your site, not in a subdirectory's .htaccess file.
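A caveat on that snippet: mod_alias's Redirect matches a literal URL-path prefix rather than a regex (RedirectMatch is the regex-based variant), and neither directive sees the query string, so a pattern ending in ?* won't catch the parameterised URLs. A mod_rewrite sketch that covers /mono, /mono.php and the ?buy-good-essays style variants in one rule (please sandbox test it, as advised just below) would be:

RewriteEngine On

# The query string is not part of the path that RewriteRule tests, so
# this single pattern catches /mono, /mono.php and /mono.php?anything.
# [G] is shorthand for returning 410 Gone.
RewriteRule ^mono - [NC,G]

This behaves essentially the same as the RewriteRule ^mono(.*)$ - [NC,R=410,L] rule reported as working earlier in the thread.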
Please sandbox test my recommendation first. I'm really more of a technical data analyst than a developer!
This document seems to suggest that a .htaccess file will properly swallow "\" as the escape character:
https://premium.wpmudev.org/forums/topic/htaccess-redirects-with-special-characters
Hope this helps!
-
Hi,
Have you also excluded these pages from the robots.txt file so you can make sure that they're also not being crawled?
The code for the redirect looks something like this:
RewriteEngine on
RewriteRule ^/mono* - [G,NC]
Martijn.