Salvaging links from WMT “Crawl Errors” list?
-
When someone links to your website but makes a typo in the URL, those broken inbound links show up in Google Webmaster Tools in the Crawl Errors section as “Not Found”. Often they are easy to salvage by just adding a 301 redirect in the .htaccess file.
But sometimes the typo is really weird, or the link source looks a little scary, and that's what I need your help with.
First, let's look at the weird typo problem. If it is something easy, like they just lost the last part of the URL (such as www.mydomain.com/pagenam), then I fix it in .htaccess this way:
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$
RewriteRule ^pagenam$ http://www.mydomain.com/pagename.html [R=301,L]
But what about when the last part of the URL is really screwed up? Especially with non-text characters, like these:
www.mydomain.com/pagename1.htmlsale
www.mydomain.com/pagename2.htmlhttp://
www.mydomain.com/pagename3.html"
www.mydomain.com/pagename4.html/
How should the .htaccess RewriteRule be written to send these oddballs to the individual pages they were supposed to reach without the typo?
Second, is there a quick and easy method or tool to tell us if a linking domain is good or spammy? I have incoming broken links from sites like these:
www.webutation.net
titlesaurus.com
www.webstatsdomain.com
www.ericksontribune.com
www.addondashboard.com
search.wiki.gov.cn
www.mixeet.com
dinasdesignsgraphics.com
Your help is greatly appreciated. Thanks!
Greg
-
Hi Gregory -
Yes, as Federico mentions, you do not have to put the RewriteCond before every RewriteRule; since the .htaccess is in your root, it's implied. You might only need the conditions if you are creating redirects such as www to non-www, etc.
Also, Federico is right - this isn't the best way to deal with these links, but I use a different solution. First I get a flat file of my inbound links using other tools as well as WMT, and then I run them through a test to ensure that each linking page still exists.
Then I go through the list and just remove the scraper / stats sites like webstatsdomain, Alexa, etc. so that the list is more manageable. Then I decide which links are OK to keep (there's no real quick way to decide, and everyone has their own method). The only links that are truly "bad" are ones that may violate Google's Webmaster Guidelines.
Your list should be quite small at this point, unless you had a bunch of links to a page whose URL you subsequently moved or changed. In that case, add the rewrite to .htaccess. For the remaining list, you can simply contact the sites, notify them of the broken link, and ask to have it fixed. This is the best-case scenario (instead of having the link go to a 404 or even a 301 redirect). If it's a good link, it's worth the effort.
Hope that helps!
-
Exactly.
Let's do some cleanup.
To redirect everything from domain.com/** to www.domain.com, you need this:
RewriteCond %{HTTP_HOST} !=www.domain.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
That's it for the www and non-www redirection.
Then you only need one line per 301 redirect you want to do, without specifying those RewriteConds you had previously, like this:
RewriteRule ^pagename1\.html(.+)$ pagename1.html [R=301,L]
That will in fact redirect any www/non-www URL like pagename1.htmlhgjdfh to www.domain.com/pagename1.html. The (.+) acts as a wildcard for whatever junk follows .html; requiring at least one extra character keeps the clean pagename1.html URL from redirecting to itself.
You also don't need to type the full domain in the target as you did in your examples. Since the page is on your same domain, you can just type it as: pagename1.html
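To put it all together, here's a minimal sketch of how it could look in a single root .htaccess, reusing the www.mydomain.com hostname and the page names from your examples (the RewriteEngine line is only needed once, at the top, if it isn't already there):
# Turn the rewrite engine on once, at the top of the file
RewriteEngine On
# Send every non-www request to the www hostname
RewriteCond %{HTTP_HOST} !=www.mydomain.com [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
# Then one line per salvaged typo, with no RewriteCond needed
RewriteRule ^pagenam$ pagename.html [R=301,L]
RewriteRule ^pagename1\.html(.+)$ pagename1.html [R=301,L]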
-
Thank you, Federico. I did not know about the ability to use (.+)$ to deal with any junk stuck to the end of .html.
So when you said "the rewrite conds are not needed," do you mean that instead of creating three lines of code for each 301 redirect, like this...
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$
RewriteRule ^pagenam$ http://www.mydomain.com/pagename.html [R=301,L]
...that the first two lines can be removed? So each 301 redirect rule is just one line, like this...
RewriteRule ^pagenam$ http://www.mydomain.com/pagename.html [R=301,L]
...without causing problems whether the visitor comes in on the mydomain.com version or the www.mydomain.com version?
If so, that will sure help decrease the size of the file. But I thought that if we are redirecting everything to the www version, those first two lines were needed.
Thanks again!
-
Well, if you still want to go that way, the RewriteConds there are not needed (given that the .htaccess IS on your domain). A rewrite rule for www.mydomain.com/pagename1.htmlsale would then be:
RewriteRule ^pagename1\.htmlsale$ pagename1.html [R=301,L]
Plus, everything that is pagename1.html*** (such as pagename1.html123, pagename1.html%22, etc.) can be redirected with this rule:
RewriteRule ^pagename1\.html(.+)$ pagename1.html [R=301,L]
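The same pattern should cover the other oddballs from your list as well; a sketch, assuming each junk URL is meant to land on its clean .html counterpart:
# (.+) catches any trailing junk: "sale", "http://", a stray quote, a trailing slash, etc.
RewriteRule ^pagename2\.html(.+)$ pagename2.html [R=301,L]
RewriteRule ^pagename3\.html(.+)$ pagename3.html [R=301,L]
RewriteRule ^pagename4\.html(.+)$ pagename4.html [R=301,L]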
-
Thanks, Federico. I do have a good custom 404 page set up to help those who click a link with a typo.
But I still would like to know how to solve the questions asked above...
-
Although you can redirect any URL to the one you think they meant to link to, you may end up with hundreds of rules in your .htaccess.
I personally wouldn't use this approach. Instead, you can build a really good 404 page that looks at the typed URL and shows a list of the pages the user was probably trying to reach, while still returning a 404, since the typed URL doesn't actually exist.
By using the above method you also avoid worrying about those links, as you mentioned. No link juice is passed, though, but traffic coming from those links will probably still get the content they were looking for, as your 404 page will list the possible URLs they were trying to reach...
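On the Apache side, wiring that up is a single line in .htaccess; this sketch assumes the custom page lives at /404.php (adjust the path to wherever yours actually is). The important detail is to use a local path, because a full http:// URL there makes Apache answer with a 302 redirect instead of keeping the 404 status:
# Serve the custom suggestion page while preserving the 404 status
ErrorDocument 404 /404.php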