Are 301s advisable for low-traffic URLs?
-
We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs. Should we still do 301 redirects for those pages after they are renamed?
In other words, are there serious considerations besides the loss of traffic from direct clicks on those broken URLs?
This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
-
If those pages are indexed by Google and Google returns them in SERPs, then yes, they will 404 once they're renamed. That is why you need to test each page first and set up a 301 header redirect to either the category page or the home page.
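For example, here's a minimal sketch of what those redirects could look like in an Apache .htaccess file (the paths below are hypothetical placeholders, not your actual URLs):

    # Hypothetical paths; adjust to your renamed URLs
    # Send a renamed page to its new URL
    Redirect 301 /branded-term/product-page /generic-term/product-page
    # Or send it to the category page if there's no direct replacement
    Redirect 301 /branded-term/other-page /category-page/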
Hope that earns a "This Answered My Question" : )
-
Great feedback! I still have one remaining question, though, which I've posted below Richard's comments. Thanks!
-
The trademark issue is with the names of the subfolders, not the domain name.
-
So can you just change the links to point at the new URLs? It's still best to redirect them, though.
I'm curious about why you have to change them now, though, as I had assumed you were using a competitor's trademark in a domain before.
-
Thanks for that tool! I was not familiar with it.
-
This almost fully answers my question. Those pages don't have inbound links from other sites; with over 10,000 pages on the site, we can't expect inbound links to all of them. So they aren't worth keeping for traffic or links.
But you say, "I would hope that you capture your 404 errors and 301 redirect all the time anyway." So, my last remaining question is: Am I necessarily creating 404 errors by not redirecting?
Thanks, everyone!
-
Yes, these are just pages on our main site. They will be renamed, and we will be keeping the content on the site.
-
If I'm reading this right, though, it is only the URLs they've got to stop using, not the content. Therefore a 404 page that provides alternate content suggestions isn't necessary in this case; I agree that a 301 redirect is the best solution: it passes the human traffic and the link juice to the correct location.
As to whether it is worth the cost, of course the famous answer is "it depends". However, I'd imagine that the cost of the redirects should be pretty minimal, and if the old URLs drive even just a couple of conversions (whatever that may be for you), it will have been worthwhile, even ignoring the link juice.
-
As Ryan was stating: if those pages have inbound links, test those links for strength, and if they are worth keeping, then 301 them.
Either way, I would hope that you capture your 404 errors and 301 redirect all the time anyway.
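As a rough sketch, each 404ing URL you capture (via server logs or Google Webmaster Tools) can then be mapped in .htaccess; the paths here are made up:

    # Map captured 404s to the closest matching live page
    Redirect 301 /dead-page /closest-live-page
    # A whole family of dead URLs sharing a prefix can be handled with one pattern
    RedirectMatch 301 ^/old-brand-name/(.*)$ /new-name/$1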
-
Sites put up and take down pages all the time. Broken links are of no consequence to the overall site quality.
This is a different discussion altogether, but broken URL situations actually offer an opportunity for a 404 page that offers users alternate content.
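For what it's worth, pointing Apache at a custom 404 page like that takes one line of .htaccess (the filename is just an example):

    # Serve a custom 404 page that suggests alternate content
    ErrorDocument 404 /custom-404.html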
-
Are you linking out to these sites you have to get rid of?
In fact, are they even sites, or just other pages on your main site? I may have misunderstood.
EDIT - I'll go ahead and assume I've just got the wrong end of the stick and it's pages on your site that you need to get rid of.
In that case if you can't redirect them can you change the links to point to different pages or even just remove them?
-
Thanks for this reply, and for the others!
OK, so the fact that your site has broken URLs doesn't drag the site down in the search engine rankings generally? Broken URLs aren't necessarily an indicator of a poor-quality site that would result in some sort of penalty?
-
Redirecting them won't help the main domain rank for these brand terms, but it will capture the type-in traffic and pass most of the link juice coming into these other sites.
Ultimately it shouldn't take your web development company long (unless you have hundreds), and you could maybe even do it at the registrar easily (if not efficiently), so don't pay through the nose for it.
On the other hand, unless you rely on links from those other sites it won't harm your main site in any way by letting them die.
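If they are indeed separate domains, here's a minimal sketch of handling the forwarding at the server instead of the registrar, assuming Apache and using placeholder hostnames:

    # Forward every URL on the old branded domain to the same path on the main site
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old-branded-domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.main-site.com/$1 [R=301,L]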
-
There are two things I would look closely at in such a situation...
Traffic: First, you want to know if these pages are generating any traffic. If they are, you should keep them. If they aren't (which it sounds like they aren't), move on to checking links...
Links: Before you scrap pages generating little inbound traffic, you should check to see if said pages have any inbound links. If they do, you would want to evaluate the quality of those links and determine whether their value is greater or less than the cost of keeping the pages and setting up redirects. If you determine these pages have valuable links, definitely 301 redirect them to a good substitute page.
When I speak of the cost associated with setting up the redirects, I'm talking about the time taken to set them up (likely your time or IT's time).
We use Open Site Explorer to help us audit inbound links to pages.
-
The link doesn't need to be broken. 301 redirect the existing URL to the new one, and anyone linking to, typing in, or clicking the old URL will be forwarded to the new one without ever knowing it. Make sense? Yes, do it!