Are 301s advisable for low-traffic URLs?
-
We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs. Should we still do 301 redirects for those pages after they are renamed?
In other words, are there serious considerations beyond the loss of traffic from direct clicks on those broken URLs that we should take into account?
This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
-
If those pages are indexed by Google and Google returns them in SERPs, then yes, they will 404 once the URLs change. That is why you need to test each page first and set up a 301 header redirect to either the relevant category page or the home page.
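The redirect itself is usually a one-line server directive (on Apache, for example, something like Redirect 301 /old-folder/page /new-folder/page in .htaccess). If you want to test how the old URLs respond before and after the change, a minimal sketch along these lines would do it - the URLs are hypothetical placeholders and it assumes the third-party requests package:

```python
# A minimal check of how the old URLs respond, assuming the third-party
# "requests" package is installed (pip install requests). URLs are hypothetical.
import requests

OLD_URLS = [
    "https://www.example.com/branded-term/widget-page",
    "https://www.example.com/branded-term/another-page",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(f"{url} -> permanent redirect to {resp.headers.get('Location')}")
    elif resp.status_code == 404:
        print(f"{url} -> 404, needs a 301 redirect")
    else:
        print(f"{url} -> {resp.status_code}")
```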
Hope that earns a "This Answered My Question" : )
-
Great feedback! I still just have 1 remaining question, though, which I've posted below Richard's comments. Thanks!
-
The trademark issue is with the names of the subfolders, not the domain name.
-
So can you just change the links to point at the new URLs? It's still best to redirect them, though.
Curious about why you have to change them now, though, as I had just assumed you were using a competitor's trademark in a domain before.
-
Thanks for that tool! I was not familiar with it.
-
This almost fully answers my question. Those pages don't have inbound links from other sites. We have over 10,000 pages on the site, so we can't have links to them all. So, they aren't worth keeping for traffic or links.
But you say, "I would hope that you capture your 404 errors and 301 redirect all the time anyway." So, my last remaining question is: Am I necessarily creating 404 errors by not redirecting?
Thanks, everyone!
-
Yes, these are just pages on our main site. They will be renamed, and we will be keeping the content on the site.
-
If I'm reading this right, though, it is only the URLs they've got to stop using, not the content. Therefore a 404 page that suggests alternate content isn't necessary in this case; I agree that a 301 redirect is the best solution - it passes the human traffic and the link juice to the correct location.
As to whether it is worth the cost, of course the famous answer is "it depends". However, I'd imagine the cost of the redirects should be pretty minimal, and if the old URLs drive even a couple of conversions (whatever a conversion means for you), it will have been worthwhile, even ignoring the link juice.
-
As Ryan was stating: if those pages have inbound links, test those links for strength, and if they are worth keeping, then 301 them.
Either way, I would hope that you capture your 404 errors and 301 redirect all the time anyway.
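For instance, a rough sketch of pulling the most frequently hit 404 URLs out of an Apache/Nginx-style access log might look like this (the log path and format are assumptions on my part):

```python
import collections
import re

# Matches the request path and status code in a common/combined log format line,
# e.g. ... "GET /old-folder/page HTTP/1.1" 404 ...
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

counts = collections.Counter()
with open("access.log") as log:  # assumed log location
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group("status") == "404":
            counts[match.group("path")] += 1

# The most frequently 404ing paths are the first candidates for a 301.
for path, hits in counts.most_common(20):
    print(f"{hits:5d}  {path}")
```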
-
Sites put up and take down pages all the time. Broken links are of no consequence to the overall site quality.
This is a different discussion altogether, but broken-URL situations actually present an opportunity: a 404 page that suggests alternate content to users.
-
Are you linking out to these sites you have to get rid of?
In fact, are they even separate sites, or just other pages on your main site? I may have misunderstood.
EDIT - I'll go ahead and assume I've just got the wrong end of the stick and it's pages on your own site that you need to get rid of.
In that case, if you can't redirect them, can you change the links to point to different pages, or even just remove them?
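If it helps, here's a rough sketch of how you might scan a handful of pages for internal links that still point at the old folder - the page list and folder name below are placeholders, and for a 10,000-page site you'd want a proper crawler rather than a script like this:

```python
import urllib.request
from html.parser import HTMLParser

# Placeholders: the pages to scan and the branded folder that has to go.
PAGES_TO_SCAN = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]
OLD_FOLDER = "/branded-term/"

class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

for page in PAGES_TO_SCAN:
    html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    stale_links = [href for href in collector.hrefs if OLD_FOLDER in href]
    if stale_links:
        print(page)
        for href in stale_links:
            print("  still links to:", href)
```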
-
Thanks for this reply, and for the others!
OK, so the fact that a site has broken URLs doesn't drag the site down in search engine rankings generally? Broken URLs aren't necessarily an indicator of a poor-quality site that would result in some sort of penalty?
-
Redirecting them won't help the main domain rank for these brand terms, but it will capture the type-in traffic and pass most of the link juice coming into these other sites.
Ultimately it shouldn't take your web development company long (unless you have hundreds), and you could maybe even do it at the registrar easily (if not efficiently), so don't pay through the nose for it.
On the other hand, unless you rely on links from those other sites it won't harm your main site in any way by letting them die.
-
There are two things I would look closely at in such a situation...
Traffic: First, you want to know if these pages are generating any traffic. If they are, you should keep them. If they aren't (which it sounds like they aren't), move on to checking links...
Links: Before you scrap pages generating little inbound traffic, check whether those pages have any inbound links. If they do, evaluate the quality of those links and weigh their value against the cost of keeping the pages and setting up redirects. If you determine these pages have valuable links, definitely 301 redirect them to a good substitute page.
When I speak of the cost associated with setting up the redirects, I'm talking about the time taken to set them up (likely your time or IT's time).
We use Open Site Explorer to help us audit inbound links to pages.
-
The link doesn't need to be broken. 301 redirect the existing URL to the new one, and anyone linking to, typing in, or clicking on the old URL will be forwarded to the new one without even knowing it. Make sense? Yes, do it!