Are 301s advisable for low-traffic URLs?
-
We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs. Should we still do 301 redirects for those pages after they are renamed?
In other words, besides any loss in traffic from direct clicks on those broken URLs, are there other serious considerations?
This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
-
If those pages are indexed by Google and Google returns them in SERPs, then yes, they will 404 once the URLs change. That is why you need to test each page first and set up a 301 header redirect to either the category page or the home page.
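If it helps, here's a minimal sketch (the URLs are hypothetical placeholders) of how you could test what the old URLs return; once the redirects are in place, you want to see a 301 with a Location header pointing at the new page rather than a 404:

```python
# Minimal sketch: check what the old branded URLs return.
# The URLs below are hypothetical placeholders for your own pages.
import requests

OLD_URLS = [
    "https://www.example.com/branded-term/page-one",
    "https://www.example.com/branded-term/page-two",
]

for url in OLD_URLS:
    # allow_redirects=False so we see the redirect itself, not its target page
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(none)")
    print(f"{resp.status_code}  {url}  ->  {target}")
```

A 404 in that output is a dead end for users and crawlers; a 301 means they get passed along to the new URL.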
Hope that earns the "This Answered My Question" click : )
-
Great feedback! I still just have 1 remaining question, though, which I've posted below Richard's comments. Thanks!
-
The trademark issue is with the names of the subfolders, not the domain name.
-
So can you just change the links to point to the new URLs? It's still best to redirect them, though.
Also curious why you have to change them now, as I'd assumed you had been using a competitor's trademark in a domain name.
-
Thanks for that tool! I was not familiar with it.
-
This almost fully answers my question. Those pages don't have inbound links from other sites. We have over 10,000 pages on the site, so we can't have links to them all. So, they aren't worth keeping for traffic or links.
But you say, "I would hope that you capture your 404 errors and 301 redirect all the time anyway." So, my last remaining question is: Am I necessarily creating 404 errors by not redirecting?
Thanks, everyone!
-
Yes, these are just pages on our main site. They will be renamed, and we will be keeping the content on the site.
-
If I'm reading this right, though, it is only the URLs they have to stop using, not the content. Therefore a 404 page that offers alternate content suggestions isn't necessary in this case; I agree that a 301 redirect is the best solution, since it passes the human traffic and the link juice to the correct location.
As to whether it is worth the cost, the famous answer is of course "it depends". However, I'd imagine the cost of setting up redirects should be pretty minimal, and if the old URLs drive even a couple of conversions (whatever those may be), it will have been worthwhile, even ignoring the link juice.
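To give a feel for how small the job is, here's a rough sketch (the file name and example paths are made up) of turning a simple old-to-new URL mapping into the kind of one-line Apache redirect rules your web development company would drop into the server config or .htaccess; other servers have equivalents:

```python
# Rough sketch: turn a CSV of old->new paths into Apache "Redirect 301" rules.
# "url_mapping.csv" and the example paths are hypothetical placeholders.
import csv

def build_redirect_rules(mapping_csv: str) -> list[str]:
    rules = []
    with open(mapping_csv, newline="") as f:
        for old_path, new_path in csv.reader(f):
            # e.g. "/branded-term/widgets" -> "/generic-term/widgets"
            rules.append(f"Redirect 301 {old_path} {new_path}")
    return rules

if __name__ == "__main__":
    for rule in build_redirect_rules("url_mapping.csv"):
        print(rule)
```

Each renamed URL becomes one line of config, which is why the bill for this kind of work is usually modest.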
-
As Ryan was saying: if those pages have inbound links, test those links for strength, and if they are worth keeping, then 301 them.
Either way, I would hope that you capture your 404 errors and 301 redirect them as a matter of course anyway.
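For what it's worth, capturing 404s can be as simple as scanning your access logs. Here's a minimal sketch (it assumes a standard combined-format log at a made-up path) that lists the most-requested broken URLs so you know which ones deserve a 301:

```python
# Minimal sketch: find the most frequently requested 404s in an access log.
# "access.log" is a hypothetical path; a standard combined log format is assumed.
from collections import Counter

def count_404s(log_path: str) -> Counter:
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 3:
                continue
            request_line = parts[1].split()   # e.g. ['GET', '/old-url', 'HTTP/1.1']
            status_fields = parts[2].split()  # e.g. ['404', '1234']
            if len(request_line) > 1 and status_fields and status_fields[0] == "404":
                hits[request_line[1]] += 1
    return hits

if __name__ == "__main__":
    for path, count in count_404s("access.log").most_common(20):
        print(f"{count:6d}  {path}")
```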
-
Sites put up and take down pages all the time. Broken links have no bearing on overall site quality.
This is a different discussion altogether, but broken URL situations actually present an opportunity: a 404 page that offers users alternate content.
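As a purely illustrative sketch of that idea (Flask is used here only for brevity, and the paths are hypothetical), a custom 404 handler can suggest the closest-matching live pages instead of a dead end:

```python
# Illustrative sketch: a 404 page that suggests similar live URLs.
# Flask and the example paths are assumptions, not a prescription.
import difflib
from flask import Flask, request

app = Flask(__name__)

KNOWN_PATHS = ["/services/widgets", "/services/gadgets", "/about-us"]  # hypothetical

@app.errorhandler(404)
def not_found(error):
    # Fuzzy-match the requested path against known URLs and offer alternatives
    suggestions = difflib.get_close_matches(request.path, KNOWN_PATHS, n=3, cutoff=0.4)
    links = "".join(f'<li><a href="{p}">{p}</a></li>' for p in suggestions)
    return f"<h1>Page not found</h1><p>Were you looking for:</p><ul>{links}</ul>", 404
```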
-
Are you linking out to these sites you have to get rid of?
In fact, are they even separate sites, or just other pages on your main site? I may have misunderstood.
EDIT: I'll go ahead and assume I've got the wrong end of the stick and that it's pages on your own site you need to get rid of.
In that case, if you can't redirect them, can you change the links to point to different pages, or even just remove them?
-
Thanks for this reply, and for the others!
OK, so the fact that a site has broken URLs doesn't drag the site as a whole down in the search engine rankings? Broken URLs aren't necessarily an indicator of a poor-quality site that would trigger some sort of penalty?
-
Redirecting them won't help the main domain rank for those brand terms, but it will capture the type-in traffic and pass most of the link juice coming into those other sites.
Ultimately it shouldn't take your web development company long (unless you have hundreds of them), and you could perhaps even do it at the registrar easily (if not efficiently), so don't pay through the nose for it.
On the other hand, unless you rely on links from those other sites, letting them die won't harm your main site in any way.
-
There are two things I would look closely at in such a situation...
Traffic: First, you want to know if these pages are generating any traffic. If they are, you should keep them. If they aren't (which it sounds like they aren't), move on to checking links...
Links: Before you scrap pages generating little inbound traffic, you should check whether those pages have any inbound links. If they do, evaluate the quality of those links and decide whether their value outweighs the cost of keeping the pages and setting up redirects. If you determine these pages have valuable links, definitely 301 redirect them to a good substitute page.
When I speak of the cost associated with setting up the redirects, I'm talking about the time it takes (likely your time or IT's time).
We use Open Site Explorer to help us audit inbound links to pages.
-
The link doesn't need to be broken. 301 redirect the existing URL to the new one, and anyone linking to, typing in, or clicking on the old URL will be forwarded to the new one without even knowing it. Make sense? Yes, do it!