Are 301s advisable for low-traffic URLs?
-
We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs. Should we still do 301 redirects for those pages after they are renamed?
In other words, besides any loss of traffic from direct clicks on those broken URLs, are there other serious considerations?
This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
-
If those pages are indexed by Google and Google returns them in SERPs, then yes, they will 404. That is why you need to test each page first and do a 301 header redirect to either the category page or the home page.
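As a rough sketch of that pre-redirect test, you could script a status check with nothing but Python's standard library. The URL list here is purely hypothetical, and the "needs attention" rule (treat 404/410 as candidates for a 301) is just one reasonable policy, not a universal one:

```python
from urllib import request
from urllib.error import HTTPError

def needs_redirect(status):
    """Old URLs already returning 404/410 are the ones leaking visitors
    and need a 301 set up; a 200 means the old page still resolves."""
    return status in (404, 410)

def check(url):
    """Fetch a URL WITHOUT following redirects and return its status code,
    so an existing 301/302 isn't silently masked by the final destination."""
    class NoRedirect(request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # raise instead of following 3xx responses
    opener = request.build_opener(NoRedirect)
    try:
        return opener.open(url).getcode()
    except HTTPError as e:
        return e.code  # 3xx/4xx/5xx land here with the real status

if __name__ == "__main__":
    for url in ["https://example.com/old-branded-term/page"]:  # hypothetical list
        status = check(url)
        print(url, status, "-> redirect needed" if needs_redirect(status) else "-> ok")
```

Running this over the renamed URLs before and after the change gives you a concrete list to hand to the web development company, rather than paying them to audit as well as redirect.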
Hope that earned the "This Answered My Question" : )
-
Great feedback! I still have one remaining question, though, which I've posted below Richard's comments. Thanks!
-
The trademark issue is with the names of the subfolders, not the domain name.
-
So can you just change the links to point at the new URL? It's still best to redirect them, though.
Curious about why you have to change them now, as I had assumed you were using a competitor's trademark in a domain.
-
Thanks for that tool! I was not familiar with it.
-
This almost fully answers my question. Those pages don't have inbound links from other sites. We have over 10,000 pages on the site, so we can't expect links to all of them. So they aren't worth keeping for traffic or links.
But you say, "I would hope that you capture your 404 errors and 301 redirect all the time anyway." So, my last remaining question is: Am I necessarily creating 404 errors by not redirecting?
Thanks, everyone!
-
Yes, these are just pages on our main site. They will be renamed, and we will be keeping the content on the site.
-
If I'm reading this right, though, it is only the URLs they've got to stop using, not the content. Therefore a 404 page that offers alternate content suggestions isn't necessary in this case; I agree that a 301 redirect is the best solution - it passes the human traffic and the link juice to the correct location.
As to whether it is worth the cost, the famous answer is of course "it depends". However, I'd imagine that the cost of redirects should be pretty minimal, and if the old URLs drive just a couple of conversions (whatever that may be) then it will have been worthwhile, even ignoring the link juice.
-
As Ryan was stating: if those pages have inbound links, test those links for strength, and if they are worth keeping, then 301 them.
Either way, I would hope that you capture your 404 errors and 301 redirect all the time anyway.
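To make "capture your 404 errors" concrete, here is a minimal sketch of scanning an access log for the most-hit dead URLs, so the busiest ones get redirects first. It assumes a Common/Combined Log Format (Apache/nginx style) log; the sample lines and paths are made up for illustration:

```python
import re
from collections import Counter

# Matches the request path and status code in a Common Log Format line.
LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def top_404s(log_lines, n=10):
    """Count 404 hits per path so the most-requested dead URLs surface first."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits.most_common(n)

# Hypothetical sample log lines:
sample = [
    '1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /old-brand/widgets HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /old-brand/widgets HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2024:00:00:02 +0000] "GET /products/widgets HTTP/1.1" 200 2048',
]
print(top_404s(sample))  # → [('/old-brand/widgets', 2)]
```

Google Search Console's crawl-error reports give you much the same list without any scripting, but a log scan also catches 404s from human visitors that Googlebot never requests.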
-
Sites put up and take down pages all the time. Broken links are of no consequence to the overall site quality.
This is a different discussion altogether, but broken URL situations actually offer an opportunity for a 404 page that offers users alternate content.
-
Are you linking out to these sites you have to get rid of?
In fact, are they even separate sites, or just other pages on your main site? I may have misunderstood.
EDIT - I'll go ahead and assume I've got the wrong end of the stick and it's pages on your own site that you need to get rid of.
In that case if you can't redirect them can you change the links to point to different pages or even just remove them?
-
Thanks for this reply, and for the others!
OK, so the fact that your site has broken URLs doesn't bring your site in general down in the search engine rankings? Broken URLs aren't necessarily an indicator of a poor quality site that would result in some sort of penalty?
-
Redirecting them won't help the main domain rank for these brand terms, but it will capture the type in traffic and pass most of the link juice coming into these other sites.
Ultimately it shouldn't take your web development company long (unless you have hundreds) and indeed you could maybe even do it at the registrar easily (if not efficiently), so don't pay through the nose for it.
On the other hand, unless you rely on links from those other sites, letting them die won't harm your main site in any way.
-
There are two things I would look closely at in such a situation...
Traffic: First, you want to know if these pages are generating any traffic. If they are, you should keep them. If they aren't (which it sounds like they aren't), move on to checking links...
Links: Before you scrap pages generating little inbound traffic, you should check to see if said pages have any inbound links. If they do, you would want to evaluate the quality of those links and determine whether their value is greater or lesser than the cost of keeping the pages and setting up redirects. If you determine these pages have valuable links, definitely 301 redirect them to a good substitute page.
When I speak of the cost associated with setting up the redirects, I'm talking about the time taken to set them up (likely your time or IT's time).
We use Open Site Explorer to help us audit inbound links to pages.
-
The link doesn't need to be broken. 301 redirect the existing URL to the new one, and anyone linking, typing, or clicking into the old URL will be forwarded to the new one without ever knowing it. Make sense? Yes, do it!
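For a sense of the mechanics being described, here is a toy sketch of a 301 lookup using Python's standard `http.server`. The old/new paths are invented for illustration, and in practice the redirects would normally live in server config (e.g. an Apache `Redirect 301 /old /new` rule in `.htaccess`, or an nginx `return 301`) rather than in application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old-to-new URL map for renamed branded-term pages.
REDIRECTS = {
    "/old-branded-term/widgets": "/products/widgets",
    "/old-branded-term/gadgets": "/products/gadgets",
}

def target_for(path):
    """Return the new location for an old path, or None if no redirect applies."""
    return REDIRECTS.get(path)

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        new = target_for(self.path)
        if new:
            self.send_response(301)            # permanent: browsers and crawlers update
            self.send_header("Location", new)  # where the visitor is forwarded
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

The visitor's browser follows the `Location` header automatically, which is why, as the answer says, they land on the new page without ever knowing the old URL changed.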