Any downside to a whole bunch of 301s?
-
I'm working with a site that has a whole bunch of old, deleted pages that need to be 301'd to new pages.
My main goal is to capture any external links that currently go off to a 404 page and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links or may never have really existed in the first place. These links are a mix of http and https.
Is there any potential downside to just 301ing a list of several hundred possible old urls that currently trigger the 404 page?
Thanks! Best... Mike
-
Hi Michael!
I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960
The video on the blog linked above answers: Is there a limit to how many 301 (Permanent) redirects I can do on a site? How about how many redirects I can chain together?
Other things to watch out for with chained redirects:
- Avoid infinite loops (see the sketch after this list).
- Browsers also have redirect limits, which vary by browser, so chained redirects can affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed.
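To make the loop point concrete, here is a hypothetical sketch of the kind of setup to avoid, assuming an Apache server with mod_alias and made-up URLs:

# These two rules bounce requests back and forth indefinitely;
# browsers and Googlebot give up after a handful of hops.
Redirect 301 /old-page /new-page
Redirect 301 /new-page /old-page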
Hope this helps!
-
Thank you to everyone for chipping in their thoughts on this.
Logan, good article. It gave me a new idea, and I wanted to see what y'all thought.
If my main goal is to not have all these 404s from unpublished pages and to re-direct the incoming link value to pages that could benefit, what would you think of putting up a noindexed page that links to my top pages that I want to give greater authority to? Then, put in a request to de-index those old urls that have the noindexed (duplicate) content. That would mean not firing off a 404, just showing the same content on hundreds of noindexed/deindexed pages. Given your point about re-directs, chained re-directs and speed for mobile, would that do more for me than re-directing all of these old urls to new pages?
Compounding the problem a little, this particular site has a catalog that comes out twice a year, so many product pages are constantly being unpublished. That means even if I re-directed the old unpublished pages to existing urls, some of those target urls might be going away shortly and need yet another re-direct added to the chain.
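For illustration only, here is a rough sketch of how the noindex idea might be wired up on an Apache 2.4+ server with mod_headers; the /discontinued/ path is made up, and a Shopify-hosted site would need its own equivalent:

# Old, unpublished product URLs keep returning 200 with the shared content,
# but carry a noindex header so search engines eventually drop them.
<If "%{REQUEST_URI} =~ m#^/discontinued/#">
    Header set X-Robots-Tag "noindex, follow"
</If>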
Any thoughts on this appreciated. Thanks! Best... Mike
-
301 redirects do have a significant impact on page speed on mobile devices, since those devices are often on much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already indexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are actually being used is to append query parameters to the destination URL (example below), setting the src parameter to the old URL. This gives you unique identifiers you can filter on in the landing pages report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
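For what it's worth, a minimal sketch of that tagging approach in a site-root .htaccess on Apache with mod_rewrite could look like this (the parameter names come from the example above; the page names are placeholders):

RewriteEngine On

# Redirect the old URL and tag the destination so it shows up as a
# distinct landing page in Google Analytics.
RewriteRule ^old-page/?$ /new-page?redir=301&src=/old-page [R=301,L]

You can then filter the landing pages report on redir=301 to see which redirects are actually receiving traffic.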
-
As I understand it, there are two aspects to 301 redirects.
- User experience
- Organic search
Matt Cutts says there is no limit to the number of 301 redirects, unless they are chained together (i.e. start_page > page1 > page2 > proper_page).
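Rather than letting a chain like that build up, each old URL can point straight at the final destination. A hedged sketch, assuming Apache's mod_alias and the hypothetical page names from the chain above:

# Every hop in the old chain points directly at the final page,
# so no request ever passes through more than one redirect.
Redirect 301 /start_page /proper_page
Redirect 301 /page1 /proper_page
Redirect 301 /page2 /proper_page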
I don't expect it will impact site speed much; nothing you couldn't regain with a bit of speed optimisation.
From a user perspective, if you have moved an old page that has high traffic or some good quality links pointing to it, it is very important to ensure that traffic ends up on the right page by using a 301.
From an organic search perspective (especially Google), if you are using a 301, the search engine will eventually update its index to include the new page indicated.
There are two things you should be aware of:
- By using a 301 from an old page, you could resurrect a bad backlink.
- A small amount of link authority is lost (only a very small amount).
-
What happens when you have thousands? Is it sensible to remove 301s from, say, two years ago?
-
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links pointing to 404 pages; I'd focus on those and add others as you see fit based on traffic volume to those old pages. I've never actually tested the threshold at which site speed starts to become a problem, so I see some experimenting in my future!
-
Hi Logan,
Thanks for the insight. Would a few hundred re-directs be a site speed bummer for a Shopify-hosted site? I've worked on other sites that had decent speed and hundreds of re-directs. Firing off a spitstorm of 404s on urls that used to be landing pages for links seems sub-optimal as well.
Best... Mike
-
Hi,
You should keep your 301s to a minimum. Every time a URL is requested, the server checks every single redirect you have to see if there's a match. The larger your redirect list gets, the more impact it'll have on site speed.
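If many of the old URLs follow a predictable pattern, one way to keep that list short is a single pattern rule instead of hundreds of one-to-one lines. A rough sketch, assuming an Apache server with mod_alias and made-up paths:

# Fold an entire retired catalog section into one regex rule.
RedirectMatch 301 ^/catalog-2016/(.*)$ /catalog/$1

# Keep individual rules only for the handful of old URLs that still
# have external links pointing at them.
Redirect 301 /old-landing-page /new-landing-page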
Related Questions
-
URL Migration: Better to have 301s processed or 200s?
I'm migrating sub-domains to sub-folders, but this question is likely applicable to most URL migrations. For example: subdomain1.example.com to example.com/subdomain1 and any child pages. Bear with me, as it may just be me, but I'm having trouble understanding whether internal links (menu, contextual etc. and potentially the sitemaps) should be kept as the pre-migration URL (with a 301 in place to the new URL) to give Google a chance to process the redirects, or if they should be updated straight away to the new URL to provide a 200 response, as so many guides suggest. The reason I ask is that unless Google specifically visits the old URL from their index (and therefore processes the 301), it's likely to be found by following internal links on the website or similar, which, if they're updated to reflect the new URL, will return a 200. I would imagine that this would be treated as a new page, which is concerning as it would have a canonical pointing toward itself and the same content as the pre-migrated URL. Is this a problem? Do we need to allow proper processing of redirects for migrations, or is Google smarter than this and can work it out if they visit the old URL at a later date and put two and two together? What happens in between? I haven't seen any migration guides suggest leaving 301s in place rather than amending links to return a 200 as soon as possible in all instances. One thought: I guess there's also the Fetch as Google tool within Search Console, which could be used with the old URLs - could this be relied on? Apologies if this topic has been covered before, but it's quite difficult to search for without returning generic topics around 301 redirects. Hope it makes sense - appreciate any responses!
Intermediate & Advanced SEO | AmyCatlow
-
Moving to https with a bunch of redirects my programmer can't handle
Hi Mozzers, I referred a client of mine (last time) to a programmer who can transition their site from http to https. They use a wordpress website and currently use EPS Redirects, a plugin that 301 redirects about 400 pages. Currently, the way EPS Redirects is set up (as shown in the attachment) is simple: on the left side you enter your old url, and on the right side is the newly 301'd url. But here's the issue: since my client made the transition to https, the whole wordpress backend is set up that way as well. What this means is, if my client finds another old http url that he wants to redirect, this plugin only allows them to redirect https to https. As of now, all old http to https redirects STILL work even though the left side of the plugin switched all urls to a default HTTPS. But my client is worried that with the next plugin update he will lose all http to https redirects. When we asked our programmer to add all 400 redirects to .htaccess, he said that's too many redirects and could slow down the website. Well, we don't want to lose all 400 301's and jeopardize our SEO. Question: what does everyone suggest as an alternative solution/plugin to redirect old http urls to https and future https to https urls? Thank you all!
Intermediate & Advanced SEO | Shawn124
-
Should I Keep adding 301s or use a noindex,follow/canonical or a 404 in this situation?
Hi Mozzers, I feel I am facing a double-edged sword situation. I am in the process of migrating 4 domains into one, and I am in the process of creating the URL redirect mapping. The pages I am having the most issues with are the event pages that are past due but carry some value, as they generally have one external followed link:
www.example.com/event-2008 301 redirect to www.newdomain.com/event-2016
www.example.com/event-2007 301 redirect to www.newdomain.com/event-2016
www.example.com/event-2006 301 redirect to www.newdomain.com/event-2016
Again, these old events aren't necessarily important in terms of link equity, but they do carry some, and at the same time, keeping multiple 301s pointing to the same page may not be a good idea as it will increase page load time, which will affect the new site's performance. If I add a 404, I will lose the bit of equity in those. Noindex,follow may work since it won't index the old domain or the page itself, but I'm still not 100% sure about it. I am not sure how a canonical would work since it would keep the old domain live. At this point I am not sure which direction I should follow. Thanks for your answers!
Intermediate & Advanced SEO | Ideas-Money-Art
-
Downsides of Squarespace v WooCommerce
Hello Mozzers! I'm thinking of using either Squarespace or WooCommerce for an ecommerce website. WooCommerce is great - I've used it before... how flexible is Squarespace from an SEO perspective, compared to WooCommerce? Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Any solutions for implementing 301s instead of 302 redirects in SharePoint 2010?
We have an issue with Google indexing multiples of each page in our sitemap (www.upmc.com). We've tried using rel=canonical, but it appears that GoogleBot is not honoring our canonicals. Specifically, any of the pages Google indexes that end without a file extension, such as .aspx, are 302 redirected to a .aspx page. Example: the following pages all respond as 302 redirects to http://www.upmc.com/services/pages/default.aspx:
http://www.upmc.com/services/
http://www.upmc.com/services
http://www.upmc.com/Services/
http://www.upmc.com/Services
Has anyone been able to correct this inherent issue with SharePoint so that the redirects are at least 301s?
Intermediate & Advanced SEO | Jessdyl
-
How to redirect whole site to home page without breaking wordpress
Hi all, I had a phpprobid site which was heavily indexed but got hacked. I have deleted the old site and installed wordpress and a holding page. I can't work out how to 301 redirect all the old indexed pages to the home page without the existing wordpress redirect. Anyone care to help?
Intermediate & Advanced SEO | RaceMedia
-
Whole site blocked by robots in webmaster tools
My URL is: www.wheretobuybeauty.com.au. This new site has been re-crawled over the last 2 weeks, and in webmaster tools the index status shows: Indexed 50,000 pages; blocked by robots 69,000. A search query 'site:wheretobuybeauty.com.au' returns 55,000 pages. However, all pages in the site do appear to be blocked, and over the 2 weeks the google search query site traffic declined from significant to zero (proving this is in fact the case). This is a Linux php site and has the following: 55,000 URLs in sitemap.xml submitted successfully to webmaster tools; a robots.txt file that existed but did not have any entries to allow or disallow URLs - today I have removed the robots.txt file completely; URL re-direction within the Linux .htaccess file - there are many rows within this complex set of re-directions. The developer has double checked this file and found that it is valid. I have read everything that google and other sources have on this topic and this does not help. Also checked webmaster crawl errors, crawl stats, malware and there is no problem there related to this issue. Is this a duplicate content issue? This is a price comparison site where approx half the products have duplicate product descriptions - duplicated because they are obtained from the suppliers through an XML data file. The suppliers have the descriptions from the files on their own sites. Help!!
Intermediate & Advanced SEO | rrogers
-
What is the downside with having too long of a title tag?
With Google placing so much relevance on title tags, it seems to help if you mention local cities within your title tag. I'm wondering if the positive of having more keywords in a title tag outweighs the negative of having too long of one?
Intermediate & Advanced SEO | TLSNET