Any downside to a whole bunch of 301s?
-
I'm working with a site where a whole bunch of old, deleted pages need to be 301'd to new pages.
My main goal is to capture any external links that currently go off to a 404 page and to clean up the index. In the process, I may end up 301ing pages that didn't have incoming links, or that may never have really existed in the first place. These links are a mix of http and https.
Is there any potential downside to just 301ing a list of several hundred possible old URLs that currently trigger the 404 page?
Thanks! Best... Mike
-
Hi Michael!
I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960
The video in that post answers: Is there a limit to how many 301 (permanent) redirects I can do on a site? And how many redirects can I chain together?
Other things to watch out for with chained redirects:
- Avoid infinite loops (a quick way to check for loops and long chains is sketched below).
- Browsers also have redirect limits, and these limits can vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed.
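If you want to see how deep any given chain actually runs, here's a rough sketch (assuming Python and the `requests` library, neither of which is part of the answer above) that follows redirects one hop at a time and flags loops or overly long chains. The 10-hop ceiling is an arbitrary assumption, not a documented limit.

```python
import requests
from requests.compat import urljoin

def trace_redirects(url, max_hops=10):
    """Follow a redirect chain hop by hop, flagging loops and long chains."""
    seen = set()
    hops = []
    while len(hops) < max_hops:
        if url in seen:
            raise RuntimeError(f"Redirect loop detected at {url}")
        seen.add(url)
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            return hops  # reached a page that no longer redirects
        next_url = urljoin(url, resp.headers["Location"])
        hops.append((resp.status_code, url, next_url))
        url = next_url
    raise RuntimeError(f"More than {max_hops} chained redirects from the start URL")
```

Calling trace_redirects("https://www.example.com/old-page") returns the list of hops, so a chain like old > interim > final shows up as two entries.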
Hope this helps!
-
Thank you to everyone for chipping in their thoughts on this.
Logan, good article. It gave me a new idea, and I wanted to see what y'all thought.
If my main goal is to get rid of all these 404s from unpublished pages and to re-direct the incoming link value to pages that could benefit, what would you think of putting up a noindexed page that links to the top pages I want to give greater authority to? Then, put in a request to de-index those old URLs that carry the noindexed (duplicate) content. That would mean not firing off a 404, just showing the same content on hundreds of noindexed/deindexed pages. Given your point about re-directs, chained re-directs and speed for mobile, would that do more for me than re-directing all of these old URLs to new pages?
Compounding the problem a little, this particular site has a catalog that comes out twice a year where many product pages are constantly being unpublished. So, even if I re-directed the old unpublished pages to existing urls, some of those might be going away and need another re-direct to add to the chain shortly.
Any thoughts on this appreciated. Thanks! Best... Mike
-
301 redirects do have a significant impact on page speed on mobile devices, since those devices are often on much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already indexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are actually being used is to append query parameters to the resolving URL (example below), with the src parameter set to the referring URL. This gives you unique identifiers you can filter on in the landing-page report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
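There's no server-side code in this thread, so purely to illustrate the pattern, here's a minimal sketch of how such a tagged 301 could be issued from application code. It assumes Python/Flask and a hypothetical OLD_TO_NEW mapping; Shopify and most platforms have their own redirect settings, so treat this as a sketch of the idea rather than the implementation.

```python
from urllib.parse import quote

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of retired URLs to their replacements.
OLD_TO_NEW = {
    "/old-page": "/new-page",
    "/discontinued-product": "/current-category",
}

# Catch-all for legacy URLs; in a real app this would sit behind the
# routes that serve live pages.
@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    old_url = "/" + old_path
    new_url = OLD_TO_NEW.get(old_url)
    if new_url is None:
        return "Not found", 404  # unknown URL: let it 404 as before
    # Tag the destination so redirect traffic is identifiable in the
    # Google Analytics landing-page report.
    return redirect(f"{new_url}?redir=301&src={quote(old_url, safe='')}", code=301)
```

The point is just that the 301's destination carries the redir/src parameters, so those sessions can be isolated in Analytics.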
-
As I understand it, there are two aspects to 301 redirects:
- User experience
- Organic search
Matt Cutts says there is no limit to the number of 301 redirects, unless they are chained together (i.e. start_page > page1 > page2 > proper_page).
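If you do find chains like that, one way to deal with them before uploading your redirect list is to flatten the map so every old URL 301s straight to its final destination. This is a plain-Python sketch and assumes the redirects live in a simple old-to-new mapping:

```python
def flatten_redirects(mapping):
    """Point every old URL straight at its final destination."""
    flattened = {}
    for source in mapping:
        target = mapping[source]
        seen = {source}
        # Keep following while the target is itself redirected somewhere.
        while target in mapping:
            if target in seen:
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = mapping[target]
        flattened[source] = target
    return flattened

chain = {"/start_page": "/page1", "/page1": "/page2", "/page2": "/proper_page"}
print(flatten_redirects(chain))
# {'/start_page': '/proper_page', '/page1': '/proper_page', '/page2': '/proper_page'}
```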
I don't expect it will impact site speed much; nothing you couldn't regain with a bit of speed optimisation.
From a user perspective, if you have moved an old page that had high traffic or some good-quality links pointing to it, it is very important to ensure that traffic ends up on the right page via a 301.
From an organic search perspective (especially Google), if you are using a 301, the search engine will eventually update its index to include the new page indicated.
There are two things you should be aware of:
- By using a 301 from an old page, you could resurrect a bad backlink
- A small amount of link authority is lost (only a very small amount)
-
What happens when you have thousands? Is it sensible to remove 301s from, say, two years ago?
-
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links pointing to 404 pages; I'd focus on those and add others as you see fit based on traffic volume to those old pages. I've never actually tested the threshold at which site speed starts to become a problem - I see some experimenting in my future!
-
Hi Logan,
Thanks for the insight. Would a few hundred re-directs be a site speed bummer for a Shopify-hosted site? I've worked on other sites that had decent speed and hundreds of re-directs. Firing off a spitstorm of 404s on URLs that used to be landing pages for links seems sub-optimal as well.
Best... Mike
-
Hi,
You should keep your 301s to a minimum. Every time a URL is requested, the server checks every single redirect you have to see if there's a match. The larger your redirect list gets, the more impact it'll have on site speed.
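How much that per-request check costs depends entirely on the platform (Shopify, Apache, nginx, or an app framework all handle it differently), but as a purely illustrative Python sketch of why a long rule list scales worse than a keyed lookup:

```python
# Illustrative only: 500 made-up redirect rules.
redirect_rules = [(f"/old-{i}", f"/new-{i}") for i in range(500)]
redirect_map = dict(redirect_rules)

def resolve_by_scanning(path):
    # Walks every rule until one matches - roughly how a long list of
    # individually evaluated rewrite rules behaves on each request.
    for old, new in redirect_rules:
        if path == old:
            return new
    return None

def resolve_by_lookup(path):
    # A single keyed lookup whose cost doesn't grow with the list size.
    return redirect_map.get(path)
```

In practice that's an argument for keeping large redirect sets in something map-like, or trimming the list to the old URLs that actually earn traffic and links, rather than maintaining hundreds of individually evaluated rules.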