Any downside to a whole bunch of 301s?
-
I'm working with a site that has a whole bunch of old, deleted pages that need to be 301'd to new pages.
My main goal is to capture any external links that currently go off to a 404 page and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links or may never have really existed in the first place. These links are a mix of http and https.
Is there any potential downside to just 301ing a list of several hundred possible old URLs that currently trigger the 404 page?
Thanks! Best... Mike
-
Hi Michael!
I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960
The video on the blog linked above answers: Is there a limit to how many 301 (Permanent) redirects I can do on a site? How about how many redirects I can chain together?
Other things to watch out for with chained redirects:
- Avoid infinite loops.
- Browsers may also have redirect limits, and these limits can vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed (see the sketch below for one way to collapse a chain).
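To make the chain advice concrete, here's a minimal sketch (in Python, with hypothetical paths) of how you might flatten a redirect map so every old URL points straight at its final destination, and loops get caught instead of spinning forever:

```python
# Minimal sketch: flatten chained redirects so every old URL points
# directly at its final destination (paths here are hypothetical).

def flatten_redirects(redirects):
    """Resolve chains like a -> b -> c into a -> c and b -> c.

    `redirects` maps old paths to new paths. Raises ValueError
    on an infinite loop (a -> b -> a).
    """
    flattened = {}
    for start in redirects:
        seen = {start}
        target = redirects[start]
        # Follow the chain until we reach a path with no further redirect.
        while target in redirects:
            if target in seen:
                raise ValueError(f"Redirect loop detected at {target}")
            seen.add(target)
            target = redirects[target]
        flattened[start] = target
    return flattened

if __name__ == "__main__":
    chain = {"/start_page": "/page1", "/page1": "/page2", "/page2": "/proper_page"}
    print(flatten_redirects(chain))
    # {'/start_page': '/proper_page', '/page1': '/proper_page', '/page2': '/proper_page'}
```

Serving the flattened map means users and Googlebot make one hop instead of several.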
Hope this helps!
-
Thank you to everyone for chipping in their thoughts on this.
Logan, good article. It gave me a new idea, and I wanted to see what y'all thought.
If my main goal is to not have all these 404s from unpublished pages and to redirect the incoming link value to pages that could benefit, what would you think of putting up a noindexed page that links to my top pages that I want to give greater authority to? Then, put in a request to de-index those old URLs that have the noindexed (duplicate) content. That would mean not firing off a 404, just showing the same content on hundreds of noindexed/de-indexed pages. Given your point about redirects, chained redirects, and speed for mobile, would that do more for me than redirecting all of these old URLs to new pages?
Compounding the problem a little, this particular site has a catalog that comes out twice a year, so many product pages are constantly being unpublished. So even if I redirected the old unpublished pages to existing URLs, some of those might be going away and need another redirect added to the chain shortly.
Any thoughts on this appreciated. Thanks! Best... Mike
-
301 redirects do have a significant impact on page speed for mobile devices, since those devices are often on much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already indexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are being used is to append query parameters to the end of the resolving URL (example below), where you set the src parameter to the referring URL. This gives you some unique identifiers to apply filters to in your landing page report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
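If you're generating those tagged target URLs for a few hundred redirects, a minimal sketch (in Python, reusing the redir and src parameter names from the example above) might look like this; note that urlencode percent-encodes the slash in the src path:

```python
# Minimal sketch: build a tagged destination URL for each redirect so the
# redir/src parameters (names from the example above) can be filtered on
# in the Google Analytics landing page report.

from urllib.parse import urlencode

def tagged_destination(old_path, new_path):
    """Return new_path with parameters identifying the redirected old URL."""
    params = urlencode({"redir": "301", "src": old_path})
    separator = "&" if "?" in new_path else "?"
    return f"{new_path}{separator}{params}"

print(tagged_destination("/old-page", "/new-page"))
# -> /new-page?redir=301&src=%2Fold-page  (the slash is percent-encoded)
```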
-
As I understand it, there are two aspects to 301 redirects:
- User experience
- Organic search
Matt Cutts says there is no limit to the number of 301 redirects, unless they are chained together (i.e. start_page > page1 > page2 > proper_page).
I don't expect it will impact site speed much - nothing you couldn't regain with a bit of speed optimisation.
From a user perspective, if you have moved an old page that has high traffic or some good quality links on it, it is very important to ensure that traffic ends up back on the right page using a 301.
From an organic search perspective (especially Google), if you are using a 301, Google will eventually update its own index to include the new page indicated.
There are two things you should be aware of:
- By using a 301 from an old page, you could resurrect a bad backlink
- A small amount of link authority is lost (only a very small amount)
-
What happens when you have thousands? Is it sensible to remove 301s from, say, two years ago?
-
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links to 404 pages; I'd focus on those and add others as you see fit based on traffic volume to those old pages. I've never actually tested the threshold at which site speed starts to become a problem - I see some experimenting in my future!
-
Hi Logan,
Thanks for the insight. Would a few hundred redirects be a site speed bummer for a Shopify-hosted site? I've worked on other sites that had decent speed and hundreds of redirects. Firing off a spitstorm of 404s on URLs that used to be landing pages for links seems sub-optimal as well.
Best... Mike
-
Hi,
You should keep your 301s to a minimum. On many setups (for example, rules listed one after another in an .htaccess file), every time a URL is requested the server checks each redirect rule in order until it finds a match, so the larger your redirect list gets, the more impact it'll have on response time.
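To put a number on that, here's a minimal sketch (in Python, with hypothetical URLs) contrasting a rule-by-rule scan with a hash-map lookup; the caveat is that actual behaviour depends on your server - platforms that store redirects in an indexed map don't pay the per-rule cost:

```python
# Minimal sketch of why a long flat redirect list can slow requests:
# scanning rules one by one is O(n) per request, a dict lookup is O(1).

import time

# Hypothetical rule list - 100,000 old-URL -> new-URL pairs.
rules = [(f"/old-{i}", f"/new-{i}") for i in range(100_000)]
rule_map = dict(rules)

def lookup_linear(path):
    """What a flat rule list effectively does: check every rule in order."""
    for old, new in rules:
        if old == path:
            return new
    return None

def lookup_map(path):
    """Hash-map lookup: cost stays flat no matter how many rules exist."""
    return rule_map.get(path)

start = time.perf_counter()
lookup_linear("/old-99999")  # worst case: the match is the last rule
print(f"linear scan: {time.perf_counter() - start:.6f}s")

start = time.perf_counter()
lookup_map("/old-99999")
print(f"map lookup:  {time.perf_counter() - start:.6f}s")
```

So the practical question is less "how many redirects" and more "how does your server store them" - a few hundred rules scanned linearly on every request cost more than thousands kept in an indexed map.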