301s - A Year Late
-
A website I was recently asked to help with was redesigned last year, but no 301s were set up. Looking at the old URLs, 95% of the ones from early 2013 are 404s. Their traffic dropped from 50,000 per month to 10,000, and I believe this is one of the reasons.
Now the question is: a year later, will it do any good to set up 301 redirects from those old URLs? My current thought is that the old URLs have probably lost any link juice they had, but it shouldn't hurt anything to set up the 301s anyway.
Any thoughts on whether this is worth my time and effort?
-
Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links out there to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking purposes. Getting the redirects in place will allow that ranking influence to again be credited to the client's new pages.
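For anyone unfamiliar with the mechanics, a single 301 in an Apache .htaccess file can be as simple as the sketch below. The paths and domain are hypothetical, and this assumes the old site runs on Apache:

```
# A minimal sketch of one permanent redirect (hypothetical paths/domain).
# Requires Apache's mod_alias, which is normally enabled by default.
Redirect 301 /old-page.html https://www.example.com/new-page/
```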
When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a bit of a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around on real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you can always put stats changes into context around this process (e.g. a year from now, when the client is trying to figure out why there was a site improvement around this time).
[Hint: make sure you've got solid 404 page tracking in Analytics and keep checking it as you go along. It's an essential complement to simply watching what shows up in Webmaster Tools, for example.]
Some more suggestions for the process:
- Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
- Use Screaming Frog, Xenu Link Sleuth or equivalent tool to run a check of all internal pages to ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
- Watch for any high-value incoming links to old pages that you think you might be able to get corrected at source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice," you're even better off getting the original link corrected to point to the right page instead of having to go through the redirect. This is only worth it for strong links.
- Watch for opportunities to use regex to combine several redirects into one rule; fewer rules is better for site speed (see the sketch after this list).
- If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration.
- To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to find all the old URLs. (Tip: set your Google search results to show 100 results per page to make the scraping faster.)
- Set up an ongoing and regular process for checking for and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
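On the regex suggestion above, here's a rough sketch of how one pattern-based rule can stand in for dozens of individual redirects in an Apache .htaccess file. The URL patterns are hypothetical, assuming a date-based blog structure on the old site:

```
# One rule covering a whole section of old date-based blog URLs,
# e.g. /blog/2013/01/some-post.html -> /articles/some-post/
# (hypothetical patterns - adjust to the site's real URL structure)
RedirectMatch 301 ^/blog/[0-9]{4}/[0-9]{2}/(.+)\.html$ /articles/$1/
```

Each request still gets a single 301 hop this way; the win is simply fewer rules for the server to evaluate.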
Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page that is designed to capture as many visitors as possible and help move them to real content without losing them. Again, this is important for any site, but well worth extra attention for any site that knows it has a 404 problem. (This is far better than "soft 404ing" to the home page, for example, for a number of technical and usability reasons.)
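One related technical check: make sure that custom 404 page is served with a genuine 404 status code. A minimal Apache sketch, assuming a hypothetical /404.html page:

```
# Serve the custom error page with a real 404 status code.
# Note: pointing ErrorDocument at a full URL (http://...) would make
# Apache send a redirect instead - the "soft 404" behaviour to avoid.
ErrorDocument 404 /404.html
```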
So, bottom line on "whether this is worth my time and effort?" You'd better believe it is. It's probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying, both for you and the site owner.
Hope those are useful ideas!
Paul
-
Hiya! They may have lost link juice, but then again there may be a blog giving you praise with a link that's still active. It's never too late to set up a 301. Remember, though, it's best to 301 to the most relevant category or closest page. You can also set up a helpful custom 404 page, so even if you miss one, the user can still navigate to a page like the home page.
Moz has some great tips if you want a read or to refresh your mind.
-
Yes, it's better late than never. You might not get any rank back, but I consider 301s to be good policy beyond the SEO aspect. I hate clicking a link on a page and getting a 404, or being bounced to the front page. Perhaps I have a bookmark; perhaps it's an old link. Whatever the case, do your visitors a courtesy and redirect them to the correct page.
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of accessing via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go only through the HTTP/1.1 version, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs ... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC
-
Google Search Console 'Change of Address' Just 301s on source domain?
Hi all. New here, so please be gentle. 🙂 I've developed a new site, where my client also wanted to rebrand from .co.nz to .nz. On the source (.co.nz) domain, I've set up a load of 301 redirects to the relevant new page on the new domain (the URL structure is changing as well).
E.g. on the old domain: https://www.mysite.co.nz/myonlinestore/t-shirt.html
In the HTACCESS on the old/source domain, I've set up 301s (using RewriteRule), so that when https://www.mysite.co.nz/myonlinestore/t-shirt.html is accessed, it does a 301 to https://mysite.nz/shop/clothes/t-shirt.
All these 301s are working fine. I've checked in dev tools and a 301 is being returned. My question is: is having the 301s just on the source domain enough to start a 'Change of Address' in Google's Search Console? Their wording indicates it's enough, but I'm concerned that maybe I also need redirects on the target domain as well. I.e., does the Search Console Change of Address process work this way: it looks at the source domain URL (that's already in Google's index), sees the 301, then updates the index (and hopefully passes the link juice) to the new URL? Also, I've set up both the source and target Search Console properties as Domain properties. Does that mean I no longer need to specify whether the source and target properties are HTTP or HTTPS? I couldn't see that option when I created the properties. Thanks!
Technical SEO | WebGuyNZ
-
301 Redirects a Year Later
I inherited the digital maintenance of a website that was relaunched a year ago. Looking at Google Analytics, organic search a year later is still down 33%. I fear they did not install 301 redirects, but I can't really get a specific answer from them. Is it possible to install them a year later to help with Google indexing and get back some of the organic traffic?
Technical SEO | stansamples
-
We are migrating a site and are seeing a lot of 301s and 302s already in the old site. Is it OK to leave those as is?
For the 3xx's, I'm not sure if it's okay for us to redirect to these, so please advise on that.
Technical SEO | lina_digital
-
Late loading content via AJAX - impact on bots
Hi, in an attempt to reduce the latency of our site, we are planning on late-loading all content below the fold via AJAX. My concern is that Googlebot won't see this content and won't index it properly (our site is very large and we have lots of content). What is good for our users is not necessarily good for bots. Will late-loaded AJAX content be read by Googlebot? Any thoughts on how to balance speed vs. search engine crawlability?
Technical SEO | NicB1
-
Pagerank and 301s
Hi all. For various reasons, some of our pages were renamed from http://www.meresverige.dk/rejser/malmoe to http://www.meresverige.dk/rejser/malmo. We have made proper 301 redirects and also updated sitemap.xml accordingly. The change was done around the 5th of September, and the content on the pages remains identical. This page, and all pages below it in the structure, now gets very low or no PageRank at all, much lower than it was before the name change. Any ideas how long, if ever, it will take for Google to transfer the PageRank from the old page? Any suggestions for what we can do to make the process faster?
Technical SEO | Resultify
-
301s and Link Juice
So I know that a 301 will pass the majority of link juice to the new site, but if that 301 is taken away, what happens?
Technical SEO | kylesuss
-
Are 301s advisable for low-traffic URLs?
We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs, but should we still do 301 redirects for those pages after they are renamed? In other words, are there other serious considerations, besides any loss in traffic from direct clicks on those broken URLs, that need to be weighed? This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
Technical SEO | PGRob