301s - A Year Late
-
A website I was recently asked to help with was redesigned last year, but no 301 redirects were set up. Looking at the old URLs, 95% of the ones from early 2013 are now 404s. Traffic dropped from 50,000 visits per month to 10,000, and I believe this is one of the reasons.
Now the question is: a year later, will it do any good to set up 301 redirects from those old URLs? My current thought is that the old URLs have probably lost any link juice they had, but it shouldn't hurt anything to set up the 301s anyway.
Any thoughts on whether this is worth my time and effort?
-
Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links out there that still point to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking purposes. Getting the redirects in place will allow that ranking influence to be credited to the client's new pages again.
When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around on real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you can always put stats changes into context around this process (e.g. a year from now, when the client is trying to figure out why the site improved around this time).
[Hint - make sure you've got solid 404 page tracking in Analytics and keep checking it as you go along. It's an essential complement to just watching what shows up in Webmaster Tools, for example.]
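If the 404 template doesn't yet report anything to Analytics, here's a minimal sketch of what that tracking can look like, assuming the site runs Universal Analytics and the standard analytics.js snippet is already loaded on the 404 page (the category and label wording here is just a suggestion):

<script>
  // Hypothetical 404 tracking: fire an event recording which URL 404'd
  // and who linked to it. Assumes the analytics.js tracker ("ga") is
  // already initialized by the standard snippet higher up the page.
  ga('send', 'event', '404 Error',
     document.location.pathname + document.location.search, // the broken URL
     document.referrer || '(no referrer)');                 // the linking page
</script>

You can then build a quick custom report around that event category and watch the 404 count fall as the redirects go live.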
Some more suggestions for the process:
- Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
- Use Screaming Frog, Xenu Link Sleuth or equivalent tool to run a check of all internal pages to ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
- Watch for any high-value incoming links to old pages that you think you might be able to get corrected at the source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice", you're even better off getting the original link corrected to point to the right page instead of having to go through the redirect. This is only worth the effort for strong links.
- Watch for opportunities to use regex to combine several redirects into one rule; fewer rules is better for site speed (see the sketch after this list).
- If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration.
- To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to isolate the old URLs. (Tip: set your Google Search results to show 100 results per page to make the scraping faster.)
- Set up an ongoing and regular process for checking for and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
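On the regex point above, here's a rough sketch of what consolidated rules can look like in an Apache .htaccess file. The URL patterns are purely hypothetical, not from Beth's actual site:

# One pattern-based rule replaces dozens of one-off redirects, e.g. when
# old date-based blog URLs like /2013/05/post-name all map to /blog/post-name:
RedirectMatch 301 ^/20\d\d/\d\d/(.+)$ /blog/$1

# Pages with no pattern to exploit still get individual rules:
Redirect 301 /old-about.html /about/

Just make sure each pattern is anchored tightly enough that it can't accidentally swallow URLs that should resolve normally.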
Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page, one designed to capture as many visitors as possible and move them to real content without losing them. Again, this is important for any site, but well worth extra attention for any site that knows it has a 404 problem. (It's also far better than "soft 404ing" to a home page, for a number of technical and usability reasons.)
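For what it's worth, serving that custom error page with a genuine 404 status is usually a one-line server config. A sketch for Apache, with a hypothetical path:

# Because ErrorDocument points at a local path rather than a full URL,
# Apache serves the custom page while still returning a true 404 status,
# which is exactly what keeps this from turning into a "soft 404".
ErrorDocument 404 /errors/404.html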
So, bottom line on "whether this is worth my time and effort?": you'd better believe it is. It's probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying, both for you and the site owner.
Hope those are useful ideas!
Paul
-
Hiya! They may have lost link juice, but then again there may be a blog out there praising you with a link that's still active. It's never too late to set up a 301; remember, though, it's best to 301 to the most relevant category or closest page. You can also set up a helpful custom 404 page, so even if you miss a redirect the user can still navigate to somewhere useful, like the home page.
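To illustrate the "closest page" point, a hypothetical one-liner for an Apache .htaccess file:

# Send the retired page to its nearest surviving equivalent, not the home page:
Redirect 301 /services/widget-repair-old.html /services/widget-repair/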
Moz has some great tips if you want a read or to refresh your mind.
-
Yes, it's better late than never. You might not get any rankings back, but I consider 301s to be good policy beyond the SEO aspect. I hate following a link and getting a 404, or being bounced to the front page. Perhaps I have a bookmark; perhaps it's an old link. Whatever the case, do your visitors a courtesy and redirect them to the correct page.
Related Questions
-
Mass 301s
Hi All, I'm trying to find a way to implement a mass list of 301s instead of just doing them individually. Does anyone have any ideas or tips on how I can do this?
-
404 from a 404 that 301s
I must be missing something, skipping a step, or lacking proper levels of caffeine. Under my High Priority warnings I have a handful of 404s which are like that on purpose, but I'm not sure how Moz is finding them. When I check the referrer info, the 404 is being linked to from a different 404 which is now a 301 (due to the craziness of our system and what was easiest for the coders to fix a different problem ages ago). Basically, if a user types a non-existent model number into the URL, a specific 404 comes up. While the 404 error is "site.com/product/?model=abc123", the referrer is "site.com/product?model=abc123" (or more simply, one slash is missing). I can't see how Moz is finding the referrer, so I can't figure out how to make Moz stop crawling it. I actually have the same problem in Google WMT for the same group of 404s. What am I just not seeing that will fix this?
-
Updating Domain Yearly for Branding Purposes?
We host an event every year which updates its branding to include the year (e.g. Acme 2013, Acme 2014, etc.). We used to use the branded domains (acme2012.com) to redirect to the event microsite hosted off of our main content site (e.g. acme2012.acme.com). Just this year our event managers decided to host the event on its own domain, separating it from the content site, so it is now acme2013.com. Their strategy is to update the domain every year to include the current year, so next year the domain would be acme2014.com. I understand the use of the domain for marketing purposes, but I feel like we could be hurting our SEO by doing this. Should we revert to our original strategy, or is this okay with the proper setup? I should note that because this is an event promotion site, the site template and content update every year.
-
Guidance for setting up new 301s after having just done so
Hi, I've recently set up a load of 301 redirects for a client's new site design/structure relaunch. One of the things we have done is take the keyword out of the sub-category landing page URLs, since it now features in the top-level category page URLs and we don't want to risk over-optimisation by having keyword repeats across the full URLs. So the URLs have changed, and the original pages are 301'd to the new current pages. However, if rankings start to drop and I decide to change the URLs again to include the keyword in the final part of the URL for the sub-category landing pages, what's the best way to manage the new redirects? Do I redirect the current URLs (which have only been live for a week and have the original/old URLs 301'd to them) to the new URLs? (I'm worried this would create a chain of 301s, which I've heard is not ideal.) Or do I just redirect the original URLs to the new ones and forget about the current pages/URLs, since they've only been live for a week? (I presume best not, since the GWT sitemaps area says most of the new URLs are indexed now, so I presume it sees those as the original pages' replacement.) Or should they all be 301'd (original URLs and current URLs to the new)? Or is it best to just run with the current setup and avoid making too many changes again, after having just set up so many 301s? Many Thanks 🙂 Dan
-
Late loading content via AJAX - impact to bots
Hi, In an attempt to reduce the latency of our site, we are planning on late-loading all content below the fold via AJAX. My concern is that Googlebot won't see this content and won't index it properly (our site is very large and we have lots of content). What is good for our users is not necessarily good for bots. Will late-loaded AJAX content be read by Googlebot? Thoughts on how to balance speed vs. search engine crawlability?
-
Handling 301s: Multiple pages to a single page (consolidation)
Been scouring the interwebs and haven't found much information on redirecting two separate pages to a single new page. Here is what it boils down to: let's say a website has two pages, both with good page authority, for products that are being phased out. The products, Widget A and Widget B, are still popular search terms, but they are being combined into ONE product, Widget C. While Widget A and Widget B STILL have plenty to do with Widget C, Widget C is now the new page, the main focus page, and the page you want everyone to see and Google to recognize. Now, do I 301 the Widget A and Widget B pages to Widget C, ALTHOUGH Widgets A and B previously had nothing to do with one another? (Remember, we want to try and keep some of the authority the two pages have had.) OR do we keep the Widget A and Widget B pages "alive", take them off the main navigation, and then put a "disclaimer" on the pages announcing they are now part of Widget C, with a link to Widget C? OR should the Widget A and B pages be canonicalized to Widget C? Again, keep in mind, Widgets A and B previously were not similar, but NOW they are and result in Widget C. (If you are confused, we can provide a real-world example of what we are talking about, but decided not to be specific to our industry for this.) Appreciate any and all thoughts on this.
-
PageRank and 301s
Hi all, For various reasons some of our pages were renamed from: http://www.meresverige.dk/rejser/malmoe to: http://www.meresverige.dk/rejser/malmo. We have made proper 301 redirects and also updated sitemap.xml accordingly. The change was made around the 5th of September, and the content on the pages remains identical. This page, and all pages below it in the structure, now gets very low or no PageRank at all, much lower than before the name change. Any ideas how long, if ever, it will take for Google to transfer the PageRank from the old page? Any suggestions for what we can do to make the process faster?
-
Very well established blog, new posts now being indexed very late
I have an established blog. We update it on a daily basis. In the past, when I would publish a new post, it would get indexed within a minute or so. But for about a month now it's been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz

The site has tons of backlinks. Just wondering if something is wrong with the robots file or if it could be something else.