How to mitigate the impact of retiring 20,000 old URLs from the previous website version (migrated in 2015)
-
Hello!
We migrated our website to the "new version" in 2015, and we now have 20,000 old URLs that we'd like to officially retire. The only traffic coming to these pages is, obviously, from backlinks pointing at our site.
How do I gauge the hit our website will take once we retire these URLs?
Is there a tool that will let me look at referral traffic numbers per URL, so I know how much traffic we'll be losing?
Any advice would be helpful!
Thanks,
Yael
-
So, as I mentioned, I would never delete a page without a redirect (even if that redirect just points to the homepage). If you redirect, the amount of authority you lose will be minimal. There's no way to know exactly how much, but it should be marginal.
As for the referral traffic, I described how to measure it above. Again, with a 301 redirect you will keep that traffic, and if it is still relevant traffic, it should still add to your bottom line.
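As a sketch of that advice: before retiring a batch of URLs, build an explicit redirect map first, with the homepage as a fallback for any old URL that has no close new equivalent, so no backlink ever lands on a 404. A minimal illustration; the URL lists and matching table below are hypothetical, not from the original site:

```python
def build_redirect_map(old_urls, best_matches, homepage="/"):
    """Map each retiring URL to its closest new equivalent.

    Any old URL without a known equivalent falls back to the
    homepage, so every backlink still resolves via a 301.
    """
    redirect_map = {}
    for old in old_urls:
        key = old.rstrip("/")  # normalize trailing slashes
        redirect_map[key] = best_matches.get(key, homepage)
    return redirect_map

# Hypothetical example: two retired pages have new homes, one does not.
old_urls = ["/2014/widgets/", "/2014/gadgets/", "/2014/press-release-17/"]
best_matches = {
    "/2014/widgets": "/products/widgets",
    "/2014/gadgets": "/products/gadgets",
}
mapping = build_redirect_map(old_urls, best_matches)
```

The resulting dictionary can then be fed into whatever your server uses for redirect rules.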
-
Hi NorthStarInbound, thanks for your response.
By "retiring" I mean deleting. They're old URLs that we'd like to get rid of, letting the new pages stand on their own.
But we would be giving up the authority that those pages have accumulated. The question is: how much authority is at stake? How large a hit are we going to take, and how can we measure this?
Any advice is helpful! Thanks, Yael
-
What do you mean by "retiring"? Is there another page you could redirect to, so you don't lose the authority those backlinks are providing?
As for seeing how much referral traffic you could lose, you can use Google Analytics. Go to Behavior > Site Content > Landing Pages, then set the secondary dimension to Acquisition > Medium.
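Once that report is exported to CSV, a few lines of scripting will total referral sessions per landing page, so you can rank the retiring URLs by how much traffic is actually at stake. A sketch, assuming hypothetical export column names ("Landing Page", "Medium", "Sessions") — check your own export's headers before running it:

```python
import csv
import io
from collections import Counter

# Hypothetical sample of a Landing Pages export with Medium
# as the secondary dimension.
export = io.StringIO("""\
Landing Page,Medium,Sessions
/old-page-1,referral,120
/old-page-1,organic,45
/old-page-2,referral,8
""")

# Sum sessions per landing page, counting only the referral medium.
referral_sessions = Counter()
for row in csv.DictReader(export):
    if row["Medium"] == "referral":
        referral_sessions[row["Landing Page"]] += int(row["Sessions"])

# Most-at-risk pages first.
at_risk = referral_sessions.most_common()
```

Sorting by referral sessions tells you which of the 20,000 URLs actually matter; many will likely turn out to receive no referral traffic at all.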
Related Questions
-
Consolidate URLs on WordPress?
Hi guys, on a WordPress site we are working with, each page currently has multiple different versions of its URL. See screenshot: https://d.pr/i/ZC8bZt Data example: https://tinyurl.com/y8suzh6c Right now the non-https versions redirect to the equivalent https versions, while some of the https versions don't redirect and return status code 200. We want all of them to redirect to the highlighted blue version (row a). Is this easily doable in WordPress, and how would one go about it? Cheers.
Intermediate & Advanced SEO | | wickstar1 -
If I want to update the title of a page on my website would that negatively impact SEO?
I want to update a few page titles on my website. Some are duplicate titles, and some are just too long. Will my website be negatively impacted if I update them? I read somewhere that once you have created a page you need to stick with the title you gave it, so I am not sure whether I should leave these pages alone and simply apply better SEO practices going forward, or whether I can go back and edit them. Any insight is much appreciated!
Intermediate & Advanced SEO | | meredithrice0 -
Disallow duplicate URLs?
Hi community, thanks for answering my question. I have a problem with a website. My website is http://example.examples.com/brand/brand1 (the good URL), but I have two filters that generate two more URLs: http://example.examples.com/brand/brand1?show=true (if we apply one filter) and http://example.examples.com/brand/brand1?show=false (if we apply the other). My question is: should I block these filters in robots.txt like this: Disallow: /*?show=*
Intermediate & Advanced SEO | | thekiller990 -
Rel=canonical on pre-migration website
I have an e-commerce client that is migrating platforms. The current structure of their existing website has led to what I believe is mass duplicate content. They have something north of 150,000 indexed URLs; 143,000+ of these have query strings, and their content is identical to the pages without any query string. Even so, the site does pretty well from an organic standpoint compared to many of its direct competitors. Here are my questions:
(1) I am assuming I should go into WMT (Google/Bing) and tell both search engines to ignore query strings.
(2) A review of backlinks shows a mishmash of good incoming links to both the clean and the dirty URLs. Should I add a rel=canonical via a script to all the pages with query strings before we migrate, and allow the search engines some time to process?
(3) I assume I can continue to watch the indexation of the URLs, but should I also tell search engines to remove the dirty URLs?
(4) Should I do Fetch in WMT?
If so, in what sequence should I do 1-4, and how long should I wait between doing the above and undertaking the migration?
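On point (2), the canonical for each query-string variant is usually just the same URL with the query and fragment stripped. A minimal sketch of computing it (the example shop URL is hypothetical, and hooking this into your platform's templates is the part it glosses over):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string and fragment, keeping scheme, host, and path."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def canonical_tag(url):
    """Render the <link rel="canonical"> element for a dirty URL."""
    return '<link rel="canonical" href="%s">' % canonical_url(url)

tag = canonical_tag("https://shop.example.com/widgets?sort=price&page=2")
```

Every query-string variant of a page then declares the clean URL as canonical, which is what consolidates the mishmash of incoming links before the migration.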
Intermediate & Advanced SEO | | ExploreConsulting0 -
GTM Migration from Old to New Version as Old Version Closes on 1st April
Hi guys, can you please tell me whether this is a correct configuration for tracking a thank-you page? In the old version of Tag Manager: GA conversion tracking, tag type - universal analytics, web property id - UA-12345678-9, track type - transaction, firing rule - {{url}} contains ordersuccessful.aspx, {{event}} equals gtm.dom. In the new version of Tag Manager: choose product - Google Analytics, choose tag type - universal analytics, configure tag - tracking id - UA-12345678-9, track type - transaction, fire on - name - order successful page, type - custom event, filter - page URL contains ordersuccessful.aspx, event equals gtm.dom. If I remove "event equals gtm.dom", will the tag fire and will transaction details show up in Google Analytics? I am configuring the new version of GTM manually as I have only a few tags, so I just want to confirm: once all tags are configured and I place the new GTM code on my website, will Google Analytics start reflecting data immediately, or will it take 24 hours? Thanks! Dev
Intermediate & Advanced SEO | | devdan0 -
Massive URL Migration with thousands of 301
Hey everyone! I'm currently working on a project that has a lot of product pages, and thousands of URLs need to be 301-redirected. I know this can be a major undertaking and could lead to tons of errors. What does everyone think about doing such a huge migration? Should I do it in phases, or all at once so the new URLs can all be indexed together? What would you suggest as the best way to go about such a massive migration?
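One way to keep thousands of 301s manageable, whether phased or all at once, is to generate the server rules from a single spreadsheet mapping rather than hand-editing config. A sketch that emits Apache `Redirect 301` lines from a CSV; the file contents and column names here are hypothetical, and an nginx `map` block could be generated the same way:

```python
import csv
import io

# Hypothetical mapping file: one row per retired URL.
mapping_csv = io.StringIO("""\
old_path,new_path
/2014/widgets,/products/widgets
/2014/gadgets,/products/gadgets
""")

# One Apache mod_alias rule per row.
rules = [
    "Redirect 301 %s %s" % (row["old_path"], row["new_path"])
    for row in csv.DictReader(mapping_csv)
]
htaccess_block = "\n".join(rules)
```

Keeping the mapping in one file also makes phased rollouts easy: split the CSV into batches and regenerate the rules for each phase.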
Intermediate & Advanced SEO | | rpaiva0 -
Will running Google ads on the website impact rankings?
Hi, will Google AdSense units running above the fold of the website impact keyword rankings?
Intermediate & Advanced SEO | | RuchiPardal0 -
Website layout for a new website [Over 50 Pages & targeting Long Tail Keywords]
Hey everyone, we are designing a new website with over 50 pages and I have a question regarding the layout. Should I target my long-tail keywords via blog pages? It will be easier to manage, list, and link out to similar articles related to my long-tail keywords using a WordPress blog. For this example, let's suppose the website is www.orange.com and we sell oranges. Am I going about this in the right way?
Main section 1: Home page - keyword targeted - Orange
Main section 2: Important conversion page - 'Buy oranges'
Long-tail keyword (LTK) 1: www.orange.com/blog/LTK1, with subsections (SS) www.orange.com/blog/LTK1/SS1, www.orange.com/blog/LTK1/SS1a, www.orange.com/blog/LTK1/SS1b
Long-tail keyword (LTK) 2: www.orange.com/blog/LTK2
Long-tail keyword (LTK) 3: www.orange.com/blog/LTK3, with subsections (SS) www.orange.com/blog/LTK1/SS3, www.orange.com/blog/LTK1/SS3a, www.orange.com/blog/LTK1/SS3b
All these long-tail pages and the subsections under them are built specifically to host content targeting these specific long-tail keywords. Most of my traffic will initially come via the subsection pages, and it is important for me to rank well for these terms early on. E.g. if someone searches for the keyword 'SS3b' on Google, my corresponding page www.orange.com/blog/LTK1/SS3b should rank well on the results page. For ranking purposes, will using this blog/category structure hurt or benefit me? Or should I build static pages instead? Also, we are targeting more than 50 long-tail keywords, building quality content for each, and I assume we will keep doing this continuously. So which is more beneficial in the long term? Do you have any suggestions on whether I am going about this the right way? Apologies for using these placeholder terms (oranges, LTK, SS, etc.) in this example; I hope the question is clear. Looking forward to some interesting answers! Please feel free to share your thoughts. Thank you!
Natasha
Intermediate & Advanced SEO | | Natashadogres0