Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Best posts made by MickEdwards
-
RE: I have multiple URLs that redirect to the same website. Is this an issue?
I would go further than that and check the link profiles of all the domains. If there is any sign of spam, unnatural anchor text, etc., do not redirect, as you'll inject that problem into your site. Even if you believe the domains are dormant and have never been used, check anyway in case you were not the original owner. It's always worth doing that check.
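Once a domain's link profile checks out clean, the redirect itself is typically a permanent (301) domain-level rule. A minimal sketch for an Apache `.htaccess`, assuming hypothetical domain names (`old-domain.com`, `main-site.com` are placeholders, not from the original post):

```apache
# Permanently redirect every URL on the dormant domain to the main site,
# preserving the requested path. 301 tells search engines the move is permanent.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.main-site.com/$1 [R=301,L]
```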
-
RE: Best tools for an initial website health check?
ScreamingFrog gives all the data you want. Tools built to produce a sleek report usually don't give the full picture. It's the issues you draw out yourself that make the difference.
-
RE: Should I set up no index no follow on low quality pages?
As Ryan suggests, you still want to FOLLOW rather than give the bots a dead end; I notice your heading suggests no-follow.
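In practice that combination is set with a robots meta tag in the page's head. A minimal sketch:

```html
<!-- Keep the page out of the index, but let bots follow its links -->
<meta name="robots" content="noindex, follow">
```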
-
RE: Using hreflang="en" instead of hreflang="en-gb"
From my understanding, if you have hreflang="en-gb" then that page (or those pages) is targeted at the UK. If you wish to target any English-speaking country then you add hreflang="en". But if you wish to target specific English-speaking countries then you'd use hreflang="en-ie", hreflang="en-gg", etc.
What you are doing is giving Google information, not a directive, as to which pages are targeted where. Google can ignore it, and it's not a ranking solution. You are just giving Google a heads-up about your intentions.
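As a sketch of how those annotations might sit together in a page's head (the `example.com` URLs are placeholders, not from the original question):

```html
<!-- Generic English page, plus country-specific alternates -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.com/ie/" />
```

Each listed page should carry the full set of alternates, including a reference to itself, so the annotations are reciprocal.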
-
RE: Robots.txt: how to exclude sub-directories correctly?
Install the Yoast WordPress SEO plugin and use that to control what is indexed and what is included in the sitemap.
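If you do want to exclude sub-directories directly in robots.txt rather than via a plugin, a minimal sketch (the directory names are hypothetical examples):

```text
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

Note the trailing slash: `Disallow: /private/` blocks crawling of everything under that sub-directory, and robots.txt controls crawling rather than indexing, so pages blocked this way can still appear in results if linked elsewhere.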