Strategies for best use of a competitor's expired domain
-
I recently bought an old competitor's expired domain that was ranking around page 2 or 3 on Google for most of the keywords I target.
Curious as to the best strategy for utilizing this domain:
1. Set up some content with backlinks to my own domain
2. Set up redirects mapping all of the competitor's old URLs to corresponding sections on my website
3. Something else? -
Thank you both, finally getting around to doing this now!
-
Hi Sandi,
Create one page with a "press release" announcing that the company has been taken over by (yourcompany.com). Redirect all the other URLs to this page. From this page you can link to the most important pages on your own website.
This way the authority of the old competitor's domain will be forwarded to yours. After around six months to a year, you could redirect the whole domain to your site.
As Thomas mentions below, it is a good idea to check which links could be of use (https://moz.com/researchtools/ose/) and contact the most important ones to change the link destination to your domain.
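If the old domain runs on Apache, the catch-all-plus-press-release setup Tymen describes could be sketched in an `.htaccess` file roughly like this (a hedged example, not from the thread; `/acquired.html` is a placeholder filename for the announcement page):

```apache
# Sketch: catch-all 301 for an acquired domain (Apache .htaccess).
RewriteEngine On

# Exclude the announcement page itself so the rule below
# doesn't create a redirect loop.
RewriteCond %{REQUEST_URI} !^/acquired\.html$

# Send every other old URL to the press-release page.
RewriteRule ^.*$ /acquired.html [R=301,L]

# Later (after roughly 6-12 months) you could swap the rule above
# for a domain-wide redirect straight to your own site:
# RewriteRule ^.*$ https://yourcompany.com/ [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what signals to search engines that the move is final rather than temporary.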
Hope it helps! Regards, Tymen
-
Hi Tymen! Thanks for your feedback; it sounds like a good idea! Could you please elaborate a bit on what you mean, as if you were explaining it to a novice? i.e., "I would make a one-pager with a catch-all URL. This way all the old URLs of the site will go to this page and you get no 404s. On this page you explain that there is nothing there anymore and that visitors should go to your site. I would not put too many links on the page."
-
Expanding on what Tymen has said,
This could be a good strategy, but why not just 301 redirect the whole site to a page on your own site (explaining that the old pages don't exist anymore)? This way I see your site getting more value (one hop through the redirect, instead of one hop from the redirect plus another from the link).
Also, something that may be worth looking into is whether they have any high-value links: see where those links come from, explain that the company no longer exists, and try to reclaim the links they once had.
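Once the redirects are in place (whichever variant you choose), it's worth sanity-checking that each old URL really returns a permanent redirect pointing at your domain. A minimal sketch of such a checker, using only the Python standard library (the domain names are placeholders, not real sites from this thread):

```python
# Sketch: verify that old-domain URLs now permanently redirect to the new site.
import http.client
from urllib.parse import urlsplit


def is_good_redirect(status, location, new_domain):
    """A permanent redirect (301 or 308) that lands on the new domain is what we want."""
    return status in (301, 308) and location.startswith(new_domain)


def first_hop(url):
    """Return (status, Location header) of the first response, without following redirects."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location") or ""


# Hypothetical usage (requires network access):
#   status, location = first_hop("http://old-competitor.example/products")
#   print(is_good_redirect(status, location, "https://yourcompany.example"))
```

Running this over the competitor's old URL list (e.g. from their sitemap or a crawl) catches pages that still 404 or that only return a temporary 302.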
-
Hi Sandi,
I would make a one-pager with a catch-all URL. This way all the old URLs of the site will go to this page and you get no 404s. On this page you explain that there is nothing there anymore and that visitors should go to your site. I would not put too many links on the page.
Eventually the authority of the competitor's site will fade, but if you have everything in place you will capture it. What you could also do is log into the competitor's Search Console account and see which pages have good content. You could move that content to your site before you take the old site offline.
Good luck!
Tymen