Updating Domain Yearly for Branding Purposes?
-
We host an event every year which updates its branding to include the year (e.g. Acme 2013, Acme 2014, etc.). We used to use the branded domains (acme2012.com) as redirects to the event microsite on our main content site (e.g. acme2012.acme.com). This year our event managers decided to host the event on its own domain, separating it from the content site, so it is now acme2013.com.
Their strategy is to update the domain every year to match the current year, so next year the domain would be acme2014.com. I understand the marketing value of the branded domain, but I feel this could be hurting our SEO. Should we revert to our original strategy, or is this okay with the proper setup?
I should note that because this is an event promotion site, the site template and content update every year.
-
We typically use this model for most of our other event sites and I would agree. Thanks!
-
Thanks. As organic traffic is important for our site, I think it makes sense to avoid starting over with a new domain every year.
-
Another way to look at it: keep everything on the main domain, acme.com, and update the meta information and on-page content for the current year. That way you build up your site strength in many ways: links, domain age, etc. Then categorize each year for future reference at acme.com/2011, acme.com/2012, and so on, giving you a repository for content from each year's event (white papers, videos, etc.). Added bonus: you don't have to maintain so many domains and websites, which could eventually become problematic. Instead, focus on branding the short URL in the minds of your audience.
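If you go this route, the old branded year domains can still be kept for marketing and pointed at the matching archive path. A minimal sketch of that setup, assuming an Apache server and the hypothetical acme2013.com domain (directives are standard mod_alias; the domain names are just illustrations):

```apache
# Hypothetical virtual host for a branded year domain.
# Every request to acme2013.com is 301-redirected to the
# year archive on the main domain, so any links or type-in
# traffic the branded domain earns flow to acme.com.
<VirtualHost *:80>
    ServerName acme2013.com
    ServerAlias www.acme2013.com
    Redirect permanent / http://acme.com/2013/
</VirtualHost>
```

The permanent (301) redirect is what tells search engines the branded domain is not a separate site, so you keep the marketing URL without splitting your link equity.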
-
Does most of the traffic come from referrals or from organic search?
If you are targeting organic search, you would probably be better off using the acme2013.com domain as a redirect (for those who want to go directly to the site, since it's easier to type) and hosting the content as a subdomain of your main site. You can piggyback on the domain authority you've built with your main domain to gain added exposure in search engines, as opposed to starting from scratch with a new domain.
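As a rough sketch of the redirect half of that setup, assuming an Apache host and the hypothetical domains from the question (an .htaccess file on the branded domain, using standard mod_rewrite directives):

```apache
# Hypothetical .htaccess served for acme2013.com.
# All requests are 301-redirected, path intact, to the event
# microsite hosted as a subdomain of the main content site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?acme2013\.com$ [NC]
RewriteRule ^(.*)$ http://acme2013.acme.com/$1 [R=301,L]
```

The branded domain then works purely as a memorable marketing address, while the content itself lives under acme.com and benefits from its existing authority.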
If you aren't concerned with search traffic, a shorter domain is easier for people to remember than a longer one, so creating a new domain each year would be the way to go.