Archiving a festival website - subdomain or directory?
-
Hi guys
I look after a festival website whose program changes year in and year out. A handful of mainstay events remain each year, but the rest of the program changes around that core. This often results in us redoing the website each year (a frustrating experience indeed!)
We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They also serve as useful windows into the upcoming festival.
2. The old events (while no longer running) often get many social shares and high-quality links, and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage).
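For what it's worth, this kind of page-level redirecting is usually just a list of one-off 301 rules. A minimal sketch in Apache `.htaccess` syntax - the event paths here are purely illustrative, not your real URLs:

```apache
# Hypothetical examples only - substitute your actual event paths.

# Old event page that has a close equivalent on the new site:
# redirect page-to-page so the links and shares carry over.
Redirect 301 /events/jazz-night /events/jazz-night-2013

# Old event with no modern equivalent: fall back to the homepage.
Redirect 301 /events/one-off-cabaret /
```

The page-to-page rules are the ones worth the effort; the homepage fallback is the last resort you describe.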
Anyway, I've noticed some festivals archive their content into a subdirectory - e.g. www.event.com/2012
However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com, and always use the www.event.com URL for the current year's event. A blanket redirect of the old content would be simpler that way, as would cloning the site and database.
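If you went the subdomain route, the "universal redirect" could indeed be a single wildcard rule in the www vhost once the old site has been cloned to the archive subdomain. A rough sketch in Apache mod_rewrite, assuming the archived copy keeps identical paths (all names here are illustrative):

```apache
# In the vhost for www.event.com, after last year's site has been
# cloned to 2012.event.com. Illustrative: assumes the archive keeps
# the same path structure as the original.
RewriteEngine On
# Anything still requesting last year's section on www is sent
# to the same path on the archive subdomain:
RewriteRule ^program/(.*)$ http://2012.event.com/program/$1 [R=301,L]
```

One rule covers the whole old section, which is what makes the clone-and-redirect workflow attractive operationally.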
My question is - is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of using a subdomain for archival purposes?
Hope this all makes sense. Many thanks!
-
I work with a lot of arts events on a minimal budget, and I normally move the old websites to www.website.com.au/2011, www.website.com.au/2012, etc., and make the current year the root, www.website.com.au
In this way you're still getting the benefit of having a large website: the archive of the previous events is still there, artists still have a presence, and people still find the website in the SERPs and navigate to the homepage, which is the current event.
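Mechanically, the directory route is just a copy of the old site into a year-named folder, plus 301s from the old locations. A sketch in Apache mod_rewrite, again with illustrative paths:

```apache
# After copying last year's site into /2012/, point the old URLs at it.
RewriteEngine On
# Don't touch requests that are already inside an archive folder:
RewriteCond %{REQUEST_URI} !^/20[0-9][0-9]/
# Old section on the root site redirects into the year folder:
RewriteRule ^program/(.*)$ /2012/program/$1 [R=301,L]
```

The guard condition prevents the rule from re-firing on the archived copies themselves.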
-
Cheers mate, much appreciated!
Could you kindly outline why it may be a bit more beneficial from a search perspective to use a directory? Just interested...
Cheers
-
From a search perspective, it may be a bit more beneficial to use the directory route, but that is often difficult to organize when we are talking about multiple pages (i.e. a previous year's entire website).
In my opinion, the subdomain route seems like the best option. It will allow the previous pages and content to contribute to the current site's domain authority while also serving as an extremely organized presentation of the previous years' events and attractions!