Archiving a festival website - subdomain or directory?
-
Hi guys
I look after a festival website whose program changes from year to year. There are a handful of mainstay events that remain each year, but there are a bunch of other events that change each year around the mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!).
We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They also serve as useful windows into the upcoming festival.
2. The old events (while no longer running) often attract plenty of social shares and high-quality links, and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative, so these redirects often go to the homepage (a rough sketch of the kind of redirect rules involved is below).
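To illustrate, here's a minimal sketch of the kind of rules we end up writing - this assumes Apache with mod_alias, and the event slugs are made up, not our real URLs:

```apache
# Old event page that has a close equivalent in the new program
Redirect 301 "/events/headline-concert-2012" "/events/headline-concert"

# Old event with no equivalent this year - falls back to the homepage
Redirect 301 "/events/one-off-workshop-2012" "/"
```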
Anyway, I've noticed some festivals archive their content into a subdirectory, e.g. www.event.com/2012.
However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com, and always use the www.event.com URL for the current year's event. Redirecting the content wholesale would be easier that way, as would cloning the site and database.
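As a rough illustration of what I have in mind - assuming Apache virtual hosts and made-up document-root paths, not our actual setup - each archived year would be a cloned copy of the site on its own subdomain, with www always pointing at the current year's build:

```apache
# Needs a DNS record for 2012.event.com pointing at the same server
<VirtualHost *:80>
    ServerName 2012.event.com
    DocumentRoot /var/www/archive/2012
</VirtualHost>

<VirtualHost *:80>
    ServerName www.event.com
    DocumentRoot /var/www/current
</VirtualHost>
```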
My question is: is one approach (directory vs. subdomain) better than the other? Is there anything I need to be mindful of when using a subdomain for archival purposes?
Hope this all makes sense. Many thanks!
-
I work with a lot of arts events on minimal budgets, and I normally move the old website to www.website.com.au/2011, www.website.com.au/2012, etc., and make the current year www.website.com.
That way you still get the benefit of having a large website, the archive of the previous events is still there, artists still have a presence, and people still find the website in the SERPs and navigate through to the homepage, which is the current event.
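Mechanically it can be as simple as dropping each old year's site into its own folder under the main document root, or aliasing it in the server config. A rough sketch, assuming Apache 2.4 and made-up paths:

```apache
# Serve the archived 2012 site from a subdirectory of the main domain
Alias "/2012" "/var/www/archive/2012"
<Directory "/var/www/archive/2012">
    Require all granted
</Directory>
```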
-
Cheers mate, much appreciated!
Could you kindly outline why it may be a bit more beneficial from a search perspective to use a directory? Just interested...
Cheers
-
From a search perspective, it may be a bit more beneficial to use the directory route, but that is often difficult to organize when we are talking about multiple pages (i.e. a previous year's entire website).
In my opinion, the subdomain route seems like the best option. It will allow the previous pages and content to contribute to the current site's domain authority while also serving as an extremely organized presentation of the previous years' events and attractions!