Archiving a festival website - subdomain or directory?
-
Hi guys
I look after a festival website whose program changes from year to year. There are a handful of mainstay events which remain each year, but the rest of the program changes around that mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!)
We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They can also be used as useful windows into the upcoming festival.
2. The old events (while no longer running) often get many social shares and high-quality links, and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage).
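As a rough illustration of the 301 approach described above (the paths here are hypothetical), the redirects might live in the main site's Apache config or .htaccess:

```apache
# Old event page with a close current equivalent: redirect page-to-page
Redirect 301 /2012/jazz-night /program/jazz-night

# One-off event with no equivalent: fall back to the homepage
Redirect 301 /2012/special-guest-show /
```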
Anyway, I've noticed some festivals archive their content into a subdirectory - i.e. www.event.com/2012
However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com, and always use the www.event.com URL for the current year's event. I'm thinking a universal redirect of the old content would be easier, as would cloning the site, database, etc.
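A minimal sketch of the subdomain setup being considered, assuming Apache and hypothetical paths - each archived year is a cloned copy served from its own virtual host, leaving www for the current event:

```apache
# Archived 2012 clone, frozen at its own subdomain
<VirtualHost *:80>
    ServerName 2012.event.com
    DocumentRoot /var/www/archive-2012
</VirtualHost>

# Current year's site always lives at www
<VirtualHost *:80>
    ServerName www.event.com
    DocumentRoot /var/www/current
</VirtualHost>
```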
My question is - is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of using a subdomain for archival purposes?
Hope this all makes sense. Many thanks!
-
I work with a lot of arts events on a minimal budget, and I normally move the old websites to www.website.com.au/2011, www.website.com.au/2012, and so on, keeping the current year at www.website.com.au.
This way you still get the benefit of having a large website: the archive of previous events is still there, artists still have a presence, and people still find those pages in the SERPs and navigate to the homepage, which is the current event.
-
Cheers mate, much appreciated!
Could you kindly outline why it may be a bit more beneficial from a search perspective to use a directory? Just interested...
Cheers
-
From a search perspective, it may be a bit more beneficial to use the directory route, but that is often difficult to organize when we are talking about multiple pages (i.e. a previous year's entire website).
In my opinion, the subdomain route seems like the best option. It can still allow the previous pages and content to contribute to the current site's domain authority, while also serving as an extremely organized presentation of previous years' events and attractions!
Related Questions
-
Backlink from same domain but different subdomain - any juice here?
Will I be able to get the link juice from the same domain but a different subdomain? Let's say there is a website which is featuring my topic on multiple subdomains - is there any benefit, or will it be considered one link?
Intermediate & Advanced SEO | SIMON-CULL
-
Does the coding of a website really matter for SEO?
Hello, just wondering if a site that is not coded correctly can hurt your SEO, even though as a human I can totally understand what is going on on the page and the structure of the website. Thanks,
Intermediate & Advanced SEO | seoanalytics
-
Subdomain replaced domain in Google SERP
Good morning, this is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below:

Problem: We rank #1 for the relevant query "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.

Expected cause: We did not add a NOINDEX robots meta tag to the dev pages (a NOFOLLOW tag alone would not keep them out of the index). We also did not DISALLOW the subdomain in the robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall.

Solution: Add a NOINDEX header, update robots.txt on the subdomain, and block crawling/indexing.

Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain and get us back to where we were before Saturday. If the removal tool in WMT just removes the listing completely, is the only solution to wait until the site is recrawled and reindexed, and hope the root chiplab.com domain ranks in place of the subdomain again? Thank you for your time, Chase
Intermediate & Advanced SEO | chiplab
-
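For a development subdomain like the one described above, one minimal sketch of keeping it out of the index at the server level is an X-Robots-Tag response header (Apache with mod_headers enabled; the paths are hypothetical):

```apache
# Dev vhost: tell search engines not to index any response it serves
<VirtualHost *:80>
    ServerName dev.chiplab.com
    DocumentRoot /var/www/dev
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```

One caveat: if robots.txt also blocks crawling of the subdomain, Googlebot never fetches the pages and so never sees the noindex header, so it's usually one mechanism or the other (or a password wall, which avoids the question entirely).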
Why does my website disappear for its ranked keywords, then reappear, and so on?
Hello everyone. In the last two weeks my website emorroidi.imieirimedinaturali.it has shown strange behavior in the SERP: it disappears for its ranked keywords, then reappears, and so on. Here's the chronicle of the last days:
12/6: message in GWT: improvement of the visibility of the website in search.
12/6: the website disappears for all ranked keywords.
16/6: the website reappears for all ranked keywords, with some keywords higher in ranking.
18/6: the website disappears for all ranked keywords.
22/6: the website reappears for all ranked keywords.
24/6: the website disappears for all ranked keywords...
I can't explain this situation. Could it be a penalty? What kind? Thank you.
Intermediate & Advanced SEO | emarketer
-
Google does not favour PHP websites?
Hi there. An SEO company recently told me that Google does not favour PHP development? This seems rather sketchy; I have not read anywhere that Google disfavours PHP. Did I just miss that part of SEO, or are these guys blowing a little smoke?
Intermediate & Advanced SEO | ProsperoDigital
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:

http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/

How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages, without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site?

Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing at http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML. This page is indexed in Google and we don't want it to be. But so is the actual page where we intended that content to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff. As you can see, this results in duplicate content problems.

Is there a way to disallow Googlebot from crawling those directory listing pages and, provided that we have http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff in our sitemap, solve the duplicate content issue as a result? For example:

Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/

Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
-
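Since the two deeper listing directories in the question above are nested under /StoreFront/jsp/, a single rule would cover all three; a minimal sketch of the robots.txt in question (hypothetical, for www.ccisolutions.com/robots.txt):

```
# Block crawling of the directory listings and everything under them
User-agent: *
Disallow: /StoreFront/jsp/
```

Note that Disallow stops crawling but does not remove URLs already in the index, and a Disallow on this path does not touch /StoreFront/category/... pages, so the intended pages remain crawlable.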
On-Site Directory - Delete or Keep?
We have 2 ecommerce sites. Both have been hit by Penguin (no warnings in WMT) and we're in the process of cleaning up backlinks. We have link directories on both sites. They've got links that are relevant to the sites, but also links that aren't relevant. And they're big directories - we're talking thousands of links to other sites. What's the best approach here? Do we leave them alone, delete the whole thing, or manually review and keep highly relevant links but get rid of the rest?
Intermediate & Advanced SEO | Kingof5
-
Disavow Subdomain?
Hi all, I've been checking and it seems like there are only two options when disavowing links with Google's tool:
Disavow the link: http://spam.example.com/stuff/content.htm
Disavow the domain: domain:example.com
What can I do if I want to disavow a subdomain, i.e. spam.site.com? I'm also assuming that if I were to disavow the domain, it would include all subdomains? Thanks.
Intermediate & Advanced SEO | Carlos-R
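For reference, Google's disavow file format does accept a subdomain after the domain: prefix, so a sketch of a disavow.txt covering the cases in the question above might look like this (domains taken from the question, otherwise hypothetical):

```
# Disavow one specific URL
http://spam.example.com/stuff/content.htm

# Disavow all links from a single subdomain
domain:spam.site.com

# Disavow the whole domain (includes all of its subdomains)
domain:example.com
```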