Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What is the best way to handle annual events on a website?
-
Every year our company has a user conference with between 300 and 400 attendees. I've just begun giving the event more of a presence on our website. I'm wondering, what is the best way to handle highlights from previous years? Would it be to create an archive (e.g. www.companyname.com/eventname/2015) while constantly updating the main landing page to promote the current event? We also use an event website (Cvent) to handle our registrations, so once we have an agenda for the current year's event, I do a temporary redirect from the main landing page to the registration website. I don't really like this practice, and I feel it might be better to keep all of the info on the main domain. I'm wondering if anybody has any opinions or feedback on that process as well.
Just looking for best practices or what others have done and have had success with.
-
Thank you Paul,
This is exactly what I needed. I've been trying to push us in this direction but it's sometimes hard to break old habits. We might even be able to save a bit of money going this route.
Thanks again for the input!
-Brandon
-
Yup, there are many ways to do it, but it's vastly superior (I'd say critical) in terms of ongoing ranking and traffic generation to keep the primary content at a consistent URL on the main domain. It's also vastly better for user experience.
As EGOL recommends, definitely keep the URL on your own site consistent from year to year. Create the archives of each year's highlights as children of the primary page, and make sure you are linking to the current year's page from each of the year-archive pages. This gives search engines a clear understanding of the relative hierarchy and currency of the pages.
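A minimal sketch of that structure, using hypothetical URLs modeled on the example in the question:

    <!-- On the permanent primary page, www.companyname.com/eventname/ -->
    <a href="/eventname/2015/">2015 Conference Highlights</a>
    <a href="/eventname/2014/">2014 Conference Highlights</a>

    <!-- On each year-archive page, e.g. /eventname/2015/ -->
    <a href="/eventname/">This year's conference</a>

The primary page's URL never changes; only its content is refreshed each year, and the outgoing year's highlights move down into a new child page.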
Do NOT 302 this page to the registration page. Simply add a call-to-action on the primary page that links to the registration page. You must have lots of good conference-related content on the primary page, not just a thin paragraph and a link to the reg page. You'll then want, if at all possible, to have the reg page (especially after successful registration) redirect the visitor back to the primary page for the follow-up info after registration.
Ideally, you'll want to be able to insert your Analytics tracking code on the reg site as well, and then configure cross-domain tracking for it. That way you can easily track conversions. At the very least, if you can't set up your own Analytics on the reg pages, add a referral exclusion for the reg domain so a visitor coming back from it doesn't show up as a new visit on the primary site. You can then add conversion tracking to that return page.
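As an illustration, here's a minimal sketch of that setup with Universal Analytics (analytics.js), assuming the reg pages live on cvent.com; the property ID is a placeholder:

    <script>
      // Stub that queues commands until analytics.js loads
      window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
      ga.l = +new Date;
      // allowLinker lets this property accept the client ID passed from the other domain
      ga('create', 'UA-XXXXX-Y', 'auto', { allowLinker: true });
      ga('require', 'linker');
      // Decorate outbound links to the reg domain so the session continues there
      ga('linker:autoLink', ['cvent.com']);
      ga('send', 'pageview');
    </script>
    <script async src="https://www.google-analytics.com/analytics.js"></script>

The reg site would carry the mirror-image configuration, with its autoLink list pointing back at your own domain.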
These recommendations come from a background managing sites running up to 425 events per year, often with ticketing handled on a third-party site.
Hope that all makes sense?
Paul
-
So it sounds like you create an archive for the previous year's event, moving the previous year's highlights to another page so they can still be accessed?
The URL on our registration site changes each year, but we redirect the main landing page to that URL temporarily. I think ideally the content for the event should live on the main domain, and we should just utilize the third-party event site to manage the registrations.
Seems like there are so many ways to do this.
-
We have a website with a page that links to events in our industry.
Most of the events have a single homepage that is updated every year. These homepages have a description of the upcoming event and links to agendas, registration, lodging, sponsors, speakers, exhibitors, past years' highlights, etc. If you do this, your search engine visibility will develop over time, because almost everyone who links to your event will link to this single page year after year, from all of their websites, every time they mention the event. Repeat visitors will also be familiar with it, and getting information, registering and finding lodging is "just like they did last year".
However, other events change the URL and everything else every year. This is a really bad idea, because employees at businesses like mine, who link to events, will be snarling when they see that you have changed the URL again and must go on a treasure hunt to find it. Potential attendees will have trouble finding your event too. We have stopped linking to some of these events because finding the new pages, updating the links, and editing information is too demanding of employee time. We have not deleted a lot of events, just the ones that are a pain in the butt. When they get in touch with us to complain, we tell them: let us know when you are done playing musical URLs.
Related Questions
-
Weird Layout on Initial Website Load?
Whenever I open my site from an uncached source, like Google Incognito, for a split second it displays purple links and a white background while it loads the rest of the content. I've included a screenshot. Is there any way to fix that? The site is www.kemprugegreen.com.
Web Design | KempRugeLawGroup
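The symptom described (default link colors on a white background until the stylesheet arrives) is a classic flash of unstyled content. One common mitigation, assuming a render-blocking or slow-loading stylesheet is the cause, is to inline the critical above-the-fold rules; the selectors, values, and stylesheet path here are placeholders:

    <head>
      <!-- Inline just enough CSS that the first paint isn't browser-default -->
      <style>
        body { background: #1a1a2e; color: #f0f0f0; }
        a { color: #f0f0f0; text-decoration: none; }
      </style>
      <!-- The full stylesheet loads as usual afterwards -->
      <link rel="stylesheet" href="/css/main.css">
    </head>
-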
How to prevent development website subdomain from being indexed?
Hello awesome Moz Community! Our development team uses a sub-domain "dev.example.com" for our SEO clients' websites. This allows changes to be made to the dev site (U/X changes, forms testing, etc.) for client approval and testing. An embarrassing discovery was made: naturally, when you run a "site:example.com" search, the "dev.example.com" pages are being indexed. We don't want our clients' websites to get penalized or lose killer SERPs because of duplicate content. The solution being implemented is to edit the robots.txt file and block the dev site from being indexed by search engines. My question is, does anyone in the Moz Community disagree with this solution? Can you recommend another solution? Would you advise against using the sub-domain "dev." for live and ongoing development websites? Thanks!
Web Design | SproutDigital
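One caveat worth noting: a robots.txt Disallow prevents crawling, not indexing, so URLs already known to the engine can linger in results. A noindex directive on every dev page (or, more robustly, HTTP authentication on the whole subdomain) is the more reliable route. A minimal sketch:

    <!-- In the <head> of every page on dev.example.com.
         Crawlers must be able to fetch the page to see this tag,
         so don't combine it with a robots.txt block. -->
    <meta name="robots" content="noindex, nofollow">
-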
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi Moz Community, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version. This new site is a fully AngularJS-driven site. There are currently over 2 million pages, and as we're developing the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to be able to create static HTML snapshots of the pages that we care about the most, which will be available in the sitemap.xml.gz.

We established 3 completely static HTML control pages: one with no robots meta tag on the page, one with the robots NOINDEX meta tag in the head section, and one with a dynamic header (X-Robots-Tag) carrying the NOINDEX directive as well. We expected the one without the meta tag to at least get indexed, along with the homepage of the test site. In addition to those 3 control pages, we had an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header.

With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page and the control page without the meta tag. However, with Bing, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt directive open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool and the Page Preview tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking it not to index those pages.

I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Upon checking BWT, there's definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no result when running a site: command, though. Google responded perfectly and understood exactly which pages to index and crawl. Anyone else used dynamic headers or AngularJS who might be able to chime in, perhaps by running similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
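For context, the pre-render approach described relied on the old AJAX crawling scheme, which has since been deprecated. For pages without #! URLs, a fragment meta tag told supporting crawlers to fetch the static snapshot by re-requesting the URL with ?_escaped_fragment_= appended; Bing's support for this scheme was widely reported as inconsistent, which fits the behavior described:

    <!-- Signals crawlers that support the AJAX crawling scheme to fetch
         the pre-rendered snapshot of this page -->
    <meta name="fragment" content="!">
-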
Spanish website indexed in English: redirect to the Spanish or English version if I do a new website design?
Hi Moz users, I have this problem. We have a website in the Spanish language, but Google crawls it in English (the reasons are not important). We remade the entire website and now we are planning the move. The new website will have different language versions: English, Spanish and Portuguese. Somebody tells me that we have to redirect the old URLs (crawled in English) to the new English versions, not to the Spanish ones (the real language of the originals). Example: URL1, language Spanish, crawled in English --> redirect to the English version. The other option would be to redirect to the new Spanish version, which is what the visitor is expecting to find: URL1, language Spanish, crawled in English --> redirect to the Spanish version. What do you think? Which is the better option?
Web Design | NachoRetta
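Whichever redirect target is chosen, hreflang annotations on the new site tell Google which language each version is in, which addresses the underlying mis-crawl. A minimal sketch with hypothetical URLs:

    <!-- On each language version of a given page -->
    <link rel="alternate" hreflang="es" href="https://www.example.com/es/" />
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
    <link rel="alternate" hreflang="pt" href="https://www.example.com/pt/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
-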
Is there a way to redirect URLs with a hash-bang (#!) format?
Hi Moz, I'm trying to redirect www.site.com/locations/#!city to www.site.com/locations/city. This seems difficult, because anything after the hash character in the URL does not make it to the server and thus cannot be parsed for rewriting. Is there an SEO-friendly way to implement these redirects? Thanks for reading!
Web Design | DA2013
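Because the fragment never reaches the server, the redirect has to happen client-side. A minimal sketch of a script on /locations/ that handles the pattern in the question:

    <script>
      // "#!city" is only visible to the browser, so detect and redirect here
      if (window.location.hash.indexOf('#!') === 0) {
        var slug = window.location.hash.slice(2); // strip the "#!"
        // replace() keeps the old URL out of the browser history
        window.location.replace('/locations/' + slug);
      }
    </script>
-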
Duplicate content on websites for multiple countries
I have a client who has a website for their U.S.-based customers. They are currently adding a Canadian dealer and would like a second website with much of the same info as their current website, but with Canadian contact info, etc. What is the best way to do this without creating duplicate content that will get us penalized? If we create websites at ABCcompany.com and ABCCompany.ca, or something like that, will that get us around the duplicate content penalty?
Web Design | InvoqMarketing
-
Multi-page articles, pagination, best practice...
A couple of months ago we migrated a 12-year-old site -- about 2,000 pages -- to WordPress. The transition was smooth (301 redirects); we haven't lost much search juice. We have about 75 multi-page articles (posts); we're using a plugin (Organize Series) to manage the pagination. On the old site, all of the pages in a series had the same title. I've since heard this is not a good SEO practice (duplicate titles). The URLs were the same too, with a number (designating the page number) appended to the title text. Here are my questions:

1. Is there a best practice for titles and URLs of multi-page articles? Let's say we have an article named 'This is an Article'. What if I name the pages like this:
-- This is an Article, Page 1
-- This is an Article, Page 2
-- This is an Article, Page 3
Is that a good idea? Or should each page have a completely different title? Does it matter? I think for usability, the examples above are best; they give the reader context. What about URLs? Are these a good idea: /this-is-an-article-01, /this-is-an-article-02, and so on? Does it matter?

2. I've read that maybe multi-page articles are not such a good idea, from both usability and SEO standpoints. We tend to limit our articles to about 800 words per page. So, is it better to publish 'long' articles instead of multi-page ones? Does it matter? I think I'm seeing a trend on content sites toward long, one-page articles.

3. Any other gotchas we should be aware of related to SEO and multi-page articles?

Long post... we've gone back and forth on this a couple of times and need to get this settled. Thanks much! Jim
Web Design | jmueller0823
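For reference, the convention at the time for paginated articles was a unique, descriptive title per page plus rel="prev"/rel="next" links, so search engines could treat the series as one sequence. A sketch using the hypothetical URLs from the question:

    <!-- In the <head> of /this-is-an-article-02 -->
    <title>This is an Article, Page 2</title>
    <link rel="prev" href="https://www.example.com/this-is-an-article-01" />
    <link rel="next" href="https://www.example.com/this-is-an-article-03" />
-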
Site-wide footer links or single "website credits" page?
I see that you have already answered this question before, back in 2007 (http://www.seomoz.org/qa/view/2163), but I wanted to ask your current opinion on the same question: should I add a site-wide footer link to my client websites pointing to my website, or should I create a "website credits" page on my client's site, add this to the footer, and then link from within this page out to my website?
Web Design | eseyo