What is the best way to handle annual events on a website?
-
Every year our company has a user conference with 300-400 attendees. I've just begun giving the event more of a presence on our website. I'm wondering, what is the best way to handle highlights from previous years? Would it be to create an archive (e.g. www.companyname.com/eventname/2015) while constantly updating the main landing page to promote the current event? We also use an event website (Cvent) to handle our registrations, so once we have an agenda for the current year's event I do a temporary redirect from the main landing page to the registration website. I don't really like this practice, and I feel it might be better to keep all of the info on the main domain. Wondering if anybody has any opinions or feedback on that process as well.
Just looking for best practices or what others have done and have had success with.
-
Thank you, Paul,
This is exactly what I needed. I've been trying to push us in this direction but it's sometimes hard to break old habits. We might even be able to save a bit of money going this route.
Thanks again for the input!
-Brandon
-
Yup, there are many ways to do it, but it's vastly superior (I'd say critical) for ongoing ranking and traffic generation to keep the primary content on a consistent URL on the main domain. It's also vastly better for the user experience.
As EGOL recommends, definitely keep the URL on your own site consistent from year to year. Create the archives of each year's highlights as children of the primary page, and make sure you are linking to the current year's page from each of the year-archive pages. This gives search engines a clear understanding of the relative hierarchy and currency of the pages.
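To make that concrete, here's a minimal sketch of what the navigation on a year-archive page might look like, using the URL structure from your question (the paths and years are placeholders):

```html
<!-- Hypothetical archive page at www.companyname.com/eventname/2015/ -->
<!-- Each year's archive links up to the evergreen primary page and across to
     its siblings, making the hierarchy obvious to crawlers and visitors alike -->
<nav>
  <a href="/eventname/">Event Name: this year's conference</a>
  <ul>
    <li><a href="/eventname/2015/">2015 highlights</a></li>
    <li><a href="/eventname/2014/">2014 highlights</a></li>
    <li><a href="/eventname/2013/">2013 highlights</a></li>
  </ul>
</nav>
```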
Do NOT 302 this page to the registration page. Simply add a call-to-action on the primary page that points to the registration page. You must have lots of good conference-related content on the primary page, not just a thin paragraph and a link to the reg page. You'll then want - if at all possible - to have the reg page (especially after successful registration) redirect the visitor back to the primary page for the follow-up info.
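If the registration platform lets you set a custom confirmation page or inject your own markup (many event tools do, though the exact mechanism varies - treat this as a sketch, not a documented Cvent feature), the hand-off back to your domain can be as simple as:

```html
<!-- Hypothetical confirmation page on the registration site -->
<!-- After a successful registration, return the visitor to the main domain
     for the follow-up info (agenda, travel, lodging, etc.) -->
<meta http-equiv="refresh" content="5; url=https://www.companyname.com/eventname/thanks/">
<p>
  You're registered! We'll take you back to the conference site in a few seconds,
  or <a href="https://www.companyname.com/eventname/thanks/">continue now</a>.
</p>
```

The /thanks/ URL is a made-up example; any page on the primary domain works, but a dedicated return page makes the conversion tracking described next much easier.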
Ideally, you'll want to be able to insert your Analytics tracking code on the reg site as well, and then configure cross-domain tracking for it. That way you can easily track conversions. At the very least, if you can't set up your own Analytics on the reg page, add a referral exclusion for it so the visitor coming back from the reg page doesn't show as a new visit on the primary page. You can then add conversion tracking to that return page.
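For the cross-domain piece, here's a minimal sketch using Google Analytics' linker configuration (gtag.js syntax; the measurement ID and domains are placeholders, and note the referral exclusion itself is set in the GA admin interface, not in code):

```html
<!-- On pages of the primary domain: load GA and declare the domains that
     should share one session, so links to the reg site carry the client ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag() { dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX', {
    linker: { domains: ['companyname.com', 'cvent.com'] }
  });
</script>
```

With the same snippet on the reg pages, the visit shows up as one continuous session, and the registration can be recorded as a goal or event against the visitor's original traffic source.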
These recommendations come from a background managing sites running up to 425 events per year, often with ticketing handled on a third-party site.
Hope that all makes sense?
Paul
-
So it sounds like you create an archive for the previous year's event? Moving the previous year's highlights to another page so they can still be accessed?
The URL on our registration site changes, but we redirect the main landing page to that URL temporarily. I think ideally the content for the event should be on the main domain, using the third-party event site only to manage registrations.
Seems like there are so many ways to do this.
-
We have a website with a page that links to events in our industry.
Most of the events have a single homepage that is updated every year. These homepages have a description of the upcoming event and links to agendas, registration, lodging, sponsors, speakers, exhibitors, past years' highlights, etc. If you do this, your search engine visibility will develop over time, because almost everyone who links to your event will link to this single page year after year, from all of their websites, every time they mention the event. Also, repeat visitors will be familiar with it, and getting information, registering, and finding lodging will be "just like they did last year".
However, other events change the URL and everything else every year. This is a really bad idea, because employees at businesses like mine, who link to events, will be snarling when they see that you have changed the URL again and must go on a treasure hunt to find it. Potential attendees will have trouble finding your event too. We have stopped linking to some of these events because finding the new pages, updating the links, and editing information is too demanding of employee time. We have not deleted a lot of events - just the ones that are a pain in the butt. When they get in touch with us to complain, we tell them: let us know when you are done playing musical URLs.