Subdomain for ticketing on a client website (how to solve SEO problems caused by the subdomain/domain relationship)
-
We have a client who needs a ticketing solution for their domain (let's call it www.domain.com), which runs on WordPress, as does our custom ticketing solution. However, we want full control of the ticketing since we manage it for them, so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com, which will exist only for selling and issuing tickets.
The question is: is there a way to do this without damaging their bounce rate and SEO performance?
Since customers will come to www.domain.com, click the ticketing tab, and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com. Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And can we also install Webmaster Tools for tickets.domain.com and set the preferred domain to www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonicalize each event ticketing page back to its matching event page, i.e. tickets.domain.com/event1 -> domain.com/event1 (see the sketch below).
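For illustration, a minimal sketch of what that tag could look like in the head of a ticketing page (the URLs are placeholders for your real event pages):

<!-- on tickets.domain.com/event1, pointing back to the main-domain version -->
<link rel="canonical" href="http://domain.com/event1">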
If the ticketing pages are not meant to be indexed at all, then I would also put the robots noindex tag on them (or a robots.txt file blocking the whole subdomain) and keep an eye on GWT to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if your plan is for these pages not to be indexed at all, it's best to ensure that as completely as possible.
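A minimal sketch of both options (illustrations with placeholders, not a definitive setup):

<!-- Option A: meta robots tag in the head of each ticketing page -->
<meta name="robots" content="noindex, follow">

# Option B: robots.txt served at tickets.domain.com/robots.txt, blocking the whole subdomain
User-agent: *
Disallow: /

One caveat: a robots.txt disallow stops Google from crawling the pages at all, so it would never see a noindex or canonical tag placed on them; pick one mechanism per page rather than stacking them.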
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com.
2. Do you think I need to set the canonical URL on all the various ticketing pages back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We also made all the header links on tickets.domain.com point straight back to their counterparts on domain.com.
Does this seem like I basically got it all correct?
Thanks again
Adam
-
Hi,
If, technically, that is the best solution for your needs, then there are a few things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not see any 'bounces'; Google should be able to figure this out as well (see the sketch below this list).
2. You can also install GWT for the subdomain. I don't think you can set a preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages from the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want), you could canonicalize them to their equivalents on the www domain. You could also meta-robots noindex them all. If they creep in anyway, you can use GWT to remove them.
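For reference, a minimal sketch of the standard Universal Analytics (analytics.js) snippet; the property ID UA-XXXXXX-Y is a placeholder for your own:

<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
// 'auto' writes the tracking cookie on the top-level domain (domain.com),
// so a visit that moves from www.domain.com to tickets.domain.com stays one session
ga('create', 'UA-XXXXXX-Y', 'auto');
ga('send', 'pageview');
</script>

The 'auto' cookie-domain setting is what gives you subdomain tracking by default; with the older ga.js library you would have had to set the cookie domain manually.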
If the subdomain is a complete clone and the experience is seamless, then why not make all links on the subdomain point back to the www pages? That way the only pages actually used on the subdomain would be the ticketing pages, and the rest would be served from www as normal.
Hope it helps.