Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Subdomain for ticketing of a client website (how to solve SEO problems caused by the subdomain/domain relationship)
-
We have a client who needs a ticketing solution for their domain (let's call it www.domain.com), which runs on WordPress - as does our custom ticketing solution. However, we want full control of the ticketing since we manage it for them, so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com. This will exist only for selling and issuing the tickets.
The question is: is there a way to do this without damaging their bounce rate and search rankings?
Since customers will come to www.domain.com, click the ticketing tab, and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com. Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And can we install Webmaster Tools for tickets.domain.com and set the preferred domain as www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonical each event ticketing page back to its matching event page, so: tickets.domain.com/event1 -> domain.com/event1.
If the ticketing pages are not meant to be indexed at all, I would also put the robots noindex tag on them (or a robots file on the whole subdomain) and keep an eye on GWT to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if you don't want these pages indexed at all, it's best to enforce that as completely as possible.
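As a rough sketch (assuming the event slugs match across the two hosts; the URLs here are illustrative), the head of a subdomain ticketing page could carry both signals:

```html
<!-- Hypothetical sketch: <head> of https://tickets.domain.com/event1 -->
<head>
  <!-- Point search engines at the matching event page on the main domain -->
  <link rel="canonical" href="https://www.domain.com/event1">
  <!-- Belt and braces: also ask engines not to index the ticketing copy -->
  <meta name="robots" content="noindex, follow">
</head>
```

The canonical maps each ticketing page to its own counterpart rather than the homepage, and the noindex backs it up since canonicals alone are only a hint.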
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com
2. Do you think I need to set the canonical URL on the various ticketing pages all back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We made all the header links of tickets.domain.com point straight back to their counterparts on domain.com.
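One note on point 2: per the earlier answer, each ticketing page should canonical to its matching page, not everything to the bare domain. As a sanity-check sketch (the host names and the `canonical_target` helper are hypothetical, purely for illustration):

```python
from urllib.parse import urlparse, urlunparse

def canonical_target(ticket_url: str) -> str:
    """Map a tickets.domain.com URL to its counterpart on www.domain.com.

    Each event page canonicals to the matching event page, not to the
    homepage, so the path (and any query string) is preserved.
    """
    parts = urlparse(ticket_url)
    host = parts.netloc.replace("tickets.", "www.", 1)
    return urlunparse(parts._replace(netloc=host))

# Each ticketing page maps to its own counterpart:
print(canonical_target("https://tickets.domain.com/event1"))
# -> https://www.domain.com/event1
# Only the bare subdomain root maps to the homepage:
print(canonical_target("https://tickets.domain.com/"))
# -> https://www.domain.com/
```

So `tickets.domain.com/event1` would canonical to `domain.com/event1`, `tickets.domain.com/event2` to `domain.com/event2`, and so on, with only the subdomain root pointing at the homepage.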
Does this seem like I basically got it all correct?
Thanks again
Adam -
Hi,
If, technically, that is the best solution for your needs, then here are a few things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not see any 'bounces' - Google should be able to figure this out as well.
2. You can install GWT for the subdomain as well. I don't think you can set a preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages from the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want), you could canonical them to their equivalents on the www domain. You could also meta-robots noindex them all. If any creep in anyway, you can use GWT to remove them.
If the subdomain is a complete clone and the experience is seamless, then why not make all links on the subdomain point back to the www domain pages? That way the only pages served from the subdomain would be the ticketing pages, and everything else would stay on www as normal.
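On point 1, a minimal sketch of reusing the same Universal Analytics (analytics.js) snippet on both hosts - the `UA-XXXXXXX-Y` property ID is a placeholder:

```html
<!-- Same snippet on www.domain.com and tickets.domain.com pages -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

// 'auto' sets the tracking cookie on the top-level domain, so a session
// carries across www.domain.com and tickets.domain.com without a "bounce".
ga('create', 'UA-XXXXXXX-Y', 'auto');
ga('send', 'pageview');
</script>
```

The key part is the `'auto'` cookie-domain setting, which is what makes subdomain tracking work by default.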
Hope it helps.