Best Website Structure / SEO Strategy for an Online Travel Agency?
-
Dear Experts!
I need your help pointing me in the right direction. So far I have found scattered tips around the Internet, but it is hard to form a full picture from these bits and pieces without professional advice. My primary goal is to understand how I should build my online travel agency website's (https://qualistay.com) structure so that I target my keywords on the correct pages and do not create duplicate content.
In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them, not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know whether Google treats this as duplicate content).
From what I have read so far, it’s better to target archive pages rather than every single property. At the moment my archive pages are:
- all properties (includes all property types and locations),
- a page for each location (includes all property types).
Does it make sense to add archive pages by property type in addition to, or instead of, the location ones if I, for instance, target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment the title of the respective archive page, "Properties to rent in costa adeje: villas, apartments", in principle targets both keywords...
Does using the same keyword on a single property listing cannibalize the ranking of the archive page it links back to? Or is it only a problem if Google specifically identifies it as duplicate content (which one can see in Google Search Console under HTML Improvements) and/or the archive page has more incoming links than a single property page?
If targeting only archive pages, how should I optimize them in a way that keeps them user-friendly? I have created (though not yet fully optimized) descriptions for each archive page just below the main header, but I keep them partially hidden (collapsible) using JavaScript in order to keep visitors' focus on the properties. I know that Google currently does not give hidden content full ranking weight, but with the upcoming mobile-first indexing Google has said it will not penalize mobile sites for collapsible content and will use the mobile version to evaluate the desktop one. Does this mean I should not worry about hidden content anymore, or should I move the description to the bottom of the page and make it fully visible?
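To show what I mean by the collapsible block, here is a simplified sketch with placeholder class names (not my exact markup); the point is that the full description stays in the page and is only visually collapsed:

  const toggle = document.querySelector<HTMLButtonElement>('.description-toggle');
  const description = document.querySelector<HTMLElement>('.archive-description');

  if (toggle && description) {
    toggle.addEventListener('click', () => {
      const expanded = toggle.getAttribute('aria-expanded') === 'true';
      toggle.setAttribute('aria-expanded', String(!expanded));
      // The full text remains in the DOM; a CSS rule on .is-collapsed just limits its visible height.
      description.classList.toggle('is-collapsed');
    });
  }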
Your feedback will be highly appreciated!
Thank you!
Dmitry
-
For an online travel agency, a robust website structure and SEO strategy are vital. Implement a user-friendly interface with intuitive navigation, making it easy for visitors to search and book travel options. Optimize website content with relevant keywords, meta tags, and descriptive URLs to improve search engine visibility. Incorporate high-quality images, engaging travel guides, and customer reviews to enhance user experience and encourage longer site engagement. Utilize responsive design for seamless browsing across devices, and prioritize mobile optimization for on-the-go travelers.
This structure and strategy will help your online travel agency stand out in a competitive market and provide a superior experience for your customers.
-
Below is a response to the query about the best website architecture and SEO strategy for an online travel agency:
- Website Structure:
• Homepage: Introduce your agency, highlight your main services, and make the rest of the site easy to navigate.
• Destinations: Give each destination you serve its own page, with rich content and appealing pictures.
• Tours/Packages: Provide a dedicated tours/packages section with prices, itineraries, and availability.
• Booking/Contact: Provide an easy-to-use booking form along with several contact options (phone, chat, and email) so customers can ask questions or get help.
• Blog: Publish travel tips, destination guides, and news from your agency to engage visitors and support SEO.
• Reviews: Show customer reviews and testimonials to build trust and give visitors confidence in your services.
• About Us: Tell visitors who you are, your agency's history, and what you aim to accomplish, so they can form a personal connection with you.
- SEO Strategy:
• Keyword research comes first: identify the phrases travelers actually use when searching for travel services and products.
• For on-page optimization, work targeted keywords into title tags, meta descriptions, headers, and image alt text (see the rough sketch further down).
• Keep your blog stocked with high-quality, informative content that naturally incorporates your target keywords.
• Strategic linking between pages and posts on your site will improve navigation as well as search engine optimization.
• To boost your site's authority, you should generate quality backlinks from renowned travel-related websites.
• Optimize your website for mobile devices, both for user experience and for rankings.
In a competitive marketplace, this structure and strategy can set your online travel site apart while offering an enhanced experience for clients.
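To make the on-page point concrete, here is a rough sketch of how archive-page title tags and meta descriptions could be templated; the interface, copy, and brand placeholder are illustrative assumptions, not a prescription:

  // Hypothetical templating helper for archive-page meta tags (placeholder names and copy).
  interface ArchivePage {
    propertyType: 'villas' | 'apartments';
    location: string; // e.g. 'Costa Adeje'
  }

  function buildMetaTags(page: ArchivePage): string {
    const title = `${capitalize(page.propertyType)} for rent in ${page.location} | Your Agency`;
    const description =
      `Browse hand-picked ${page.propertyType} in ${page.location}, Tenerife, ` +
      `with transparent pricing and instant booking.`;
    return [
      `<title>${title}</title>`,
      `<meta name="description" content="${description}">`,
    ].join('\n');
  }

  function capitalize(text: string): string {
    return text.charAt(0).toUpperCase() + text.slice(1);
  }

  // Example: buildMetaTags({ propertyType: 'villas', location: 'Costa Adeje' })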
Yours truly,
Sanskar Gupta
B Two Holidays
-
Hello,
We at Donutz Digital, a digital marketing agency, have some travel-niche clients, so I believe we can help you or others in a similar situation.
Why don't you send us an inquiry directly? We will answer as soon as possible with possible options and maybe an offer, so you can keep your hands free of technical tasks like these and focus on the ones you feel more comfortable with.
-
I am working on my website (https://www.dejourneys.com) and found that some websites, like yelp.com and similar ones, require a US phone number and address.
How can I get a strong link from those websites, and are there any others that can help me get strong backlinks for my travel agency?
Regards,
Raheel
-
Hi,
Cool question! I previously ran a startup that was essentially an aggregator, something similar to an OTA, but we were aggregating classes instead of properties/homestays. I found that the best way to structure the site was something like this:
1. Home (Targeting the biggest, baddest keyword you can find)
https://qualistay.com/
1.2 Category pages
Broad keywords in each category (in your case, 'tenerife south apartments for rent' etc)
You currently have this as https://qualistay.com/properties/tenerife/
I'd have gone with creating multiple 'category' pages like
https://qualistay.com/tenerife-south/apartments
https://qualistay.com/tenerife-south/villas
https://qualistay.com/tenerife-north/apartments
https://qualistay.com/tenerife-north/villas
1.2.1 Sub-Category pages
Still relatively broad, but more specific keywords
You didn't choose to sub-categorize these pages even more, but here's what I would have done:
https://qualistay.com/tenerife-south/apartments/adeje
https://qualistay.com/tenerife-south/villas/adeje
https://qualistay.com/tenerife-south/apartments/arico
https://qualistay.com/tenerife-south/villas/arico
https://qualistay.com/tenerife-south/apartments/granadilla-de-abona
1.2.1.1 Property pages
Specific keywords
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria
These pages would tend to be targeting the so-called 'brand keyword' of each individual property.
Structuring your site this way lets you include the targeted keywords in your URLs and rank almost every single page efficiently based purely on the location of each property. In this manner you would be able to rank for the top-tier keywords, which I'm guessing are 'tenerife villas' and 'tenerife apartments', the second-tier keywords such as 'tenerife south villas for rent' and 'tenerife south apartments for rent', and the third-tier keywords such as 'playa de las americas villas for rent'. You also get the benefit of ranking for each individual property's 'brand name', like 'villa victoria tenerife south'.
If several properties happen to be in the same building, you can sub-categorize even further, like
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria/level-1
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria/level-2
Hope this helps!
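If it helps, here is a rough sketch of how those URLs could be generated from property data; the field names and slug rules are hypothetical, and the exact implementation would depend on your CMS:

  // Hypothetical sketch: building the hierarchical URLs described above from property data.
  interface Property {
    region: string;                  // e.g. 'Tenerife South'
    type: 'villas' | 'apartments';
    town: string;                    // e.g. 'Playa de las Américas'
    name: string;                    // e.g. 'Villa Victoria'
  }

  // Lower-case, strip accents, and replace anything non-alphanumeric with hyphens.
  const slugify = (value: string): string =>
    value
      .toLowerCase()
      .normalize('NFD')
      .replace(/[\u0300-\u036f]/g, '')
      .replace(/[^a-z0-9]+/g, '-')
      .replace(/^-+|-+$/g, '');

  const propertyUrl = (p: Property): string =>
    `https://qualistay.com/${slugify(p.region)}/${p.type}/${slugify(p.town)}/${slugify(p.name)}`;

  // propertyUrl({ region: 'Tenerife South', type: 'villas', town: 'Playa de las Américas', name: 'Villa Victoria' })
  // -> 'https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria'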
Related Questions
-
SEO + Structured Data for Metered Paywall
I have a site that will have 90% of the content behind a metered paywall, so all content is accessible in a metered way. Users who aren't logged in will have access to 3 articles (of any kind) in a 30-day period; if they try to access more in that period, they will hit a paywall. I was reading this article on how to handle structured data with Google for content behind a paywall: https://www.searchenginejournal.com/paywalls-seo-strategy/311359/ However, the content is not ALWAYS behind a paywall, since it is metered, so if a new user comes to the site, they can see the article (regardless of what it is). Is there a different way to handle content that will SOMETIMES be behind a paywall because of a metered strategy? Theoretically I want 100% of the content indexed and accessible in SERPs; whether it is accessible will just depend on the user's history (cookies) with the site. I hope that makes sense.
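For reference, the markup pattern from that article looks roughly like this (simplified sketch; the headline, selector, and values are placeholders rather than my actual implementation):

  // Simplified sketch of the paywalled-content JSON-LD pattern (placeholder values).
  const articleStructuredData = {
    '@context': 'https://schema.org',
    '@type': 'NewsArticle',
    headline: 'Example article headline',
    isAccessibleForFree: false,          // the page can be gated for some visitors
    hasPart: {
      '@type': 'WebPageElement',
      isAccessibleForFree: false,
      cssSelector: '.paywalled-content', // selector wrapping the gated section
    },
  };

  // Emitted into the page head as <script type="application/ld+json">...</script>.
  console.log(JSON.stringify(articleStructuredData, null, 2));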
Technical SEO | | triveraseo0 -
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, our tech team has recently decided to try switching our product pages to be JavaScript dependent; this includes links, product descriptions, and things like breadcrumbs rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that's generally regarded as what a crawler should be able to crawl, not necessarily what Googlebot will actually be able to crawl and index. Any thoughts on this? Is this concern valid? Thanks!
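For what it's worth, the kind of quick check I have in mind is comparing the raw HTML with the rendered HTML, roughly like this (sketch only; the URL and the copy being checked for are placeholders):

  import puppeteer from 'puppeteer';

  // Rough sketch: compare raw HTML (no JS executed) with rendered HTML (after the Angular app boots).
  async function compareRawVsRendered(url: string, expectedCopy: string): Promise<void> {
    const raw = await (await fetch(url)).text();          // what a non-rendering crawler would see

    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle0' });  // wait for client-side rendering to settle
    const rendered = await page.content();
    await browser.close();

    console.log('copy present in raw HTML:     ', raw.includes(expectedCopy));
    console.log('copy present in rendered HTML:', rendered.includes(expectedCopy));
  }

  // compareRawVsRendered('https://qa.example.com/product/123', 'expected product description');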
Technical SEO | | znotes0 -
SEO impact of the anatomy of URL subdirectory structure?
I've been pushing hard to get our Americas site (DA 34) integrated with our higher domain authority (DA 51) international website. Currently our international website is set up in the following format: website.com/us-en/, website.com/fr-fr/, etc. The problem I am facing is that I need my development framework installed in its own directory. It cannot be at the root of the website (website.com), since that is where the other websites (us-en, fr-fr, etc.) are being generated from. Though we will have control of /us-en/ after the integration, I cannot use that as the website's main directory, since the Americas website is going to be designed for scalability (eventually adopting all regions and languages), so it cannot be region specific. What we're looking at is website.com/[base]/us-en. I'm afraid that if [base] has any length to it in terms of characters, it is going to dilute the SEO value of whatever comes after it in the URL (website.com/[base]/us-en/store/product-name.html). Any recommendations?
Technical SEO | | bearpaw0 -
Best strategy to handle over 100,000 404 errors.
I was recently given a site that has over one hundred thousand 404 errors listed in Google Webmaster Tools. It is really odd because, according to Google Webmaster Tools, the pages linking to these 404 pages are also pages that no longer exist (they are 404 pages themselves). These errors were the result of a site migration. I'd appreciate any input on how one might go about auditing and repairing large numbers of 404 errors. Thank you.
Technical SEO | | SEO_Promenade0 -
Image height/width attributes: how important are they, and should a best-practice site include them as standard?
Hi. How important are the image height/width attributes, and would you expect a best-practice site to have them included? I hear not having them can slow down page load time; is that correct? Are there any other issues from not having them? I know some relate to social sharing (I know bufferapp prefers images with height/width attributes to draw into its selection of image options when you post). Most importantly, though, would you expect them to be intrinsic to sites designed according to best-practice guidelines? Thanks
Technical SEO | | Dan-Lawrence0 -
Are links in menus to external sites bad for SEO?
We're building a blog on a subdomain of the main site. The main site is on Shopify and the blog will be on wordpress. I'd like to keep the user experience as simple as possible so I'd like to make the blog look exactly like the main Shopify site. This means having a menu in the blog that duplicates the Shopify menu. So is it bad for SEO to have someone click on the 'about us' button in the blog subdomain (blog.mainsite.com) which takes you to the 'about us page' on the main shopify website (mainsite.com)?
Technical SEO | | acs1110 -
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but I am wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
Technical SEO | | PioneerServices0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead9 0