Using geolocation for dynamic content - what's the best practice for SEO?
-
Hello
We sell a product globally but I want to use different keywords to describe the product based on location.
For this example let’s say in USA the product is a "bathrobe" and in Canada it’s a "housecoat" (same product, just different name).
What this means… I want to show "bathrobe" content in USA (lots of global searches) and "housecoat" in Canada (less searches).
I know I can show the content using a geolocation plugin (I also found a caching plugin which gets around the issue of people seeing cached versions), using JavaScript, or HTML5.
I want a solution which enables someone in Canada searching for "bathrobe" to be able to find our site through Google search though too. I want to rank for "bathrobe" in BOTH USA and Canada.
I have read articles which say Google can read dynamic content rendered by JavaScript, as well as by the geolocation plugin. However, the plugins suggest Google also crawls the content based on location. I don't know how the JavaScript approach behaves.
Another option is having two separate pages (one for "bathrobe" and one for "housecoat") and using geolocation for the main menu (if someone finds the other page, i.e. the bathrobe page, through a Canadian search, they will still see it though). This may have an SEO impact by splitting the traffic, though.
Any suggestions or recommendations on what to do? What do other websites do? I'm a bit stuck.
Thank you so much!
Laura
PS: I don't think we have enough traffic to add subdomains or subdirectories.
-
Hello Laura,
This is an interesting problem. I'm going to provide my opinion, but I highly recommend studying up on International SEO, which you can do at the links below:
https://moz.com/learn/seo/international-seo
http://www.aleydasolis.com/en/
I don't know what the plugin does, but if it generates a new URL for the location change (e.g. adds ?loc=ca or something like that), you'll want to use rel="alternate" hreflang="*" tags, which would look something like this:
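A minimal sketch, assuming one English page targeted at the US and one at Canada (the example.com URLs and the ?loc=ca parameter are placeholders, not anything from your plugin):

```html
<!-- Hypothetical URLs: assumes the Canadian version lives at a
     distinct URL, e.g. one the plugin generates with ?loc=ca -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/bathrobe" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/bathrobe?loc=ca" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/bathrobe" />
```

Each version would carry the full set of tags, pointing at itself and at its alternates.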
Google recommends putting one language per page, so that would mean a different URL for each version.
**However, it sounds to me like all of this is done client-side using JavaScript, and that the URL doesn't change, only the content.** If this is the case, then as long as you are serving the same content to Googlebot crawling in Canada as you are to a visitor in Canada, you probably won't have any issues, as described here: https://webmasters.googleblog.com/2008/06/how-google-defines-ip-delivery.html
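For what it's worth, a client-side swap along those lines might be sketched like this (the country codes, the hypothetical visitorCountry value, and the .product-term class are all assumptions for illustration, not anything from the plugin):

```javascript
// Pick the product term from a visitor's two-letter country code.
// Canada gets "housecoat"; everyone else gets "bathrobe".
function termForCountry(country) {
  return country === "CA" ? "housecoat" : "bathrobe";
}

// In the page, something like (assuming visitorCountry came from a
// geolocation lookup, and product mentions are marked with a class):
// document.querySelectorAll(".product-term")
//   .forEach((el) => { el.textContent = termForCountry(visitorCountry); });
```

The key point from the IP-delivery post above still applies: Googlebot crawling from a Canadian IP should see exactly what a Canadian visitor sees.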
For the situation you described, it seems like you could put both keyword variations in the content and that would be good enough. But you wouldn't want to spell specialising with an S in one line and specializing with a Z in the next.
Another thing to look into is whether both versions of the content appear in the code, or just one or the other. You definitely don't want multiple versions of the content sitting in the source code, and you also don't want to hide both versions and then use JavaScript to reveal only one client-side. That creates even more problems.
One would think there would be a Vary: Location response header, similar to how responses are handled when content varies by user-agent or cookie: https://www.fastly.com/blog/best-practices-for-using-the-vary-header . Alas, I can't find any use cases of this and it's not a "thing". I'm not sure why that is, but maybe an International SEO expert like Aleyda Solis would know. I'll ping her into the thread if she has time.
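For reference, this is the sort of response the existing pattern produces when content varies by user-agent or cookie (a generic illustration, not output from any particular server):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Vary: User-Agent, Cookie
```

Vary only takes request header names, and there is no standard request header carrying the visitor's location, which is presumably why a Vary: Location never became a thing.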
-
Thank you. That link was helpful.
-
This is a great opportunity to test some of your ideas. It may be a good idea to create unique landing pages based on the most highly searched keyword per region and target them on the corresponding geo-page. Read more about ways to rank geo-targeted pages and make them convert: https://moz.com/blog/scaling-geo-targeted-local-landing-pages-that-really-rank-and-convert-whiteboard-friday
However, it may be a good idea to optimize the homepage, about page, and service description pages for the products rather than the locations. And, since you're a national brand, it may be smart to try some PPC ads to geo-target your advertising and use those keywords accordingly.