Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Using geolocation for dynamic content - what's the best practice for SEO?
-
Hello
We sell a product globally, but I want to use different keywords to describe the product based on location.
For this example, let's say in the USA the product is a "bathrobe" and in Canada it's a "housecoat" (same product, just a different name).
What this means: I want to show "bathrobe" content in the USA (lots of global searches) and "housecoat" content in Canada (fewer searches).
I know I can show the content using a geolocation plugin (I also found a caching plugin that gets around the issue of people seeing cached versions), using JavaScript, or using HTML5.
I also want a solution that enables someone in Canada searching for "bathrobe" to find our site through Google search. I want to rank for "bathrobe" in BOTH the USA and Canada.
I have read articles which say Google can read dynamic content in JavaScript, as well as content from the geolocation plugin. However, the plugins suggest Google crawls the content based on location too; I don't know whether the same applies to JavaScript.
Another option is having two separate pages (one for "bathrobe" and one for "housecoat") and using geolocation only for the main menu (so if a Canadian finds the bathrobe page through search, they will still see that page). Splitting the traffic across two pages may have an SEO impact, though.
Any suggestions or recommendations on what to do? What do other websites do? I'm a bit stuck.
Thank you so much!
Laura
P.S. I don't think we have enough traffic to justify separate country subdomains or subdirectories.
-
Hello Laura,
This is an interesting problem. I'm going to provide my opinion, but I highly recommend studying up on International SEO, which you can do at the links below:
https://moz.com/learn/seo/international-seo
http://www.aleydasolis.com/en/
I don't know what the plugin does, but if it generates a new URL (e.g. adds ?loc=ca or something like that) for the location change, you'll want to use rel="alternate" hreflang="*" tags, which would look something like this:
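A minimal sketch, assuming placeholder example.com URLs and the hypothetical ?loc=ca parameter from the example above - each version would reference itself and every alternate in its head:

```html
<!-- Hypothetical example: these go in the <head> of BOTH the US and Canadian versions.
     The URLs are placeholders, not your actual site structure. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/bathrobe/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/bathrobe/?loc=ca" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/bathrobe/" />
```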
Google recommends putting one language per page, so that would mean a different URL for each version.
**However, it sounds to me like all of this is done client-side using JavaScript, and that the URL doesn't change, only the content.** If this is the case, then as long as you are serving the same content to Googlebot crawling from Canada as you are to a visitor in Canada, you probably won't have any issues, as described here: https://webmasters.googleblog.com/2008/06/how-google-defines-ip-delivery.html.
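To make that concrete, here is a minimal, purely illustrative sketch of that client-side pattern - one URL for both markets, with a small script swapping the product term after the page loads. The window.visitorCountry variable is an assumption; it stands in for whatever country code your geolocation plugin actually exposes:

```html
<!-- Hypothetical sketch: the same URL serves both markets, and a script swaps the
     term client-side. Googlebot must see exactly what a local visitor would see. -->
<p>Our plush <span id="product-term">bathrobe</span> is made from 100% cotton.</p>
<script>
  // "window.visitorCountry" is an illustrative stand-in for whatever country code
  // your geolocation plugin or service makes available to the page.
  var visitorCountry = window.visitorCountry || 'US';
  if (visitorCountry === 'CA') {
    document.getElementById('product-term').textContent = 'housecoat';
  }
</script>
```

Note that with this pattern only "bathrobe" ever appears in the HTML source, which ties into the point about the source code below; whether your plugin works this way is worth checking.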
For the situation you described, it seems like you could put both keyword variations in the content and that would be good enough. The caveat is consistency: you don't want to spell "specialising" with an s in one line and "specializing" with a z in the next.
Another thing to look into is whether both versions of the content appear in the source code, or only one or the other. You definitely don't want multiple versions of the content sitting in the source code, but you also don't want to hide both versions via JavaScript and only load one or the other client-side - that creates even more problems.
One would think there would be a Vary: Location response header, similar to how responses are handled when content varies by user agent or cookie: https://www.fastly.com/blog/best-practices-for-using-the-vary-header. Alas, I can't find any use cases for this and it's not a "thing". I'm not sure why that is, but maybe an international SEO expert like Aleyda Solis would know. I'll ping her to see if she has time to weigh in on the thread.
-
Thank you. That link was helpful.
-
This is a great opportunity to test some of your ideas. It may be a good idea to create unique landing pages based on the most highly searched keyword per region and target each page to the corresponding region. Read more about great ways to rank geo-targeted pages - and make them convert: https://moz.com/blog/scaling-geo-targeted-local-landing-pages-that-really-rank-and-convert-whiteboard-friday
However, it may be a good idea to optimize the homepage, about page, and service description pages for the products rather than the locations. And since you're a national brand, it may be smart to try some PPC ads to geo-target your advertising and use those keywords accordingly.