Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO?
-
Is there a better way to do what I'm trying to do?
-
I think people panic about cloaking and dynamic content too much, to be honest.
It would be easy to go overboard and set alarm bells ringing, but if you have a dynamic area on a well-structured, balanced page, I can't see it being an issue.
Caveat: I can't point to a direct comparison from my own work where content was served geographically. However, I've served dynamic content based on countless other criteria and never felt it harmed anything.
-
I am actually not really using this for SEO ranking purposes, although that might not be a bad side-effect.
I am using it to serve different content to different geographic locations, e.g. displaying the appropriate regional sales manager for each visitor's location.
Do you think that placing location-based dynamic content on the homepage might trigger a false cloaking flag from Googlebot? That wouldn't be too good.
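For reference, here's roughly how I have it wired up. This is a minimal sketch: google.loader.ClientLocation and its address fields come from the Google Loader API, but the salesManagers table and the sales-manager element are hypothetical stand-ins for my actual markup. The server renders a default manager and the script only swaps it when a lookup succeeds, so crawlers that don't execute JavaScript always see the same baseline content.

```html
<!-- Default manager is rendered server-side, so users without
     JavaScript (and non-executing crawlers) see real content. -->
<div id="sales-manager">Jane Doe, Head Office</div>

<script src="https://www.google.com/jsapi"></script>
<script>
  // Hypothetical lookup table: country code -> regional sales manager.
  var salesManagers = {
    'US': 'John Smith (North America)',
    'GB': 'Sarah Jones (UK and Ireland)',
    'DE': 'Karl Weber (DACH)'
  };

  // ClientLocation is populated when the loader script loads;
  // it is null if Google could not resolve the visitor's IP.
  var loc = google.loader.ClientLocation;
  if (loc && loc.address && salesManagers[loc.address.country_code]) {
    document.getElementById('sales-manager').innerHTML =
      salesManagers[loc.address.country_code];
  }
</script>
```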
-
This is a tricky one, as it depends on how it is being used.
Plenty of sites include dynamic content that differs between users. This can be for a number of legitimate reasons, including serving different geographic content. If you are targeting general (non-geographic) terms and every version of the page serves those phrases well, there should be no harm.
However, if the aim is to rank for [keyword placename] type searches and to use the geographic targeting to do that, then it is unlikely to work. In that case you would probably be better served by having distinct pages for each location and using the ClientLocation API to direct users towards the most relevant one for them, as sketched below.
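As a rough illustration of that approach, the sketch below keeps distinct regional pages and merely suggests the local version to the visitor. The /us/, /uk/ and /fr/ paths and the banner markup are hypothetical; only google.loader.ClientLocation is the real API. Suggesting rather than force-redirecting means crawlers, which don't run the script, can still reach and index every regional page.

```html
<script src="https://www.google.com/jsapi"></script>
<script>
  // Hypothetical mapping of country codes to regional landing pages.
  var regionalPages = {
    'US': '/us/',
    'GB': '/uk/',
    'FR': '/fr/'
  };

  var loc = google.loader.ClientLocation;
  var target = loc && loc.address && regionalPages[loc.address.country_code];

  // Offer a link instead of redirecting automatically: the user keeps
  // control, and every regional page remains crawlable and indexable.
  // Run this near the end of <body> so document.body exists.
  if (target && window.location.pathname.indexOf(target) !== 0) {
    var banner = document.createElement('div');
    banner.innerHTML = 'Looking for your local site? ' +
      '<a href="' + target + '">Visit the ' + loc.address.country_code +
      ' version</a>.';
    document.body.insertBefore(banner, document.body.firstChild);
  }
</script>
```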
Until today, 98% of the pages had identical meta descriptions. Again, I realize six months is not an eternity - but the site will not even show up for "business name + city,state" searches in Google. In fact, the only way I've seen it in organic results is to search for the exact URL. I would greatly appreciate any help.0