Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO?
-
Is there a better way to do what I'm trying to do?
-
I think people panic about cloaking/dynamic content too much to be honest.
It would be easy to go overboard and start alarm bells ringing, but if you have a dynamic area on a well structured and balanced page I can't see it being an issue.
Caveat: I can't think of a clear comparison to something I have worked on in terms of serving it geographically. However I've done similar based on countless other criteria and not felt it has harmed anything.
-
I am actually not really using this for SEO ranking purposes, although that might not be a bad side-effect.
I am using it to serve different content to different geographic locations. e.g. displaying the correct regional sales managers for the correct locations etc.
Do you think that placing dynamic content based on location on the homepage might look like cloaking to Googlebot? That wouldn't be too good.
-
Tricky this, as it depends on how it is being used.
Plenty of sites include dynamic content that will differ for different users. This can be for a number of legitimate reasons, including serving different geographic content. If you are targeting general (non-geographic) terms and every version of the page serves those phrases well, there should be no harm.
However, if the aim is to rank for [keyword placename] type searches and use the geographic targeting to do that, then that is unlikely to work. If that were the aim, you would probably be better served by having distinct pages for each location and using the ClientLocation API to direct users towards the most relevant one for them.
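For what it's worth, a minimal sketch of that pattern: keep the region lookup separate from the geolocation call, and always fall back to a default contact so crawlers and visitors with no detectable location still see valid content. All names, regions, and email addresses below are invented for illustration.

```javascript
// Hypothetical region-to-contact table; invented data for illustration only.
var REGIONAL_MANAGERS = {
  GB: { name: "UK Sales Manager", email: "uk-sales@example.com" },
  FR: { name: "France Sales Manager", email: "fr-sales@example.com" }
};

// What crawlers and visitors with no resolvable location get.
var DEFAULT_MANAGER = { name: "Global Sales Team", email: "sales@example.com" };

// Unknown, missing, or unmapped regions fall back to the default contact.
function pickRegionalManager(countryCode) {
  return REGIONAL_MANAGERS[countryCode] || DEFAULT_MANAGER;
}

// Browser integration point (commented out so the sketch runs anywhere):
// after google.load(...) completes, google.loader.ClientLocation is either an
// object exposing address.country_code or null when no location is resolvable.
//
// var loc = window.google && google.loader.ClientLocation;
// var manager = pickRegionalManager(loc && loc.address.country_code);
// document.getElementById("sales-contact").textContent = manager.name;
```

Because the default branch always renders real content, every version of the page stays meaningful to a crawler that has no geographic signal.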
Related Questions
-
Is there an easy way to hide one of your URLs on Google search, rather than redirecting?
We don't want to redirect to a different page, as some people still use it; we just don't want it to appear in search.
Technical SEO | TheIDCo
-
Thoughts on different base URLs for different website language?
Hello mozzers, Currently in the process of setting up a new website for a new entity. I was wondering what your thoughts were on using different base URLs for different languages. Example:
ABCgroup.com -> English
groupeABC.com -> French
I've never done this before; I've been one to prefer using a subfolder structure. However, for this case, the expected visitors are truly split between 2 languages, and therefore having a base URL in the visitor's language is appealing. Would this approach be detrimental if all pages have a proper hreflang tag? Thanks!
Technical SEO | yacpro13
-
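If the two-domain route is taken, reciprocal hreflang annotations on every page pair are the standard way to declare the language split. A hedged sketch using the example domains from the question (the https scheme and root paths are assumptions):

```html
<!-- In the <head> of each page, on BOTH domains, pointing at the
     equivalent page on each domain plus a default for other locales -->
<link rel="alternate" hreflang="en" href="https://ABCgroup.com/" />
<link rel="alternate" hreflang="fr" href="https://groupeABC.com/" />
<link rel="alternate" hreflang="x-default" href="https://ABCgroup.com/" />
```

The annotations must be reciprocal: if the English page references the French one, the French page has to reference the English one back, or the pair may be ignored.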
What's with the redirects?
Hi there,
I have a strange issue where pages are redirecting to the homepage. Let me explain - my website is http://thedj.com.au. Now when I type in www.thedj.com.au/payments it redirects to https://thedj.com.au (even though it should be going to the page https://thedj.com.au/payments). Any idea why this is and how to fix? My htaccess file is below:

# BEGIN HTTPS Redirection Plugin
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^home.htm$ https://thedj.com.au/ [R=301,L]
RewriteRule ^photos.htm$ http://photos.thedj.com.au/ [R=301,L]
RewriteRule ^contacts.htm$ https://thedj.com.au/contact-us/ [R=301,L]
RewriteRule ^booking.htm$ https://thedj.com.au/book-dj/ [R=301,L]
RewriteRule ^downloads.htm$ https://thedj.com.au/downloads/ [R=301,L]
RewriteRule ^payonline.htm$ https://thedj.com.au/payments/ [R=301,L]
RewriteRule ^price.htm$ https://thedj.com.au/pricing/ [R=301,L]
RewriteRule ^questions.htm$ https://thedj.com.au/faq/ [R=301,L]
RewriteRule ^links.htm$ https://thedj.com.au/links/ [R=301,L]
RewriteRule ^thankyous/index.htm$ https://thedj.com.au/testimonials/ [R=301,L]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://thedj.com.au/ [L,R=301]
</IfModule>
# END HTTPS Redirection Plugin

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

RewriteCond %{HTTP_HOST} ^mrdj.net.au$ [OR]
RewriteCond %{HTTP_HOST} ^www.mrdj.net.au$
RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

RewriteCond %{HTTP_HOST} ^mrdj.com.au$ [OR]
RewriteCond %{HTTP_HOST} ^www.mrdj.com.au$
RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

RewriteCond %{HTTP_HOST} ^thedjs.com.au$ [OR]
RewriteCond %{HTTP_HOST} ^www.thedjs.com.au$
RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

RewriteCond %{HTTP_HOST} ^theperthweddingdjs.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.theperthweddingdjs.com$
RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

RewriteCond %{HTTP_HOST} ^thedjs.net.au$ [OR]
RewriteCond %{HTTP_HOST} ^www.thedjs.net.au$
RewriteRule ^/?$ "https://thedj.com.au" [R=301,L]
Technical SEO | HeadStud
-
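One thing worth checking in the htaccess posted above: the HTTPS catch-all rule redirects to the bare homepage and discards the captured path, which would produce exactly the /payments-to-homepage behaviour described. A hedged sketch of a path-preserving version (assuming the rest of the file stays as posted and no other plugin rewrites the request first):

```apache
RewriteCond %{HTTPS} off
# $1 re-attaches the originally requested path, so /payments
# redirects to https://thedj.com.au/payments instead of the homepage
RewriteRule ^(.*)$ https://thedj.com.au/$1 [L,R=301]
```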
Using a Colo Load Balancer to serve content
So this is a little complicated (at least for me...) We have a client who is having us rebuild and optimize about 350 pages of their website in our CMS. However, the rest of the website will not be on our CMS. We wanted to build these pages on a sub-domain that is pointed to our IPs so it could remain on our CMS--which the client wants. However, they want the content on a sub-directory. This would be fine but they will not point the main domain to us and for whatever reason this becomes impossible per their Dev team. They have proposed using a Colo Load Balancer to deliver the content from our system (which will be on the sub domain) to their sub directory. This seems very sketchy to me. Possible duplicate content? Would this be a sort of URL masking? How would Google see this? Has anyone ever even heard of doing anything like this?
Technical SEO | Vizergy
-
On-Page Report Says 'F', and I'm Confoozled As to Why
I'm primarily interested in how we failed in our "Broad Keyword Usage in Title" category. The Keyword Pair we're gunnin' for is: "Mac Windows" Our current page title is: "CrossOver: Windows on Mac and Linux with the easiest and most affordable emulator - CodeWeavers" This is, I grant, ugly. However, bear with me. SEOMoz Report Card says "Easy Fix!" and suggests: "Employ the keyword in the page title, preferrably as the first words in the element." I humbly submit that "Mac" and "Windows" IS in the page title. So what am I missing? Is it the placement of the words relative to each other, or relative to the start of the sentence? Or is the phrase "CrossOver:" somehow blocking the rest of the sentence from being read? Are colons evil? I'm genuinely mystified as to why (from a structural standpoint) our existing title tag is failing this test, and I'd be delighted for answers and/or feedback. Thanks in advance.
Technical SEO | CodeWeavers
-
What's the best way to solve this site's duplicate content issues?
Hi,
The site is www.expressgolf.co.uk and is an e-commerce website with lots of categories and brands. I'm trying to achieve one single unique URL for each category / brand page to avoid duplicate content and to get the correct URLs indexed. Currently it looks like this...
Main URL:
http://www.expressgolf.co.uk/shop/clothing/galvin-green
Different versions:
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/
Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed - canonical tag, NOINDEX, or block them in robots?
Secondly, do I just need to 301 the (/) from all URLs to the non (/) URLs?
I'm sure this question has been answered, but I was having trouble coming to a solution for this one site. Cheers, Paul
Technical SEO | paulmalin
-
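For the variant URLs in the question above, a rel=canonical on each variant pointing at the main URL is the usual first step to consolidate them. A hedged sketch using the example category from the question (whether the paginated pages should instead keep self-referencing canonicals depends on whether they list distinct products):

```html
<!-- Placed in the <head> of every variant: trailing slash, /1 ... /4, /all -->
<link rel="canonical" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green" />
```

The trailing-slash duplicates can additionally be 301-redirected to the non-slash form, which resolves them for users as well as crawlers.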
Blocking URLs with specific parameters from Googlebot
Hi, I've discovered that Googlebot is voting on products listed on our website and as a result is creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using Javascript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages.
DON'T want to block: http://www.mysite.com/product.php?productid=1234
WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2
Javascript button code: onclick="javascript: document.voteform.submit();"
Thanks in advance for any advice given. Regards, Asim
Technical SEO | aethereal
-
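Since the unwanted URLs in the question above all start their query string with mode=vote, a prefix rule can block them without touching the plain product pages. A hedged robots.txt sketch (the user-agent choice and the parameter order are assumptions taken from the example URLs):

```text
User-agent: Googlebot
# Blocks /product.php?mode=vote&productid=...&vote=... by prefix match
Disallow: /product.php?mode=vote
# /product.php?productid=1234 stays crawlable: it never begins with mode=vote
```

If the parameter order can vary, a wildcard rule such as Disallow: /product.php?*mode=vote is the more defensive option, since Googlebot supports * in robots.txt paths.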
Duplicate Content from Google URL Builder
Hello to the SEOmoz community! I am new to SEOmoz, SEO implementation, and the community, and recently set up a campaign on one of the sites I manage. I was surprised at the amount of duplicate content that showed up as errors, and when I looked deeper, the majority of errors were caused by pages on the root domain that I had put through the Google Analytics URL Builder. After this, I went into Webmaster Tools and changed the parameter handling to ignore all of the tags the URL Builder adds to the end of the domain. SEOmoz recently recrawled my site and the errors caused by the URL Builder are still shown as duplicates. Any suggestions on what to do?
Technical SEO | joshuaopinion
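Alongside the parameter handling in Webmaster Tools, a self-referencing canonical tag is the usual belt-and-braces fix for campaign-tagged URLs, and it also covers crawlers that ignore the Webmaster Tools setting. A hedged sketch with an invented page path:

```html
<!-- In the <head> of /landing-page, emitted regardless of query string, so
     /landing-page?utm_source=newsletter&utm_medium=email and the clean URL
     consolidate to one indexed page -->
<link rel="canonical" href="http://www.example.com/landing-page" />
```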