Location-Based Content / Googlebot
-
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page, much the way that Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When a user does a search for relevant keywords, the SERPs send them to the location pages that the bots appear to be coming in from. If we turn off the auto geo-detection, we think Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here?

Also, we appear to have some odd location/destination pages ranking high in the SERPs, i.e. locations that don't appear to correspond to one of Google's data centers. No idea why this might be happening. Suggestions?
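For concreteness, here is a minimal sketch of the redirect pattern described above. The Flask app and MaxMind GeoIP2 lookup are assumptions purely for illustration; the original post doesn't say what stack the site runs on.

```python
# Hypothetical sketch of the auto geo-redirect described in the question.
# Flask and the GeoIP2 city database are assumed purely for illustration.
from flask import Flask, redirect, request
import geoip2.database
import geoip2.errors

app = Flask(__name__)
geo = geoip2.database.Reader("GeoLite2-City.mmdb")  # assumed local GeoIP database

@app.route("/")
def home():
    try:
        city = geo.city(request.remote_addr).city.name
    except geoip2.errors.AddressNotFoundError:
        city = None
    if city:
        # Every visitor, including Googlebot crawling from a San Jose data
        # center, gets bounced to the page for wherever its IP resolves.
        return redirect("/" + city.lower().replace(" ", "-"), code=302)
    return "Generic home page listing all locations"
```

The problem in the question falls straight out of the comment in that route: the crawler only ever sees the page for its own data center's location.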
-
I believe the current approach is largely relevant to users, but do provide an option to change the location manually if a user wants to (it makes for a good user experience).
To get all of your links crawled by search engines, here are a few things you should consider:
- Make sure the XML sitemap contains every link that exists on the website. Including all of the location pages in the sitemap helps Google discover and consider them (see the sitemap sketch after this list).
- Point internal links at all of the location pages. This helps Google index those pages and rank them for relevant terms.
- Social signals matter, so try to build social engagement for all of the location pages; Google tends to crawl pages with strong social signals more often.
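As a rough illustration of the sitemap point, here is a sketch that writes a sitemap entry for every location page. The domain and the list of location slugs are placeholders, not data from the actual site.

```python
# Sketch: build an XML sitemap listing every location page, so Google can
# discover pages it never reaches through the geo-redirect.
# The domain and location slugs below are placeholders.
locations = ["seattle", "washington", "san-jose"]  # hypothetical list

entries = "\n".join(
    f"  <url><loc>https://www.root.com/{slug}</loc></url>" for slug in locations
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n"
    + "</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```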
Overall I think the current approach is sound; just add a manual location switcher for visitors who want it.
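One way to implement that manual switcher, sketched under the same hypothetical Flask setup as the snippet in the question: store the visitor's explicit choice in a cookie and let it take precedence over the IP lookup. The route and cookie names here are illustrative.

```python
# Sketch: a user-chosen location (stored in a cookie) overrides IP detection.
# Route and cookie names are illustrative, not from the original site.
from flask import Flask, make_response, redirect, request

app = Flask(__name__)

@app.route("/set-location/<slug>")
def set_location(slug):
    # Remember the visitor's explicit choice for roughly a year.
    resp = make_response(redirect("/" + slug, code=302))
    resp.set_cookie("preferred_location", slug, max_age=60 * 60 * 24 * 365)
    return resp

@app.route("/")
def home():
    chosen = request.cookies.get("preferred_location")
    if chosen:
        return redirect("/" + chosen, code=302)
    # Otherwise fall back to the IP-based detection sketched earlier.
    return "Location chooser / geo-detected page"
```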
-
Thanks, Jarno
-
David,
Well explained. Excellent post, +1.
Jarno
-
Hi,
In regards to the geo-targeting, have a read of this case study. To me it's the definitive guide to the issue as it goes through most of the options available, and offers a pretty solid solution:
http://www.seomoz.org/ugc/territory-sensitive-international-seo-a-case-study
And if you are worrying about the white/black aspects of using these tactics, here is a great guide from Rand on acceptable cloaking techniques:
http://www.seomoz.org/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
And finally a great 'Geo-targeting FAQ' piece from Tom Critchlow:
http://www.seomoz.org/blog/geolocation-international-seo-faq
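The common thread in those articles, applied to this question: stop force-redirecting crawlers, and let them reach a crawlable index of every location page. Below is a minimal sketch of exempting known crawler user-agents from the geo-redirect; the user-agent token list is illustrative and deliberately incomplete, and whether this counts as acceptable cloaking is exactly what Rand's article above walks through.

```python
# Sketch: skip the geo-redirect for known crawlers so every location page
# stays reachable from the root URL. The token list is illustrative only.
from flask import Flask, redirect, request

app = Flask(__name__)

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # incomplete example list

def is_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

@app.route("/")
def home():
    if is_crawler(request.headers.get("User-Agent", "")):
        # Serve a plain index linking to every location page instead of
        # redirecting, so a crawler from any data center sees all content.
        return "Index page linking to /seattle, /washington, ..."
    # Real users still get the cookie check and IP lookup from the
    # earlier sketches (placeholder below).
    return redirect("/seattle", code=302)
```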
In regards to the other locations ranking that you don't think have been crawled, this is probably down to the number/strength of the links pointing at those sections. Google has stated in various Webmaster videos that a page doesn't necessarily need to be crawled to be indexed (weird, huh?); Google just needs to know it exists.
If there are plenty of links pointing at a page, Google can still consider it an authoritative/relevant result even if it hasn't crawled the page content itself. It can use other signals, such as anchor text, to determine relevancy for a given search term.
Here is an example video from Matt Cutts where he discusses the issue:
http://www.youtube.com/watch?v=KBdEwpRQRD0
Best of luck
David