Location Based Content / Googlebot
-
Our website has local content specialized to specific cities and states. The URL structure for this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page, much the way Yelp does. Unfortunately, what appears to be happening is that Google comes in to our site from one of its data centers, such as San Jose, and is routed to the San Jose page. When users search for relevant keywords, the SERPs send them to the location pages matching wherever the bots appear to be coming in from. If we turn off the auto geo-detection, we think Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also, we appear to have some odd location/destination pages ranking high in the SERPs; in other words, locations that don't appear to correspond to any of Google's data centers. No idea why this might be happening. Suggestions?
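To make the setup concrete, here is a minimal sketch of the kind of auto-geo redirect described above (Flask, with the GeoIP lookup faked by a static table; all names here are hypothetical). It shows why a crawler whose data-center IP resolves to San Jose lands on the San Jose page:

```python
# Minimal sketch of the IP-based auto-redirect described above.
# The GeoIP lookup is faked with a static table; a real site would use
# a GeoIP database. All names here are hypothetical.
from flask import Flask, redirect, request

app = Flask(__name__)

FAKE_GEOIP = {"66.249.66.1": "san-jose"}  # a Googlebot-style data-center IP

def geolocate_ip(ip):
    """Return a location slug for this IP, or None if unknown."""
    return FAKE_GEOIP.get(ip)

@app.route("/")
def home():
    slug = geolocate_ip(request.remote_addr)
    if slug:
        # Every visitor is redirected by IP -- including Googlebot,
        # whose crawl IPs resolve to data-center cities like San Jose.
        return redirect(f"/{slug}", code=302)
    return "Generic home page"
```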
-
I believe the current approach is pretty well suited to the user, but do provide an option to change the location manually if a visitor wants to (it makes for a good user experience).
To get all your links crawled by search engines, here are a few things you should consider:
- Make sure the XML sitemap contains every link that appears on the website. Including all the location URLs in the sitemap helps Google discover and consider those pages (see the sketch after this list).
- Point internal links at all location pages. This helps Google index those pages and rank them for relevant terms.
- Social signals are important too: try to build social engagement for all location pages, as Google tends to crawl pages with good social value more often.
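As a rough illustration of the sitemap point above, here is a minimal sketch that generates an XML sitemap covering every location page. The list of slugs and the base URL are hypothetical placeholders for however the site actually stores its locations:

```python
# Minimal sketch: build an XML sitemap listing every location page.
# LOCATIONS and the base URL are hypothetical placeholders.
from xml.sax.saxutils import escape

LOCATIONS = ["seattle", "washington", "san-jose"]

def build_sitemap(base_url, slugs):
    entries = "\n".join(
        f"  <url><loc>{escape(base_url + '/' + slug)}</loc></url>"
        for slug in slugs
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

if __name__ == "__main__":
    print(build_sitemap("https://www.root.com", LOCATIONS))
```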
In short, the current approach is sound; just add a manual change-location option if a visitor wants it, along the lines of the sketch below.
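Here is a minimal sketch of that manual override, assuming a Flask app and hypothetical route and cookie names: a visitor's explicit choice is stored in a cookie and always takes precedence over IP detection.

```python
# Minimal sketch: a manual location switcher whose cookie beats IP detection.
# Route names, cookie name, and the IP-detection fallback are hypothetical.
from flask import Flask, make_response, redirect, request

app = Flask(__name__)

@app.route("/set-location/<slug>")
def set_location(slug):
    # The visitor picked a location manually; remember it for a year.
    resp = make_response(redirect(f"/{slug}", code=302))
    resp.set_cookie("preferred_location", slug, max_age=60 * 60 * 24 * 365)
    return resp

@app.route("/")
def home():
    chosen = request.cookies.get("preferred_location")
    if chosen:
        # An explicit choice always wins over geo-detection.
        return redirect(f"/{chosen}", code=302)
    # ...otherwise fall back to IP-based detection, as sketched earlier.
    return "Generic home page with a location picker"
```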
-
Thanks, Jarno
-
David,
Well explained. Excellent post, +1
Jarno
-
Hi,
Regarding the geo-targeting, have a read of this case study. To me it's the definitive guide to the issue, as it goes through most of the options available and offers a pretty solid solution:
http://www.seomoz.org/ugc/territory-sensitive-international-seo-a-case-study
And if you are worried about the white-hat/black-hat aspects of using these tactics, here is a great guide from Rand on acceptable cloaking techniques:
http://www.seomoz.org/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
And finally, a great 'Geo-targeting FAQ' piece from Tom Critchlow:
http://www.seomoz.org/blog/geolocation-international-seo-faq
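As a rough illustration of the crawler-aware handling those posts discuss, here is a sketch that serves known crawlers the generic page instead of geo-redirecting them. This is an assumption-laden sketch, not a drop-in fix: a user-agent check alone can be spoofed, so a production setup should verify crawlers via reverse DNS, as Google recommends.

```python
# Sketch: skip the geo-redirect for known crawlers so they can reach
# every location page through normal links. A user-agent match alone is
# spoofable; verify crawlers with reverse DNS in production.
from flask import Flask, redirect, request

app = Flask(__name__)

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # common UA substrings

def is_known_crawler(user_agent):
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

@app.route("/")
def home():
    if is_known_crawler(request.headers.get("User-Agent")):
        # Crawlers get the neutral page and can follow links to all
        # location pages, instead of being redirected by data-center IP.
        return "Generic home page linking to every location page"
    # Regular visitors continue through the IP/cookie logic sketched above.
    return redirect("/your-detected-location", code=302)  # placeholder
```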
Regarding the other locations ranking that you don't think have been crawled, this is probably down to the number and strength of the links pointing at those sections. Google has stated in various Webmaster videos that a page doesn't necessarily need to be crawled to be indexed (weird, huh?); Google just needs to know it exists.
If there are plenty of links pointing at a page, Google can still treat it as an authoritative/relevant result even if it hasn't crawled the page content itself. It can use other signals, such as anchor text, to determine the relevancy for a given search term.
Here is an example video from Matt Cutts where he discusses the issue:
http://www.youtube.com/watch?v=KBdEwpRQRD0
Best of luck
David