Serving different content based on IP location
-
I have a city-centric website. For the sake of simplicity, say I only have two cities: City A and City B.
Depending on a user's IP address, they will be served either City A or City B. Users can change their location through JavaScript on the pages, but there is no cross-linking between cities. By this I mean that unless you can execute JavaScript, there is no way to get from City A to City B.
My concern is this: Googlebot comes to my site and we serve it City A. How does City B get discovered if Googlebot doesn't execute JavaScript?
We have an XML sitemap plus plenty of backlinks to City B. Is this sufficient?
Should I provide a static link to City B (and vice versa) on the homepage for crawling purposes?
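To make the setup concrete, here's a rough sketch of the kind of thing I'm describing (Express-style pseudocode; the GeoIP lookup, IP prefix, and file names are placeholders, not our actual code):

```typescript
import express from "express";

const app = express();

// Placeholder for an IP-to-city lookup -- any GeoIP service would slot in here.
function cityFromIp(ip: string): "A" | "B" {
  return ip.startsWith("203.") ? "B" : "A"; // purely illustrative
}

app.get("/", (req, res) => {
  const city = cityFromIp(req.ip ?? "");
  // The server renders only one city's content per request. The link to the
  // other city is injected client-side by switcher.js, so a crawler that does
  // not execute JavaScript never sees a path from City A to City B.
  res.send(`
    <h1>Welcome to City ${city}</h1>
    <div id="city-switcher"></div>
    <script src="/switcher.js"></script>
  `);
});

app.listen(3000);
```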
-
Adding to Daniel's comment, I'd say the key phrase is "...through our faceted search." It's important to have both the XML sitemap entries and a crawl path. An XML sitemap may be enough to get the pages indexed, but those pages won't inherit any internal link juice; that only comes through your internal links. Somewhere, there needs to be a link Google can crawl to reach the other city.
The direct backlinks will help, and should get you indexed and possibly ranking, but you're still losing the authority from the domain as a whole that you'd inherit via internal links. The upshot is that you'll lose ranking power.
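To make that concrete, a crawlable path plus sitemap entries could look something like the sketch below (domain and URL slugs are invented for illustration): plain `<a href>` links rendered into every page's HTML, plus a matching entry in the XML sitemap for each city landing page.

```typescript
const SITE = "https://www.example.com"; // assumed domain
const cities = ["city-a", "city-b"];    // assumed URL slugs

// 1. A crawlable footer link to every other city, rendered into the HTML itself
//    (no JavaScript required to follow it).
function cityFooterLinks(currentCity: string): string {
  return cities
    .filter((c) => c !== currentCity)
    .map((c) => `<a href="${SITE}/${c}/">Browse ${c}</a>`)
    .join(" | ");
}

// 2. Matching XML sitemap entries for each city's landing page.
function citySitemap(): string {
  const urls = cities
    .map((c) => `  <url><loc>${SITE}/${c}/</loc></url>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`
  );
}
```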
-
I do the exact same thing (local business pages based on visitor IP), but users can also change their location based on the search terms they enter.
What we also do is allow anyone to browse any state/city results through our faceted search, and we have XML sitemap entries for each state/category landing page, which then link down to city-level searches.
We have seen no problem with Google indexing our site (currently almost 500,000 pages indexed).
As long as you don't actively hide content that doesn't pertain to the requesting visitor's IP, and you provide some way for Google to find it, you should be OK.
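Roughly, the pattern looks like the sketch below (all names and URLs invented): the IP-detected city is only the default view, while the same plain-HTML browse path (state down to city) is rendered for every visitor, Googlebot included, so nothing is cloaked and every city page can be reached by following ordinary links.

```typescript
interface Listing {
  state: string;
  city: string;
}

// State-level browse links; each state landing page would in turn link down to
// its city-level result pages, so every city is reachable through plain links.
function browseLinks(listings: Listing[]): string {
  const states = Array.from(new Set(listings.map((l) => l.state)));
  return states
    .map((s) => `<a href="/browse/${encodeURIComponent(s)}/">${s}</a>`)
    .join("\n");
}

// The same HTML goes to every requester: the IP-detected city only decides
// which results are shown first, not which links exist on the page.
function renderPage(geoCity: string, listings: Listing[]): string {
  return `
    <section id="local-results">Results for ${geoCity} shown by default</section>
    <nav id="browse-all">
      ${browseLinks(listings)}
    </nav>
  `;
}
```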