Geo content and where Googlebot crawls from.
-
Does anyone have experience with geo-specific content on their homepage and how the location of the Googlebot impacts rank and/or traffic?
I ask because, looking in Search Console today, I noticed the thumbnail image of our site is different than usual and was pulling in a specific geo-location. I wondered whether there is any value or concern in how Google sees our site from different locations, and whether it could impact SERPs.
-
Google is location-agnostic, though it will act as if it cares about location depending on where the search is performed. If it pulled the wrong thumbnail for you, it got there via a link (internal or external) and decided that was an appropriate result. What you do next depends on your goal (changing the thumbnail, for example).
It's good that you appear with a geo-targeted piece of content; it means you're responding to local searches. Google will show different SERP results for every person and location, so there isn't much value or concern in how they see your site. They see it from "all" locations.
Related Questions
-
What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
Hey guys! I work as a content creator for Zavza Seal, a contractor out of New York, and we're targeting 36+ cities in the Brooklyn and Queens areas with several home improvement services. We got about 340 pages into our multi-location strategy, targeting each of our cities with each service we offer, when we noticed that 200+ of our pages were "Crawled but not indexed" in Google Search Console. Here's what I think we may have done wrong. Let me know what you think...
1. We used the same page template for all pages. (We changed the content and sections, formatting, targeted keywords, and entire page strategy for areas with unique problems, trying to keep the user experience as unique as possible to avoid duplicate content or looking like we didn't care about our visitors.)
2. We used the same featured image for all pages. (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher.)
3. We didn't use rel canonicals to tell search engines that these pages were made specifically for the areas.
4. We didn't use alt tags until about halfway through.
5. A lot of the URLs don't use the target keyword exactly.
6. The NAP info and Google Maps embed are in the footer, so we didn't include them on the pages.
7. We didn't use any content about the history of the city or anything like that. (Some pages did use content about historic buildings, low water tables, flood-prone areas, etc. if the area was known for that.)
We were thinking of redoing the pages, starting from scratch and building unique experiences around each city with testimonials, case studies, and content about problems that are common for property owners in the area, but I think they may be fixable with a rel canonical, city-specific content added, and a unique featured image on each page. What do you think is causing the problem? What would be the easiest way to fix it? I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear that duplicate content would creep in, because you can only say so much about, for example, "basement crack repair". Please let me know your thoughts. Here is one of the pages that is indexed, as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ I appreciate your time and concern. Have a great weekend!
Local SEO | | everysecond0 -
Category pages are treated as duplicate content - is that a problem?
Hi there, I have been analyzing a webshop where we sell products for pets, gardening, and the like. I am getting a lot of "Duplicate Content" alerts from Moz when doing a site crawl, and I am told that the pages for e.g. cat products and gardening tools show duplicate content. Those two pages contain no identical products, so I am guessing that it is just the "set up" of the page (they look almost identical, except for the products). My question is: is this really a problem? Does it affect my ranking in a negative way, and if so, how can I counter it? Best regards Frederik
Local SEO | | fhertzp0 -
Searchmetrics Google ranking factors study says content gaining while links losing in importance? Any views on this post?
I am very curious about this. Can anyone please share their thoughts on it? http://searchengineland.com/searchmetrics-google-ranking-factors-study-says-content-gaining-links-losing-importance-265431
Local SEO | | MTPixels0 -
How can I personalize content based on a state/region? Is it possible?
I'm getting a lot of traffic from different regions throughout the US. I need to personalize the content on my website, or on a certain landing page, based on the user's state/region. Is it possible? For example, forwarding a user who lands on page "x" to page "y" if he's from California and to page "z" if he's from South Carolina. And of course, can this somehow affect my rankings in Google? Thanks!!
Local SEO | | OrendaLtd0 -
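For illustration, here is a minimal sketch of one common way to handle the state-based personalization asked about above: a server-side lookup of the visitor's region followed by a redirect. It assumes a Flask app, MaxMind's geoip2 library with a locally downloaded GeoLite2 database, and a hypothetical mapping of state codes to landing pages; none of these specifics come from the original question.

```python
# Minimal sketch (not a recommendation): redirect visitors landing on page "x"
# to a state-specific page based on a GeoIP lookup of their IP address.
import geoip2.database
import geoip2.errors
from flask import Flask, redirect, request

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # hypothetical local path

# Hypothetical mapping of US state codes to state-specific landing pages.
STATE_PAGES = {
    "CA": "/landing/california",
    "SC": "/landing/south-carolina",
}

@app.route("/landing/x")
def landing_x():
    try:
        # request.remote_addr may be a proxy's IP behind a load balancer;
        # in production you would read the forwarded client IP instead.
        record = reader.city(request.remote_addr)
        state = record.subdivisions.most_specific.iso_code
    except (geoip2.errors.AddressNotFoundError, ValueError):
        state = None
    # Unknown locations fall back to a generic default page.
    return redirect(STATE_PAGES.get(state, "/landing/default"), code=302)
```

One caveat that ties back to the ranking question: a hard redirect like this means crawlers, which typically fetch from US IP addresses tied to one location, may only ever see a single regional version, so serving a sensible, indexable default to unknown visitors is the usual mitigation.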
Attacking Doorway/Thin Content pages?
What's the best way to approach fixing thin "city + services" pages? Would you recommend doing one page at a time, or doing a little on a bunch of pages at a time? For example, rewriting one page with 1,000 words of unique content, adding city-specific images/videos of services rendered and local testimonials over the course of a week, then moving to the next page the following week? Or, one week adding city-specific images/videos to all the pages you can, then the next week adding something else to all the pages? I'm trying to figure out the best way to scale this, and also which approach Google/search engines would look more kindly on. Thanks, Ruben
Local SEO | | KempRugeLawGroup0 -
Building Great Content
When writing content: let's say I write fantastic, useful content that most home buyers (since I'm a realtor) would benefit from, but they don't have a website, so they aren't going to link back to me anywhere. What's the best way to get your content seen? Do you recommend putting it on Facebook and promoting it? It's just tough in my business because it's such a commodity, but I know there has to be a way. I'm just trying to find the best approach before I spend TONS and TONS of time writing genuinely useful, great content. Up to now it's been a risk vs. reward thing and I haven't done it, but I feel like now is the time. Thanks!
Local SEO | | Veebs0 -
Is it necessary to implement hreflang for translated content on different ccTLDs?
Hello there, new to Moz here. I hope some of the international SEO Mozzers can share their opinion on a doubt I have. I've been reading a lot about hreflang and I understand its importance for subdomains and subfolders, not only for targeting the same language in different countries (.com, .co.uk, .ca, etc.) but also for websites partially or fully translated into other languages. However, for the latter I've only ever seen examples where the hreflang points at subdomains or folders, e.g. ru.example.com or example.com/ru. What if I have my translated websites on different ccTLDs, i.e. example.com, example.ru, example.br, example.fr? Do I still need to implement hreflang, or is it not necessary in this case?
Local SEO | | selectitaly0 -
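For reference, hreflang annotations work the same way whether the alternates live on subfolders, subdomains, or separate ccTLDs: every version lists every alternate, including itself. Below is a small sketch that generates the reciprocal link tags; the domains, language codes, and the choice of x-default are hypothetical stand-ins, not details taken from the question.

```python
# Minimal sketch: generate reciprocal hreflang <link> tags for a set of
# ccTLD homepages. Domains and language/region codes are hypothetical.
ALTERNATES = {
    "en": "https://example.com/",
    "ru": "https://example.ru/",
    "pt-br": "https://example.br/",
    "fr": "https://example.fr/",
}

def hreflang_tags(alternates):
    """Build the <link rel="alternate"> tags that every version should include."""
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in alternates.items()
    ]
    # x-default marks the fallback URL for users who match none of the locales.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{alternates["en"]}" />'
    )
    return "\n".join(tags)

if __name__ == "__main__":
    # The identical block goes into the <head> of every ccTLD version so the
    # annotations are reciprocal; one-sided hreflang references are ignored.
    print(hreflang_tags(ALTERNATES))
```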
Content Across International Websites
I am wondering if anyone could clear up some questions I have regarding international SEO and how to treat the content placed on these sites. I have recently launched several websites for a product internationally, each with the correct country domain name, and I have also followed the guidelines provided by Webmaster Tools on internationalisation. All the websites are targeted towards English-speaking countries, and I have rewritten most of the content on them to suit the English style of the targeted country. That being said, I am finding mixed information on how to treat large chunks of potentially duplicate content. For example, my main .com website, which has been running several years (and is targeted to the UK), has a lot of well-written articles on it which are popular with visitors. I need to find out whether duplicating these articles onto the international versions of the websites, without rewriting them, would have a detrimental effect on SEO across all the sites. I have done a site search for each domain name to see if they are cropping up in other local Google versions (e.g. the .ca site in Google.com.au, etc.) and they are not. Does this mean Google localises its results when handling duplicate content, or is it treated at the root level? Any information to point me in the right direction would be a big help.
Local SEO | | Rj-Media0