Local ccTLD site not showing up in local SERPs
-
I have one website on two ccTLDs: one on .be, the other on .nl. Both are in Dutch with pretty much the same content, just a different ccTLD.
The problem is that the .nl website is showing up in my SERPs on google.be, so I'm not seeing any keyword rankings for the .be website. I want to see only the .nl website in google.nl results and only the .be website in google.be results.
I set up hreflang tags 2-3 weeks ago, and Search Console confirmed they've been implemented correctly. I've also fetched the site and requested a re-index.
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: still no improvement in the results, even after implementing all the changes. Does anyone have other suggestions?
-
Hi Jacob,
Don't use a canonical across both countries. Google will figure out the correct country targeting eventually; forcing it this way will only hurt you.
You won't be penalized for duplicate content, but you can be omitted from search results (per page) if Google hasn't figured out the country targeting yet. It might think it's the same content, but be patient.
Another thing you can do is let people toggle between the .nl and .be sites, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
- The canonical URL doesn't point to the .nl site or vice versa. It did have a different URL, as we're getting data from a different system and using WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've changed it to match exactly the URL shown in the address bar. I hope that helps in some way.
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the .nl site or vice versa. It did have a different URL, as we're getting data from a different system and using WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've changed it to match exactly the URL shown in the address bar. I hope that helps in some way.
- Geotargeting was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except the one with a high spam score. I've already requested its removal.
I'm also thinking about pointing the canonical URLs of both the .nl and .be websites to the .be domain, since the content is the same. My theory now is that this is a duplicate-content case: perhaps the .be website is somehow being treated as the duplicate, which is why the .nl website ranks higher than the .be website. Would this help? That is, if I do this, would Google show the correct domain in the correct engine despite both having the same content?
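For reference, the usual safe setup is a self-referencing canonical on each country's page, rather than cross-domain canonicals. A sketch with made-up URLs (your actual paths will differ):

```html
<!-- In the <head> of https://www.example.be/page/ (hypothetical URL) -->
<link rel="canonical" href="https://www.example.be/page/" />

<!-- In the <head> of https://www.example.nl/page/ (hypothetical URL) -->
<link rel="canonical" href="https://www.example.nl/page/" />
```

The key point is that each page's canonical matches the URL the visitor actually sees in the address bar.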
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- your hreflang might point to example.be/xyz and example.nl/xyz, when these should also point to example.be/xyy
Did you also check that you used the right format for the hreflang (nl-be)?
And geotargeting is not set by default, so I'd recommend setting it anyway. It can't hurt.
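To make the point concrete, a consistent `<head>` would pair each page's canonical with hreflang tags that point at the canonical URLs, never at a URL the canonical has ruled out. A sketch with hypothetical URLs:

```html
<!-- In the <head> of https://www.example.be/xyz/ (hypothetical URLs) -->
<link rel="canonical" href="https://www.example.be/xyz/" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyz/" />
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyz/" />
```

The .nl page would carry the mirror image: a canonical to its own URL plus the same two hreflang tags, so the annotations are reciprocal.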
-
Yes, the canonicals may be pointing to the .nl site; good point, Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target the domain to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the logs too: is there any user-agent detector that could be redirecting Googlebot to another page? Test that with "Fetch as Google" under the same menu, or change the user agent in Screaming Frog and re-crawl your site to see whether there's any difference between the default SF user agent and Googlebot.
- Yes, you should use only one method; if the tag in the <head> doesn't work (it should), try the sitemap annotations instead.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgian backlinks (from .be sites), though Antonio already mentioned this.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether any noindex directive is hurting your .be visibility (since only a few of your pages are being indexed). The "site:" command is useful on the fly, but I'd always recommend verifying that the URLs in your sitemap.xml are actually being indexed. Wait 1-2 days after submitting your sitemap to see if anything changes.
- I assume you're running WordPress on an Apache server with PHP. In your File Manager (cPanel) or FTP client, go to the root directory (one level up from public_html); you should see a "logs" folder with a couple of compressed files. Unzip them, open them in Notepad or any text editor, search for "Googlebot", and look at the most recent requests from Googlebot.
- Yoast is a good plugin (I use it myself), but in this case it may be better to deactivate that feature of the plugin and find another that can handle hreflang, or do it manually.
- Yes, maybe your .be link ecosystem is pointing to the .nl site. Check it with Open Site Explorer and, if that's the case, ask each site owner to update the domain. If not, you should start building those links properly.
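As a sketch of that log check: once you've unzipped the files, a simple grep pulls out the Googlebot requests. The log entries below are fabricated for illustration; on a real server you'd point grep at the existing access log in the "logs" folder.

```shell
# Create a tiny sample access log (fabricated entries) just to illustrate;
# on a real server, grep the existing log file instead.
cat > access.log <<'EOF'
66.249.66.1 - - [07/May/2016:10:01:02 +0000] "GET /page HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [07/May/2016:10:02:03 +0000] "GET / HTTP/1.1" 200 2345 "-" "Mozilla/5.0"
EOF

# Keep only Googlebot requests; the most recent ones are at the bottom.
grep "Googlebot" access.log
```

If both sites are healthy, you should see recent Googlebot hits in the logs of the .be and the .nl domain alike.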
-
Thanks for the reply Antonio.
- I checked robots.txt and it's not blocking anything, and all pages are being indexed. When I search site:website.be I do see the results; it's just that the .nl website seems to overtake the .be results.
- Where can I find the log files from Googlebot?
- I'm using the Yoast SEO plugin for the XML sitemaps and there's no indication of the language there. I'll double-check.
- Regarding the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed, but when I use "site:" I do get the pages.
-
In my experience this should take no more than two weeks once the hreflang is set up properly (though it depends on how frequently Googlebot crawls both sites). The questions I'd ask myself in this case:
- It's basic, but sometimes we forget the basics: are you blocking the site with robots.txt? Noindex tags? Anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any presence of Googlebot in the log files of both sites?
- Assuming you're using tags in the <head> for hreflang: have you tried forcing the hreflang implementation via sitemap.xml? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
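Per that Google help page, the sitemap-based alternative annotates each URL with its language variants. A rough sketch with placeholder URLs (each variant must list all alternates, including itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.be/xyz/</loc>
    <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyz/" />
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyz/" />
  </url>
  <url>
    <loc>https://www.example.nl/xyz/</loc>
    <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyz/" />
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyz/" />
  </url>
</urlset>
```

Use either the sitemap annotations or the `<head>` tags, not both, so the two methods can't fall out of sync.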