Local ccTLD site not showing up in local SERPs
-
I have one website on two ccTLDs: one on .be, the other on .nl. Both are in Dutch with pretty much the same content, just a different ccTLD.
The problem is that the .nl website is showing up in the SERPs on google.be, so I'm not seeing any keyword rankings for the .be website. I want the .nl site to appear only on google.nl and the .be site only on google.be.
I set up hreflang tags 2-3 weeks ago, and Search Console confirmed they've been implemented correctly. I've also fetched the site and requested a re-index.
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: still no improvement in the results, even after all the changes have been implemented. Does anyone have other suggestions?
-
Hi Jacob,
Don't use the canonical across both countries; doing that will only hurt you. Google will figure out the correct country targeting eventually.
You won't be penalized for duplicate content, but individual pages can be omitted from the search results while Google hasn't figured out the country targeting yet and still thinks it's the same content. Be patient.
Another thing you can do is enable people to toggle between the .nl and .be site, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
- The canonical URL doesn't point to the .nl site or vice versa. It did point to a different URL, though: we pull data from another system and use WordPress to generate the user-friendly URL, so the canonical didn't match the URL shown in the browser. I've now changed it to match the visible URL exactly. I hope it helps in some way.
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the .nl site or vice versa. It did point to a different URL, though: we pull data from another system and use WordPress to generate the user-friendly URL, so the canonical didn't match the URL shown in the browser. I've now changed it to match the visible URL exactly. I hope it helps in some way.
- Geotargeting was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except the ones with a high spam score; I've already requested their removal.
I'm also thinking about pointing the canonical URLs of both the .nl and .be websites to the .be domain, since the content is the same. My theory is that this is a case of duplicate content and the .be website is somehow being treated as the duplicate, which is why the .nl website shows up higher than the .be one. Would this help? I mean, if I do this, would Google show the correct domain in the correct engine despite both having the same content?
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- hreflang tags pointing to example.be/xyz and example.nl/xyz, while the .be entry should also be example.be/xyy (see the sketch just below)
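To make that concrete, a minimal sketch of how the tags should line up in the head of example.be/xyz, keeping my hypothetical URLs (the .nl path is just a placeholder):

    <!-- canonical points at the preferred .be URL -->
    <link rel="canonical" href="https://example.be/xyy" />
    <!-- hreflang entries reference the canonical URLs, not the duplicate -->
    <link rel="alternate" hreflang="nl-be" href="https://example.be/xyy" />
    <link rel="alternate" hreflang="nl-nl" href="https://example.nl/xyy" />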
Did you also check if you used the right format for the hreflang (nl-be)?
And for geotargeting: it is not set by default, so I'd recommend setting it anyway. It can't hurt.
-
Yes, the canonicals may be pointing to the .nl site; good point, Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target the domain to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the logs too. Is there any user-agent detection that could be redirecting Googlebot to another page? Check that with "Fetch as Google" under the same menu, or change the user agent in Screaming Frog to Googlebot, crawl your site, and see if there's a difference from a crawl with the default SF user agent (see the curl sketch after this list).
- Yes, you should use just one method. If the tags under head don't work (they should), try the sitemap annotations.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
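A quick way to test for user-agent detection from the command line is a rough sketch with curl (swap in your real URL):

    # HEAD request with a generic browser user agent
    curl -sI -A "Mozilla/5.0" https://www.example.be/

    # Same request pretending to be Googlebot; a different status code or
    # Location header suggests user-agent-based redirection
    curl -sI -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.be/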
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgian backlinks (from .be sites) - but Antonio already mentioned this.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether there's any noindex directive hurting your .be visibility (regarding only a few of your pages being indexed). The "site:" command is pretty useful for on-the-fly checks, but I'd always recommend verifying that the URLs in the sitemap.xml are actually being indexed. Wait 1-2 days after submitting your sitemap to see if there's any change.
- I assume you are using WordPress on an Apache server running PHP. In your File Manager (cPanel) or your FTP software, go to the root directory (one level up from public_html); you should have a "logs" folder with a couple of compressed files. Unzip them, open them with Notepad or any text editor, and search for Googlebot to see its most recent requests (see the sketch after this list).
- Yoast is a good plugin, I use it myself, but in this case it might be better to deactivate that feature and look for another plugin that can handle hreflang, or do it manually.
- Yes, maybe your .be ecosystem is pointing to the .nl site. Check it with Open Site Explorer, and if that's the case, ask each site owner to switch the link to the .be domain. If not, you should start building those links properly.
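If you have shell (SSH) access, a rough sketch of the same log check from the command line (the log path is an assumption, cPanel setups vary):

    # zgrep searches the compressed logs without unzipping them first
    zgrep -h "Googlebot" ~/logs/*.gz | tail -20

    # spot-check that a hit really comes from Google via reverse DNS
    host 66.249.66.1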
-
Thanks for the reply, Antonio.
- Checked the robots.txt and it's not blocking anything. All pages are being indexed as well; when I use site:website.be I do see the results. It's just that the .nl website seems to outrank the .be results.
- Where could I find the log files from Googlebot?
- I'm using the Yoast SEO plugin for the XML sitemaps and there's no indication of the language there. I'll double-check again.
- Concerning the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed. But when I use "site:" I do get the pages.
-
In my experience this should take no more than 2 weeks once the hreflang is set up properly (though it depends on whether Googlebot crawls both sites frequently). The questions I would ask myself in this case are:
- It sounds dumb, but sometimes we forget the basics: are you blocking the site with robots.txt? Noindex tags? Anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any presence of Googlebot in the log files of both sites?
- Assuming you are using tags in the head for hreflang: have you tried forcing the hreflang implementation with the sitemap.xml (see the sketch after this list)? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
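For reference, a minimal sketch of those sitemap annotations based on the Google doc above (example.be and example.nl stand in for the real URLs):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.be/page</loc>
        <!-- every language version, including this one, gets an entry -->
        <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/page"/>
        <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/page"/>
      </url>
      <url>
        <loc>https://www.example.nl/page</loc>
        <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/page"/>
        <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/page"/>
      </url>
    </urlset>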