Local ccTLD site not showing up in local SERP
-
I have one website with two ccTLDs: one on .be and the other on .nl. Both are in Dutch, with pretty much the same content but a different ccTLD.
The problem is that the .nl website is the one showing up in the SERPs on google.be, so I'm not seeing any keyword rankings for the .be website. I want to see only the .nl website in the google.nl SERPs and only the .be website in the google.be SERPs.
I set up hreflang tags 2-3 weeks ago, and Search Console confirmed they have been implemented correctly. I've also fetched the site and requested a re-index.
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: Still no improvement in the results, even after all the changes have been implemented. Does anyone have other suggestions?
-
Hi Jacob,
Don't use a canonical across both countries; that will only hurt you. Google will figure out the correct country targeting eventually.
You won't be penalized for duplicate content, but individual pages can be omitted from the search results if Google hasn't figured out the country targeting yet and thinks it is the same content. Be patient.
Another thing you can do is enable people to toggle between the .nl and .be site, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
- "The canonical URL doesn't point to the .nl site or vice versa. It did point to another URL, as we pull data from a different system and use WordPress to generate the user-friendly URL, so the canonical still referenced a different URL. I've changed it so that it is exactly the same as the URL shown in the address bar. I hope it will help in some way."
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the .nl site or vice versa. It did point to another URL, as we pull data from a different system and use WordPress to generate the user-friendly URL, so the canonical still referenced a different URL. I've changed it so that it is exactly the same as the URL shown in the address bar. I hope it will help in some way.
- Geotargeting was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except a few with a high spam score, and I've already requested that those be removed.
I'm also thinking about pointing the canonical URL of both the .nl and .be websites to the .be domain, since the content is the same. My theory is that this is a duplicate-content situation and the .be website is somehow being treated as the duplicate, which is why the .nl website is showing up higher. Would this help? If I do it, would Google show the correct domain in the correct search engine even though both have the same content?
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- your hreflang might point to example.be/xyz and example.nl/xyz, when it should also point to example.be/xyy (see the sketch below)
Did you also check if you used the right format for the hreflang (nl-be)?
As for geotargeting, it is not set by default, so I'd recommend setting it anyway. It can't hurt.
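For reference, a minimal sketch of what the head of a .be page could look like once the canonical and the hreflang annotations reference the same canonical URLs (example.be, example.nl and /xyy/ are placeholders, not your actual URLs):

    <!-- On https://www.example.be/xyy/ (placeholder URL) -->
    <link rel="canonical" href="https://www.example.be/xyy/" />
    <link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyy/" />
    <link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyy/" />
    <!-- Optional fallback for users outside BE/NL -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.be/xyy/" />

The .nl page would carry the mirror image: a self-referencing canonical plus the same hreflang pair, so every URL in the cluster points to the same set of alternates.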
-
Yes, the canonicals may be pointing to the .nl site; good point, Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target the domain to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the log: is there any user-agent detection that could redirect Googlebot to another page? Check that with "Fetch as Google" under the same menu, or change the user agent in Screaming Frog to Googlebot and crawl your site to see whether the response differs from the default SF user agent (see the sketch after this list).
- Yes, you should use one method. If the tags in the head don't work (they should), try the sitemap annotations instead.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
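One quick way to rule out user-agent-based redirects (a sketch; example.be is a placeholder for your .be domain) is to request the same URL with a browser user agent and with Googlebot's user agent and compare the response headers:

    # Response headers with a generic browser user agent
    curl -sI -A "Mozilla/5.0" https://www.example.be/
    # Response headers with Googlebot's user agent; a different status code or an
    # unexpected Location header here would point to UA-based redirection
    curl -sI -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.be/

If both requests come back with the same status code and no surprise redirect, user-agent detection is unlikely to be the issue.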
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgian backlinks (from .be sites), but Antonio has already mentioned this.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether any noindex directive is hurting your .be visibility (regarding only a few of your pages being indexed). The "site:" command is pretty useful for quick checks, but I would always recommend verifying that the URLs in the sitemap.xml are actually being indexed. Wait 1-2 days after submitting your sitemap to see if there's any change.
- I assume you are using WordPress on an Apache server running PHP. In your File Manager (cPanel) or your FTP software, go to the root directory (one level up from public_html); you should have a "logs" folder with a couple of compressed files. Unzip them, open them with Notepad or any text editor, search for Googlebot, and look at its most recent requests (see the command-line sketch after this list).
- Yoast is a good plugin (I use it), but in this case it might be better to deactivate that feature of the plugin and look for another one that can handle hreflang, or do it manually.
- Yes, maybe your .be link ecosystem is pointing to the .nl site. Check it with Open Site Explorer and, if that is the case, ask each site owner to change the linked domain. If not, you should start building those links properly.
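If you have shell access, a quick way to pull Googlebot's recent hits out of those compressed logs is something like this (a sketch; the file pattern and paths are assumptions, adjust them to whatever your host actually uses):

    # Run from the "logs" folder mentioned above; shows the last 50 matched Googlebot requests
    zgrep "Googlebot" *.gz | tail -n 50
    # Spot-check that a hit is genuine: reverse DNS of the requesting IP should end in googlebot.com
    # (replace the IP below with one taken from a matched log line)
    host 66.249.66.1

You want to see Googlebot requesting pages on both the .be and the .nl domain; if one of them is barely crawled, that alone can explain the delay.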
-
Thanks for the reply Antonio.
- Checked the robots.txt and it's not blocking anything. All pages are being indexed as well; when I use site:website.be I do see the results. It's just that the .nl website seems to outrank the .be results.
- Where can I find the log files with Googlebot's requests?
- I'm using the Yoast SEO plugin for the XML sitemaps and there's no indication of the language there. I'll double-check again.
- Concerning the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed, but when I use "site:" I do get the pages.
-
In my experience this should take no more than 2 weeks once the hreflang is checked and set up properly (though it will depend on whether Googlebot crawls both sites frequently). The questions I would ask myself in this case are:
- It sounds basic, but sometimes we forget the basics: are you blocking the site with robots.txt? Noindex tags? Anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any presence of Googlebot in the log files of both sites?
- Assuming you are using tags in the head for hreflang: have you tried enforcing the hreflang implementation via sitemap.xml instead (see the example after this list)? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
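For reference, this is roughly what the hreflang annotations look like when done through the sitemap instead of head tags, following the Google documentation linked above (example.be and example.nl are placeholders for the two domains):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <!-- Sitemap for the .be site; the .nl sitemap would carry the mirror entry for its own URL -->
      <url>
        <loc>https://www.example.be/page/</loc>
        <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/page/"/>
        <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/page/"/>
      </url>
    </urlset>

Each domain hosts its own sitemap listing its own URLs, but every entry repeats the full set of alternates so the annotations stay reciprocal.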