Local ccTLD site not showing up in local SERPs
-
I have one website with two ccTLDs: one on .be and the other on .nl. Both are in Dutch and have pretty much the same content, just a different ccTLD.
The problem I have is that the .nl website is showing up in my SERPs on google.be, so I'm not seeing any keyword rankings for the .be website. I want to see only the .nl website in google.nl results and only the .be website in google.be results.
I set up hreflang tags 2-3 weeks ago, and Search Console confirmed they've been implemented correctly. I've also fetched the site and requested a re-index of the website.
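For reference, the tags in the <head> of both sites look roughly like this (example.nl and example.be stand in for my real domains):

    <!-- Identical pair on https://www.example.nl/some-page/ and https://www.example.be/some-page/ -->
    <link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/some-page/" />
    <link rel="alternate" hreflang="nl-be" href="https://www.example.be/some-page/" />

Every page carries the same set of alternates, including a self-reference.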
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: Still no improvement in the results, even after all the changes have been implemented. Anyone with other suggestions, perhaps?
-
Hi Jacob,
Don't point the canonical across the two countries. Google will figure out the correct country targeting eventually; a cross-country canonical will only hurt you.
You won't be penalized for duplicate content, but individual pages can be omitted from the search results while Google hasn't worked out the country targeting yet. It may think the two sites are the same content, so be patient.
Another thing you can do is let people toggle between the .nl and .be sites, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
- The canonical URL doesn't point to the .nl site or vice versa. It did have a different URL, as we're getting data from another system and using WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've changed it to match exactly the URL shown in the address bar. I hope it helps in some way.
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the .nl site or vice versa. It did have a different URL, as we're getting data from another system and using WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've changed it to match exactly the URL shown in the address bar. I hope it helps in some way.
- Geotargeting was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except the one with a high spam score; I've already requested its removal.
I'm also thinking about pointing the canonical URL of both the .nl and .be websites to the .be domain, since the content is the same. My thinking is that this is a case of duplicate content, and perhaps the .be website is somehow being penalized as the duplicate, which is why the .nl website shows up higher than the .be website. Would this help? I mean, if I do this, would Google show the correct domain in the correct engine even though both have the same content?
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- hreflang tags pointing to example.be/xyz and example.nl/xyz - these should also point to the canonical version, example.be/xyy
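So on example.be/xyz the head would need to look something like this (keeping my placeholder URLs):

    <link rel="canonical" href="https://www.example.be/xyy" />
    <!-- hreflang URLs match the canonical xyy, not the duplicate xyz -->
    <link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyy" />
    <link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyy" />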
Did you also check if you used the right format for the hreflang (nl-be)?
And geotargeting is not set by default, so I'd recommend setting it anyway. It can't hurt.
-
Yes, the canonicals may be pointing to the .nl site - good point, Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target the domain to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the logs: is there any user-agent detection that could redirect Googlebot to another page? Check that using "Fetch as Google" under the same menu, or change the user agent in Screaming Frog and crawl your site to see whether there's a difference between the default SF user agent and Googlebot.
- Yes, you should use one method. If the tag in the <head> doesn't work (it should), try the sitemap annotations.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgium backlinks (from .be sites) - but this has been mentioned by Antonio.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether any noindex directive is hurting your .be visibility (since only a few of your pages are being indexed). The "site:" command is pretty useful on the fly, but I'd always recommend checking whether the URLs in the sitemap.xml are actually indexed. Wait 1-2 days after submitting your sitemap to see if anything changes.
- I assume you're using WordPress on an Apache server running PHP. In your File Manager (cPanel) or your FTP software, go to the root directory (one level up from public_html); you should have a "logs" folder with a couple of compressed files. Unzip them, open them with Notepad or any text editor, and search for Googlebot to see its most recent requests (see the command sketch at the end of this list).
- Yoast is a good plugin - I use it - but in this case it may be better to deactivate that feature of the plugin and either find another plugin that can handle hreflang or do it manually.
- Yes, maybe your .be link ecosystem is pointing to the .nl site. Check it with Open Site Explorer and, if that's the case, ask each site owner to change the linked domain. If not, you should start building those links properly.
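If you have SSH or terminal access, a quick way to pull the most recent Googlebot requests (assuming cPanel-style compressed Apache logs in that "logs" folder - adjust the paths to your setup):

    # Most recent Googlebot hits across the compressed access logs
    zgrep "Googlebot" logs/*.gz | tail -n 20
    # Same thing on an already-unzipped log file
    grep "Googlebot" access.log | tail -n 20

Keep in mind anyone can spoof the Googlebot user agent, so this is only a quick sanity check that Google is actually crawling both sites.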
-
Thanks for the reply, Antonio.
- I checked the robots.txt and it's not blocking anything, and all pages are being indexed as well. When I use site:website.be I do see the results; it's just that the .nl website seems to overtake the .be results.
- Where can I find the Googlebot log files?
- I'm using the Yoast SEO plugin for the XML sitemaps, and there's no indication of the language there. I'll double-check again.
- Concerning the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed, but when I use "site:" I do get the pages.
-
In my experience this should take no more than two weeks once the hreflang is set up properly (though it depends on whether Googlebot crawls both sites frequently). The questions I would ask myself in this case are:
- It's pretty basic, but sometimes we forget the basics: are you blocking the site with robots.txt? noindex tags? Anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any Googlebot activity in the log files of both sites?
- Assuming you're using tags in the <head> for hreflang: have you tried forcing the hreflang implementation with the sitemap.xml (see the sketch below this list)? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
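The sitemap version looks roughly like this (placeholder URLs; every <url> entry must list all alternates, including itself, and the same block has to appear in the sitemap of both sites):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.be/some-page/</loc>
        <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/some-page/" />
        <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/some-page/" />
      </url>
      <!-- repeat with <loc> set to the .nl URL in the .nl sitemap -->
    </urlset>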