Local ccTLD site not showing up in local SERPs
-
I have one website on two ccTLDs: one on .be, the other on .nl. Both are in Dutch with pretty much the same content, just a different ccTLD.
The problem is that the .nl site is showing up in the SERPs on google.be, so I'm not seeing any keyword rankings for the .be site. I want to see only the .nl site in the google.nl SERPs and only the .be site on google.be.
I set up the hreflang tags 2-3 weeks ago, and Search Console has confirmed they're implemented correctly. I've also fetched the site and requested a re-index.
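For reference, the tags in each page's head look roughly like this (example.nl and example.be stand in for the real domains):

```html
<!-- On a page of the .nl site (placeholder domains and path): -->
<link rel="alternate" hreflang="nl-nl" href="https://example.nl/page/" />
<link rel="alternate" hreflang="nl-be" href="https://example.be/page/" />

<!-- The .be version of the same page carries the identical pair of tags,
     so each URL references itself plus its alternate. -->
```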
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: Still no improvement in the results, even after all the changes have been implemented. Anyone with other suggestions, perhaps?
-
Hi Jacob,
Don't use the canonical across both countries. Google will figure out the correct country targeting eventually. If you do this, it will only hurt you.
You won't be penalized for duplicate content, but pages can be omitted from search results (per page) while Google hasn't yet figured out the country targeting and thinks it's the same content. Be patient.
Another thing you can do is enable people to toggle between the .nl and .be site, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the .nl site or vice versa. It did point to a different URL, since we pull data from another system and use WordPress to generate the user-friendly URL, so the canonical differed from the URL shown in the browser. I've now changed it to match the displayed URL exactly. I hope that helps in some way.
- The geotargeting config was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except the ones with a high spam score; I've already requested their removal.
I'm also thinking about pointing the canonical URL of both the .nl and .be sites to the .be domain, since the content is the same. My thinking now is that this is a duplicate content situation and the .be site is somehow being treated as the duplicate, which is why the .nl site shows up higher. Would this help? I mean, if I do this, would Google show the correct domain in the correct engine despite both having the same content?
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- your hreflang might point to example.be/xyz and example.nl/xyz - this should also be example.be/xyy
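In other words (sticking with those placeholder paths), a consistent set of tags on example.be/xyz would look like this, with the canonical and both hreflang entries agreeing on the canonical URL:

```html
<!-- On example.be/xyz, where example.be/xyy is the canonical version: -->
<link rel="canonical" href="https://example.be/xyy" />
<link rel="alternate" hreflang="nl-be" href="https://example.be/xyy" />
<link rel="alternate" hreflang="nl-nl" href="https://example.nl/xyy" />
```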
Did you also check that you used the right format for the hreflang (nl-be)?
And geotargeting is not set by default, so I'd recommend setting it anyway. It can't hurt.
-
Yes, the canonicals may be pointing to the .nl site, good point Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target the domain to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the logs too. Is there any user-agent detection that could be redirecting Googlebot to another page? Check that with "Fetch as Google" under the same menu, or change the user agent in Screaming Frog, crawl your site, and see if there's a difference between the default SF user agent and Googlebot.
- Yes, you should use one method. If the tag in the head doesn't work (it should), try the sitemap annotations.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgium backlinks (from .be sites) - but this has been mentioned by Antonio.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether there's any noindex directive hurting your .be visibility (you mention only a few of your pages are being indexed). The "site:" command is pretty useful on the fly, but I'd always recommend checking whether the URLs in the sitemap.xml are actually indexed. Wait 1-2 days after submitting your sitemap to see if anything changes.
- I assume you're using WordPress on an Apache server running PHP. So, in your File Manager (cPanel) or your FTP software, go to the root directory (one level up from public_html); you should have a "logs" folder with a couple of compressed files. Unzip them, open them with Notepad or any text editor, search for Googlebot, and find the most recent request from Googlebot.
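To sketch the search itself (the log path, filename, and format here are assumptions; they vary by host), once you've unpacked a log you can filter it from a shell like this:

```shell
# Hypothetical one-line access log entry, standing in for your real
# unpacked ~/logs files on cPanel (Apache combined log format assumed).
printf '%s\n' '66.249.66.1 - - [10/May/2016:06:25:24 +0000] "GET /xyz HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"' > access_sample.log

# Count the Googlebot hits, then show the most recent one.
grep -c 'Googlebot' access_sample.log
grep 'Googlebot' access_sample.log | tail -n 1
```

For the still-compressed files you can skip the unzip step with something like `zcat ~/logs/*.gz | grep Googlebot`.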
- Yoast is a good plugin, I use it, but in this case it might be better to deactivate that feature of the plugin and look for another that can handle hreflang, or do it manually.
- Yes, maybe your .be link ecosystem is pointing to the .nl site. Check it with Open Site Explorer, and if that's the case, ask each site owner to change the domain they link to. If not, you should start building those links in a proper way.
-
Thanks for the reply Antonio.
- Checked the robots.txt and it's not blocking anything. All pages are being indexed as well. When I use site:website.be I do see the results; it's just that the .nl site seems to overtake the .be results.
- Where can I find the log files from Googlebot?
- I'm using the Yoast SEO plugin for the XML sitemaps and there's no indication of the language there. I'll double-check again.
- Concerning the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed. But when I use "site:" I do get the pages.
-
In my experience this should take no more than two weeks after checking that the hreflang is set up properly (though it depends on how frequently Googlebot crawls both sites). The questions I'd ask myself in this case are:
- It's pretty dumb, but sometimes we forget the basics: are you blocking the site with robots.txt? noindex tags? anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any sign of Googlebot in the log files of both sites?
- Assuming you're using tags in the head for hreflang: have you tried forcing the hreflang implementation with sitemap.xml? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
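For reference, the sitemap annotation described on that Google page looks roughly like this (the domains and the /xyz path are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.be/xyz</loc>
    <xhtml:link rel="alternate" hreflang="nl-be" href="https://example.be/xyz"/>
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://example.nl/xyz"/>
  </url>
  <url>
    <loc>https://example.nl/xyz</loc>
    <xhtml:link rel="alternate" hreflang="nl-be" href="https://example.be/xyz"/>
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://example.nl/xyz"/>
  </url>
</urlset>
```

Note that every URL in the set lists the full group of alternates, including itself.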