Canonicals, Social Signals and a Multi-Regional Website.
-
Hi all,
I have a website that is setup to target different countries by using subfolders. Example /aus/, /us/, /nz/. The homepage itself is just a landing page redirect to whichever country the user belongs to. Example somebody accesses https://domain/ and will be redirected to one of the country specific sub folders. The default subfolder is /us/, so all users will be redirected to it if their country has not been setup on the website. The content is mostly the same on each country site apart from localisation and in some case content specific to that country.
I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've left the /us/ version untargeted to any specific geographical region.
In addition, I've set up hreflang tags on each page of the site, linking to the same content in the other country subfolders. I've targeted /aus/ and /nz/ to en-au and en-nz respectively, and targeted /us/ to en-us and x-default, as per various articles around the web.
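For reference, a minimal sketch (in Python, using the placeholder domain and a sample path) of the hreflang annotations described above: every country version of a page lists all the alternates, with /us/ doubling as the x-default fallback:

```python
# Sketch: generate the hreflang <link> tags one page should carry
# across the three country subfolders. "https://domain" is a placeholder.
BASE = "https://domain"
LOCALES = {"en-au": "/aus", "en-nz": "/nz", "en-us": "/us"}

def hreflang_tags(path):
    """Return the <link> elements every country version of `path` should carry."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}{prefix}{path}" />'
        for lang, prefix in LOCALES.items()
    ]
    # The /us/ version also serves as the fallback for untargeted regions.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{BASE}/us{path}" />')
    return tags

for tag in hreflang_tags("/blog/my-post/"):
    print(tag)
```

Note that each page, in every subfolder, should emit the same full set of tags so that the annotations are reciprocal.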
We generally advertise our links without a country-code prefix, and the system automatically redirects the user to the correct country when they hit that URL. For example, if somebody accesses https://domain/blog/my-post/, a 302 is issued to https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/, etc. The country-less links are advertised on Facebook and in all our marketing campaigns.
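The redirect behaviour described above can be sketched roughly like this (Python, with a made-up country lookup; the real system presumably resolves the visitor's country from their IP):

```python
# Sketch of the country-less URL handling: a request for /blog/my-post/
# is answered with a 302 to the visitor's country subfolder.
SUPPORTED = {"AU": "/aus", "NZ": "/nz", "US": "/us"}
DEFAULT = "/us"  # fallback when the visitor's country isn't set up

def resolve_redirect(path, visitor_country):
    """Return (status_code, location) for a country-less request."""
    prefix = SUPPORTED.get(visitor_country, DEFAULT)
    return 302, f"https://domain{prefix}{path}"

print(resolve_redirect("/blog/my-post/", "AU"))  # sends the visitor to /aus/...
print(resolve_redirect("/blog/my-post/", "FR"))  # unsupported country falls back to /us/
```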
Overall, I feel our website is ranking quite poorly and I'm wondering if poor social signals are a part of it?
We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected this to contribute to our ranking at least somewhat.
I am wondering whether the country-less link we advertise on Facebook causes Googlebot to ignore it as a social signal for the country-specific pages on our website. For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for https://domain/us/blog/my-post/ specifically; however, it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/.
If that is the case, I am wondering how I would fix it so that /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/ receive the appropriate social signals. Would changing the canonical URL of each page to the country-less URL improve my social signals and overall performance in the search engines?
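To make the idea concrete, here is a sketch (placeholder URLs) of what the two canonical options would look like on a country page: the country-less canonical being considered versus a per-country, self-referencing one. Whether pointing the canonical at a URL that itself 302s is advisable is exactly the open question:

```python
BASE = "https://domain"

def canonical_tag(path, country_prefix=None):
    """Build a canonical <link> element. With no prefix, this yields the
    country-less URL being considered; with a prefix, a per-country canonical."""
    prefix = country_prefix or ""
    return f'<link rel="canonical" href="{BASE}{prefix}{path}" />'

print(canonical_tag("/blog/my-post/"))         # country-less option
print(canonical_tag("/blog/my-post/", "/us"))  # per-country option
```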
I would be interested to hear your feedback.
Thanks
-
Hi Jordan,
I've sent you a private message.
Thanks mate.
-
From my understanding and the research I have done, 302s do not pass much, if any, link juice. Matt Cutts has an article explaining some of the differences, and Moz also does an excellent job of explaining the trade-offs between the two. Since I do not have your URL, I cannot dig into the issue too much; if you would like, you can send me your business's URL and I can take a look for you. But I do not think the issue with the site is social signals; it could be a technical issue.
-
Hi Jordan,
Thanks for the response. Good to know about social signals.
In terms of the 302, my original understanding (through online research) was that by using a 302, google would report the original URL in search results but grab the display content from the page that it was redirected to. This is due to the fact the the redirected URL is based on the geographical location of the user.
If 302s do not pass any link juice, does that mean we gain no benefit from all the external websites linking to my root domain /, given that when visitors hit the root domain, it ends up as a 302 to their geographical region?
Thanks.
-
Generally, social signals do not hold much sway in terms of ranking. John Mueller from Google has stated that Google does not use social signals in its ranking algorithm. More than likely there is an underlying factor playing a role in why you may be ranking poorly. Go to Open Site Explorer and paste in your site's URL to see if there are any technical errors that may be playing a role. I also recommend crawling your site with Screaming Frog or one of Moz's tools.
Also, you said your system automatically redirects users with a 302 redirect; these pass little to no link juice. I would check how many 302 redirects you have in place and consider changing them to 301 redirects. Moz has a good page about redirects.
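One quick way to take that inventory: export a crawl (Screaming Frog and similar tools list each URL's status code) and flag the 302s that are candidates for conversion to 301. A rough sketch with made-up crawl data:

```python
# Each entry: (requested URL, HTTP status, redirect target or None).
# The URLs here are illustrative placeholders, not real crawl output.
crawl = [
    ("https://domain/", 302, "https://domain/us/"),
    ("https://domain/blog/my-post/", 302, "https://domain/us/blog/my-post/"),
    ("https://domain/old-page/", 301, "https://domain/new-page/"),
    ("https://domain/us/blog/my-post/", 200, None),
]

def temporary_redirects(rows):
    """Return the 302s, i.e. the redirects worth reviewing for a 301."""
    return [(url, target) for url, status, target in rows if status == 302]

for url, target in temporary_redirects(crawl):
    print(f"302: {url} -> {target}")
```

One caveat worth noting: a geolocation redirect is genuinely temporary by nature (different visitors get different targets), so blanket conversion to 301 may not fit every URL here.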
Hope this helps some!