Do I use a .org or .co.uk domain to help UK rankings?
-
Hi Guys,
I own two good domains, one a .org and the other a .co.uk.
Can anyone advise which is best to use to help UK rankings? Or does it not make much difference?
Thanks guys
Gareth
-
Thank you, appreciated.
-
Thanks, that's great!
G
-
If your website is specific to the UK only, then .co.uk would be best; if you are a not-for-profit, you may consider .org.uk.
-
Hi,
If you want to rank in the UK, you can use the .co.uk domain and also set a geographic target in Google Search Console (formerly Webmaster Tools). That is good practice for increasing visitors and rankings in the UK. When link building, favour links whose geographic location is the UK. A .org domain can also carry authority, but in my view .co.uk is the best choice.
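As a minimal sketch of the geo-signals mentioned above (the URLs are hypothetical placeholders, not from the thread), hreflang annotations in each page's <head> can reinforce which version of a site targets the UK:

```html
<!-- Hypothetical example: hreflang tags telling Google which page
     targets UK users and which targets US users. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<!-- Fallback for searchers who match neither locale. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

These tags complement, rather than replace, the ccTLD and Search Console geo-target settings discussed above.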
Related Questions
-
Can cross domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: **Can cross-domain canonicals help with international SEO when using ccTLDs and a gTLD, when the gTLD is much more authoritative to begin with?** I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, the problem, and the solutions I am considering testing. Thanks for taking the time to read this far!

The current setup

Multiple ccTLDs such as mysite.com (US), mysite.fr (FR), mysite.de (DE). Each TLD can have multiple languages; indeed each site has content in English as well as the native language, so mysite.fr (which defaults to French) and mysite.fr/en-fr serve the same page, the latter in English. Mysite.com is an older and more established domain with existing organic traffic.

Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French; each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.). Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, again with hreflang info for every ccTLD in every language. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment.

Every page on every site has two lists of links in the footer. The first is a list of links to every other ccTLD, so a user can easily switch between, say, the French site and the German site. Where possible this links directly to the corresponding piece of content on the alternative ccTLD; where that isn't possible, it links to the homepage. The second list links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search Console set to the US.

The problems

The biggest problem is that we didn't properly consider how we would need to start from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they receive only a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regard to domain authority. The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool while changing language, location, and domain.

Solutions

The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. This isn't really ideal for a number of reasons, so I'm trying to explore alternative routes that might help. The first thing that came to mind was cross-domain canonicals: create locale-specific subfolders on mysite.com and duplicate the ccTLD sites there, but use a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example: mysite.com/fr-fr has a canonical of mysite.fr, and mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos. Then I would change the links in the mysite.com footer so that they point at the subfolder URLs rather than the ccTLD URLs, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of the URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr. My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello
-
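For reference, one URL block of the sitemap-level hreflang markup described in the question above might look roughly like this sketch (the domains are the question's own placeholders; the German path is a hypothetical addition):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of one URL block in the French sitemap, listing the
     language/ccTLD alternates for the "about us" page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://mysite.fr/a-propos</loc>
    <xhtml:link rel="alternate" hreflang="fr-fr" href="https://mysite.fr/a-propos"/>
    <xhtml:link rel="alternate" hreflang="en-fr" href="https://mysite.fr/en-fr/about-us"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://mysite.com/about-us"/>
    <xhtml:link rel="alternate" hreflang="de-de" href="https://mysite.de/ueber-uns"/>
  </url>
</urlset>
```

The cross-domain canonical being considered would then sit on each duplicated subfolder page, e.g. mysite.com/fr-fr/a-propos carrying `<link rel="canonical" href="https://mysite.fr/a-propos"/>` in its head.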
Nice Domain Authority but Not Ranking
Hi, A client of mine who owns a website reached out to me. He got penalized a while ago and has long since recovered (not sure exactly when, but at least a year ago). His domain authority is in the upper 30s, but he is still not ranking for many of the keywords for which he previously ranked on the first page. I am not so familiar with the technical aspects of penalties, but is this a common scenario? Why is his domain authority great but his ranking downright awful? Does he have a chance if he builds great links, or is something else wrong that we can't figure out?
Intermediate & Advanced SEO | Rachel_J
-
6 months Later - 0 Domain Authority/Page Authority and losing Rankings
Hi Moz, Sorry if this comes across as a "do my job for me" type of post, but we are an e-commerce store that has been live since January and we have not seen any increase in performance on our site; over the past month we have even seen our rankings decrease. We have 1,300 products on site and about 1,500 pages in total.

1. For on-site optimization, we have had two reviews (with follow-up reviews) from a highly reputable reviewer on People Per Hour and fixed any issues she found.
2. We updated the meta data for products and the alt descriptions for images, focusing on the keywords we wish to rank for. We post weekly blog posts linking back to our products.
3. We run regular social media campaigns on Facebook, Pinterest, Google+ and Twitter.
4. We have attempted to build followed backlinks to articles relating to products on our site.

We have also considered purchasing backlinks to improve our situation, as we have yet to see any of these pages crawled by Google over a month later. I have read guides on Moz and other sites on how to improve our authority and rankings, but none have offered much by way of practical solutions. My question: is this just a matter of patience, or should I be worried and improving something, given we have 0 Domain Authority and Page Authority on all pages? Thanking you in advance, SEO Novice.
Intermediate & Advanced SEO | csworkwear
-
Rankings Drop since Hummingbird - Could It Be My Link Ratio between .co.uk / .com?
Hi All, I have a UK tool hire e-commerce multi-location website with different location pages for each category. My strategy has been to specialise in local search for each location, as opposed to trying to compete with highly competitive keywords at a national level. I do have some duplicate/thin content issues on these location pages, but I've been actively writing additional unique content on these pages to address the issue, while also making sure my title tags, h1 and h2 tags etc. are unique for each location, along with having individual Google+ Local pages. I have never previously been affected by duplicate content issues and always ranked on the first page (mainly top 5) for most of my local keywords. However, when the Google Hummingbird update came out, I suffered an approximately 25% drop in traffic and rankings. From what I've read, local search sites have suffered somewhat in this update, and I ran a link detox report to try to ascertain toxic links. I found a few, which I disavowed, but I have had no manual penalty message in my GWT, so I can only assume I was affected by a Google algorithmic penalty. Looking at Open Site Explorer, I can see the link ratio for my .co.uk site shows 43% .com and 37% .co.uk. I am wondering if this could be the cause of my local rankings falling? Has anyone else suffered the same? I am at my wits' end as to what factors could have caused such a drop. Any tips greatly appreciated. Happy to give my site's URL if anyone would like to take a look. Thanks, Sarah.
Intermediate & Advanced SEO | SarahCollins
-
Site rankings down
Our site is over 10 years old and has consistently ranked highly in google.co.uk for over 100 key phrases. Until the middle of April, we were 7th for 'nuts and bolts' and 5th for 'bolts and nuts' - we have been around these positions for 5-6 years easily now. Our rankings dropped mid-April, but now (presumably as a result of Penguin 2.0), we've seen larger decreases across the board. We are now 5th page on 'nuts and bolts', and second page on 'bolts and nuts'. Can anyone please shed any light on this? Although we'd fallen some before Penguin 2.0, we've fallen quite a bit further since. So I'm wondering if it's that. We do still rank well on our more specialised terms though - 'imperial bolts', 'bsw bolts', 'bsf bolts', we're still top 5. We've lost out with the more generic terms. In the past we did a bit of (relevant) blog commenting and obtained some business directory links, before realising the gain was tiny if at all. Are those likely to be the issue? I'm guessing so. It's hard to know which to get rid of though! Now, I use social media sparingly, just Facebook, Twitter and G+. The only linkbuilding I do now is by sending polite emails to people who run classic car clubs that would use our bolts, stuff like that. I've had a decent response from that, and a few have become customers directly. Here's our link profile if anyone would be kind enough as to have a look: http://www.opensiteexplorer.org/links?site=www.thomassmithfasteners.com Also, SEOMOZ says we have too many links on our homepage (107) - the dropdown navigation is the culprit here. Should I simply get rid of the dropdown and take users to the categories? Any advice here would be appreciated before I make changes! If anyone wants to take a look at the site, the URL is in the link profile above - I'm terrified of posting links anywhere now! Thanks for your time, and I'd be very grateful for any advice. Best Regards, Stephen
Intermediate & Advanced SEO | stephenshone
-
Geo-Domain Centralization - Helps or Hurts a Long-Term Campaign?
I have a client with nearly 100 geo-specific domains (example: serviceincity.com). The content is mostly duplicate; however, they weren't affected by Panda or Penguin, and most of the domains have a PR2-PR4. That doesn't mean they won't be eventually (I know). My strategy is to centralize all the city domains and 301 them to the main website (example: brandname.com/locations/city/). However, their IBL profile shows at least 50% of their inbound links coming from the geo-specific domains, which makes centralizing quite a scary thing for short-term rankings. Having these domains is obviously not scalable from a social media or video SEO perspective, and we all know that in the long term brand rules and domaining drools. Before I suggest that they 301 these domains, I thought I'd get feedback from the community. Will all that 301 redirecting pass weight to the primary domain and sustain rankings at a page level, or will it send a flag to Google that the site might have been using its own network of websites to game results? (Which wasn't the case; the owner was just hyper about dominating in each city.) Thanks in advance for your feedback.
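As a sketch of the 301 centralization described in the question above (assuming an Apache host; serviceincity.com and brandname.com are the question's own placeholder domains), each geo domain could redirect path-for-path to its subfolder on the main site:

```apache
# Hypothetical sketch: 301 every URL on a geo-specific domain to the
# matching path under the main brand domain's /locations/ structure.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?serviceincity\.com$ [NC]
  RewriteRule ^(.*)$ https://brandname.com/locations/city/$1 [R=301,L]
</IfModule>
```

Path-for-path redirects (rather than pointing everything at the homepage) are what preserve the page-level link equity the question is worried about losing.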
Intermediate & Advanced SEO | stevewiideman
-
Rankings Issue
Hey guys and gals, Our site http://www.motorcyclecenter.com/ is having the hardest time ranking in the big G. We've built links, optimized, and covered all the other basics. If you have a chance, could you take a glance and tell us what you think as to why this site is having such a hard time? Much appreciated!
Intermediate & Advanced SEO | leatherupseo
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax to let users select from filters. We want to dynamically insert elements into the URL as filters are selected so that search engines will index multiple combinations of filters, but we're struggling to see how this is possible using the Symfony framework. We've looked at www.gizmodo.com as an example of achieving SEO- and user-friendly URLs, but it only demonstrates this for static content. We would prefer not to go down the hashbang route if possible. Does anyone have experience using hashbangs, and how did they affect your site? Any advice on the above would be gratefully received.
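One hashbang-free approach to the question above is the HTML5 History API: build a real path from the selected filters and push it into the address bar without a reload. A minimal sketch, assuming hypothetical filter names and a `/products` base path (none of this is Symfony-specific; the server must also render these URLs directly for crawlers):

```javascript
// Sketch of hashbang-free filter URLs using the HTML5 History API.
// buildFilterUrl and the filter names are hypothetical examples.
function buildFilterUrl(basePath, filters) {
  // Turn selected filters into readable path segments, e.g.
  // "/products" + {colour: "red", size: "m"} -> "/products/colour-red/size-m"
  const segments = Object.keys(filters)
    .sort() // stable order so each filter combination has one canonical URL
    .map((key) => `${key}-${encodeURIComponent(filters[key])}`);
  return [basePath, ...segments].join("/");
}

// In the browser, update the address bar when a filter is applied;
// the guard keeps the function safe outside a browser environment.
function applyFilters(filters) {
  const url = buildFilterUrl("/products", filters);
  if (typeof history !== "undefined" && history.pushState) {
    history.pushState({ filters }, "", url);
  }
  return url;
}
```

Sorting the filter keys means `size` then `colour` and `colour` then `size` produce the same URL, which avoids indexing duplicate combinations.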
Intermediate & Advanced SEO | Sayers