Nominet have made a new geographic TLD available for the UK. How will this affect SEO?
-
Nominet have made a new TLD available: .uk. Some might argue that this is a cynical move by Nominet to get more money out of British businesses, but either way, we need to decide how to handle it. As I see it, we have four options.
1. Do nothing. At the moment, only existing .co.uk registrants can claim the matching .uk domain. That won't last forever, though, and eventually, if we don't register it, someone else will.
2. Register the domain but do nothing with it.
3. Register the domain and simply redirect it to the existing .co.uk domain. I suspect this is the best option.
4. Register the .uk domain and redirect the .co.uk domain to the new domain.
From a technical point of view, which is the best option? For businesses with multilingual sites the fourth appears best, but why do we need to act when we don't yet know the SEO value of any of this, or where Google sits regarding the new British TLD?
-
Hi guys
Thanks for your responses. You've confirmed what I suspected, especially the implications concerning ccTLDs.
There are none, to be honest. It's more about being tidy and organised, managing a domain's web presence in a systematic way, and not getting bogged down in other digital areas. If one gets a website technically organised in a way that complies with the search guidelines (redirects put where they should be, canonical URLs correct, broken links repaired, duplicate URLs and pages removed and/or redirected), then I foresee fewer problems for SMEs on the digital journey.
Cheers
-
ccTLDs differ from gTLDs in that they signal a specific affinity for a country (and .uk, unlike .co.uk, is also restricted to UK businesses). Both Google and users recognize this.
The only SEO consideration is: do you want to be seen as UK-specific or more general? There's nothing preventing you from doing either, but you do need to understand that that signal comes with a ccTLD.
As far as buying yourdomain.uk and pointing it at yourdomain.co.uk goes, there's no harm, but no benefit either.
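In practice that pointing is usually a one-line 301 rule in the web server or registrar control panel. Purely as an illustration of what such a host-level redirect does (yourdomain.uk and yourdomain.co.uk are hypothetical names), here is a minimal Python sketch:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical domains: everything requested on yourdomain.uk is sent to
# the same path on yourdomain.co.uk with a permanent (301) redirect.
CANONICAL_HOST = "https://yourdomain.co.uk"

def redirect_target(path: str) -> str:
    """Map a request path on the .uk domain to its .co.uk equivalent."""
    return CANONICAL_HOST + path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 rather than 302, so search engines treat the .co.uk host
        # as the permanent home and consolidate signals there.
        self.send_response(301)
        self.send_header("Location", redirect_target(self.path))
        self.end_headers()

# To actually serve this on the .uk domain you would run:
# HTTPServer(("", 80), RedirectHandler).serve_forever()
```

The key detail is that the full path is preserved, so deep links to the .uk host land on the matching .co.uk page rather than the homepage.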
-
Let's ignore the technical point of view for just a second (I know how easy it can be to get caught up in it) and look at this from a different angle:
Which option is easier to brand and better for the user? If you've got a .co.uk, you can secure the .uk and, as you mentioned, 301 it; job done, no worries. It will also stop a competitor with the same name from taking it; the same goes for the .com, really.
You don't get any real benefit from any of the TLDs; they all work the same, so don't panic!
I wouldn't worry about Nominet. They are not some large evil corporation trying to extract money; they are trying to free up domains and create more options, rather than have everyone think, "Wow, there are no more URLs left on the internet!"
My advice, if you are worried: get the .uk and redirect it to your already-established .co.uk. Then, if you want to swap at a later date, you can. It also helps if a customer accidentally goes to the wrong domain, and, as I mentioned, it stops any competition from taking it.
You can find more info here:
http://moz.com/ugc/an-seos-guide-to-acquiring-new-gtlds
http://moz.com/learn/seo/domain
Hope it helps & good luck.
Related Questions
-
How to get Google to index the new URL and not the old URL
Hi team, we are undertaking a domain migration to move our content from one domain to another. 1. The redirection of pages is handled at the reverse proxy level. 2. We do have 301 redirects in place. However, we still see that Google is indexing pages on our old domain as well as pages on the new domain. Is there a way for us to stop Google from indexing pages on the old domain? The recommendation to add noindex to the page meta tags and disallow in robots.txt does not work, since our redirection is set up at the reverse proxy and Google's crawlers always discover the new pages after redirection.
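One common reason an old domain lingers in the index after a migration is a rewrite that drops the path or query string, so old deep URLs don't map one-to-one onto new ones. A hedged sketch (hostnames are hypothetical) of the kind of path-preserving rewrite the reverse proxy should be doing:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hostnames for the migration.
OLD_HOST, NEW_HOST = "old.example.com", "new.example.com"

def migrate_url(url: str) -> str:
    """Rewrite an old-domain URL to its new-domain equivalent,
    preserving the path and query string exactly."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return url  # leave URLs on other hosts untouched
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, ""))
```

A mapping like this can also be run over the old domain's sitemap to audit that every indexed old URL 301s to a live new URL rather than to a generic landing page.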
Local Website Optimization
-
Novice SEO question: .co.uk & .com results
Would someone please explain to me why, when doing this search https://www.google.co.uk/search?pws=0&q=online+texas+hold+em, there are both .co.uk and .com pages ranking in the results for PokerStars, and how do I fix it? Thank you!
-
Seeking advice about my new landing pages for different cities
I have just created 6 new location landing pages for my Dallas insurance agency. Each one is for a different city, but I have a feeling I did it wrong 😞 Because my site is rather large, I put two different lines of insurance on each page: homeowners insurance and business insurance. Now I'm wondering if I should have done 12 different pages, i.e. **1 city + 1 product = 1 page**. Here's one of the new pages: http://thumannagency.com/personal-insurance/frisco-insurance I'm guessing here, but would it be better if the navigation were: thumannagency.com/personal-insurance/frisco thumannagency.com/business-insurance/frisco ??? Thank you so much in advance!!
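If you do go the "1 city + 1 product = 1 page" route, the URL set is simply the cross product of product lines and cities. A quick sketch (the city and product slugs below are illustrative, modeled on the URLs in the question):

```python
from itertools import product

# Hypothetical slugs: one page per (product, city) combination.
CITIES = ["frisco", "dallas", "plano"]
PRODUCTS = ["personal-insurance", "business-insurance"]

def location_page_paths(products, cities):
    """One focused URL per product/city pair, e.g. /personal-insurance/frisco,
    so each page can target a single product in a single city."""
    return [f"/{p}/{c}" for p, c in product(products, cities)]
```

The usual caveat applies: this only pays off if each generated page carries genuinely city-specific content, not the same copy with the city name swapped in.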
-
Repairing SEO issues on different platforms
I work for a car dealership in Southern California and have been tasked with a seemingly impossible job. They would like me to fix title tags, duplicate content, and descriptions, and get all other SEO issues in order. My concerns rank in this order: 1. Remove duplicate metadata: when the platform spits out new pages, it uses template titles/descriptions/keywords, and we are not always informed of their addition. There are also somewhere near 1K vehicles in the inventory that are being flagged for duplicate content/metadata. The fix I have been spitballing is adding canonical/nofollow tags to these pages. I am not sure that this is the best way forward, but I would appreciate feedback. 2. Duplicate content: most of the information is supplied by the manufacturer, so we have been sourcing the information back to the manufacturer's site. These pages are showing up on random "SEO tools" pulls as harmful to the site. Although we use the dealer's name and local area, the only way I can think to take the heat off and possibly fix any negative ramifications is, once again, a canonical/nofollow tag on these pages. 3. Clean-up issues: most of the other issues arise when the website platform dumps new pages onto the site without notice, creating more than 1K pages with duplicate everything. Please provide any assistance you can.
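One note on the duplicate-metadata point: rel=canonical and nofollow/noindex are separate mechanisms, and for vehicle pages whose duplicates differ only by query parameters, a canonical tag alone is often the appropriate consolidation tool. A minimal sketch (URLs are hypothetical) of declaring one canonical for all parameterized variants of a listing:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url: str) -> str:
    """Strip the query string and fragment so every parameterized variant
    of a listing URL declares the same clean canonical."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}" />'
```

Each variant page would emit this tag in its `<head>`, pointing the duplicates at the single preferred URL rather than blocking them outright.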
-
Schema training/resources for local SEO?
I am currently in the process of applying schema for dozens of clients (many are large retailers). Although I am not a developer, I do know the basics of schema markup and structured data. I work with a development team and I'm trying to provide them with best practices for applying schema. Obviously there are many good articles/blog posts out there about schema. However, I'm looking for a more substantial training course, webinar, or resource website about schema application. Does anybody have any good recommendations?
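As a reference point while evaluating resources: local-business schema is most commonly emitted as a JSON-LD script block in the page head. A minimal sketch of what that markup looks like (all field values below are placeholders, and a real deployment would add more properties such as opening hours and geo coordinates):

```python
import json

def local_business_jsonld(name, street, city, region, phone):
    """Build a minimal schema.org LocalBusiness block as JSON-LD,
    ready to drop into a page's <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
        },
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'
```

Output like this can be sanity-checked with Google's Rich Results Test before rolling it out across client sites.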
-
Understanding how a site redesign impacts SEO
Hi everyone, I have what I think is kind of a specific question, but I'm hoping you guys can help me figure out what to do. I have a client that recently changed their entire website (I started working with them after it happened, so I can't comment on what the site was like before). I know they were using a service of a kind I see a lot in the service industry that aims to capitalize on local business (i.e. "Leads Nearby" or "Nearby Now") by creating pages for each targeted city, and I believe collecting reviews for each city directly on the website. When they redesigned the website, they dropped that service, and now all those pages that were ranking in SERPs are coming back as 404s because they are not included in the new site (I apologize if this is getting confusing!). The site they moved to is a template site they purchased the rights to from an already successful company in the same industry, so I think the link structure probably changed, especially with all of the local pages that are no longer available. Note: I want to use discretion with company names, but I'm happy to share more info in a private message if you'd like to see the sites I am talking about 🙂 Has anyone had experience with something like this? I am concerned because even though I am targeting the keywords previously served by the local pages with existing pages, traffic to the website has dropped by nearly 60%, and I know my clients are going to want answers; right now, I only have guesses. I really look forward to, and would greatly appreciate, any advice you might be able to share. I'm at a bit of a loss right now.
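While the proper fix is to 301 each dropped city page to its best equivalent on the new site, with hundreds of 404ing URLs it can help to bootstrap that mapping automatically. A rough sketch using simple string similarity (the paths, cutoff, and homepage fallback below are all arbitrary illustrative choices, not a recommendation of specific values):

```python
from difflib import get_close_matches

def propose_redirects(dead_paths, live_paths, cutoff=0.6):
    """Suggest a 301 target for each dead (404ing) path by picking the
    closest-matching live path; fall back to the homepage if nothing
    is similar enough. Output should be reviewed by hand before use."""
    mapping = {}
    for path in dead_paths:
        match = get_close_matches(path, live_paths, n=1, cutoff=cutoff)
        mapping[path] = match[0] if match else "/"
    return mapping
```

Feeding in the 404 list from a crawl (or Search Console) and the new site's sitemap gives a draft redirect map to review, rather than letting the old rankings and links die outright.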
-
Can too many 301 redirects damage my e-commerce site? SEO issue
Hello all, I have an e-commerce website doing online hire. We operate from a large number of locations (approx. 100), and my 100 or so categories each have individual location pages against them. Example: carpet cleaners (category): www.mysite/hire-carpetcleaners
carpet cleaner hire Manchester: www.mysite/hire-carpetcleaners/Manchester
carpet cleaner hire London
carpet cleaner hire Liverpool
patio heater (category)
patio heater hire Manchester
patio heater hire London
patio heater hire Liverpool
And so on... I have unique content for some of these pages, but given that my site has 40,000-odd URLs, I do have a large amount of thin/duplicate content, and it's financially not possible to get unique content written for every single page across all my locations and categories. Historically I used to rank very well for these location pages, although this year things have dropped off, and recently I was hit by the Panda 4.0 update, which I understand targets thin content. Therefore, I am in the process of reducing the number of locations I want to rank for and have pages for, which will let me achieve a higher percentage of unique content over duplicate/thin content across the whole site and concentrate on only a handful of locations that I can realistically get unique content written for. My questions are as follows. By reducing the number of locations, my website will 301 redirect each location page I drop back to its parent category, e.g. the carpet cleaner hire Liverpool page will redirect back to the parent carpet cleaner hire page. Given that I have nearly 100 categories to do, the site will generate thousands of 301 redirects when I reduce down to a handful of locations per category. The alternative is that I can 404 those pages. What do you think I should do? Will it harm me to have so many 301s? It's essentially the same page with a location name in it redirecting back to the parent. Some of these do have unique content, but most don't. My other question: on some of these categories with location pages I currently rank very well locally, although there is no real traffic for these location-based keywords (according to Keyword Planner). Shall I bin them or keep them? Lastly, once I have reduced the number of location pages, I will still have thin content until I can get unique content written for it. Should I remove those pages until that point, or leave them as they are? It will take a few months to get unique content across the whole site. Once complete, I should be able to reduce my site from 40,000-odd pages down to, say, 5,000 pages. Any advice would be greatly appreciated, thanks.
Pete
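The redirect-to-parent rule described in this question is mechanical enough to generate rather than hand-write for thousands of URLs. A sketch (the location list is abbreviated from the examples in the question) that derives each 301 target by stripping the trailing location slug:

```python
import re

# Abbreviated, hypothetical list of the location slugs used on the site.
LOCATIONS = ("manchester", "london", "liverpool")
LOCATION_SUFFIX = re.compile(r"/(%s)/?$" % "|".join(LOCATIONS), re.IGNORECASE)

def parent_category(path: str) -> str:
    """301 target for a dropped location page: the same path with the
    trailing location slug removed, i.e. the parent category page."""
    return LOCATION_SUFFIX.sub("", path) or "/"
```

Run over the full URL list, this yields the old-to-new pairs for the server's redirect map; category pages without a location suffix pass through unchanged.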
Will subdomains with duplicate content hurt my SEO? (Solutions for ranking in different areas)
My client has offices in various areas of the US, and we are working to have each location/area rank well in its specific geographic location. For example, the client has offices in Chicago, Atlanta, Dallas & St. Louis. Would it be best to: 1) Set up the site structure to have an individual page devoted to each location/area, so there's unique content relevant to that particular office? This keeps everything under the same, universal domain and would allow us to tailor the content and all SEO components towards Chicago (or another location). (example.com/chicago-office/ ; example.com/atlanta-office/ ; example.com/dallas-office/ ; etc.) 2) Set up subdomains for each location/area, using basically the same content (same service, just a different location)? We're not sure if search engines would consider this duplicate content from the same owner and penalize us. Furthermore, even if the subdomains are considered different sites, what do search engines think of the duplicate content? (chicago.example.com ; atlanta.example.com ; dallas.example.com ; etc.) 3) Set up subdomains for each location/area and draft unique content on each subdomain, so search engines don't penalize the subdomains' pages for duplicate content? Does separating the site into subdomains dilute the overall site's quality score? Can anyone provide any thoughts on this subject? Are there any other solutions anyone would suggest?
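Under option 1, the location pages live as subdirectories of a single domain, and the duplicate-content risk is managed by giving each page its own city-specific title and copy. A small sketch of that URL and title scaffolding (the domain, brand suffix, and title pattern are placeholders, not a prescribed format):

```python
# Hypothetical office list matching the cities in the question; one
# subdirectory page per city under a single domain (option 1).
OFFICES = ["chicago", "atlanta", "dallas", "st-louis"]

def location_pages(domain, offices):
    """Return (url, title) pairs: one subdirectory page per city, each
    with a city-specific title rather than one shared template."""
    return [
        (f"https://{domain}/{city}-office/",
         f"{city.replace('-', ' ').title()} Office | Example Co")
        for city in offices
    ]
```

The same loop is where you would vary the body copy, address block, and local schema per city; if the content can't be made unique per location, subdirectories with near-duplicate pages have the same thin-content problem as subdomains.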