.com vs .co.uk
-
Hi, we are a UK-based company and we have a lot of links from .com websites. Does the fact that a linking site is a .com rather than a .co.uk affect the quality of the links for a UK website?
-
Whether the linking domain is a .com or a .co.uk does not matter; what matters is whether it is a quality site. If it is a quality website, the link can help you grow.
-
Hi there,
In short, it won't matter too much as long as the links you're getting are good quality and as relevant as possible to what you do. It's natural to get links from a wide range of domain extensions, and there are perfectly legitimate reasons for this. For example, you may get a link from a .com website that is actually a UK-based business which decided to use that extension instead of a .co.uk.
Overall, if the links look good quality and relevant, don't worry.
With that said, there is a concept that "local" links mean more if you're serving a particular region or market. So it may make a small difference, but it's not something I'd worry loads about.
Hope that helps!
Paddy
-
There are high-DA .com domains such as google.com, so there is nothing wrong with having backlinks from .com domains. It's always recommended to check the authority and spam score of every linking domain regardless of its extension. The point is that a ".com" is not necessarily lower quality than a ".co.uk".
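To make that check concrete, here is a minimal Python sketch that flags linking domains worth a closer look. It assumes a hypothetical CSV export from your backlink tool with "domain", "domain_authority" and "spam_score" columns, and the thresholds are illustrative rather than a rule.

```python
# Minimal sketch: review linking domains by authority and spam score,
# regardless of the extension (.com, .co.uk, etc.).
# Assumes a hypothetical CSV export with "domain", "domain_authority"
# and "spam_score" columns -- adjust the names to match your tool.
import csv

def review_linking_domains(path, min_da=20, max_spam=30):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            da = int(row["domain_authority"])
            spam = int(row["spam_score"])
            if da < min_da or spam > max_spam:
                print(f"Worth reviewing: {row['domain']} (DA {da}, spam score {spam})")
            else:
                print(f"Looks fine: {row['domain']} (DA {da}, spam score {spam})")

if __name__ == "__main__":
    review_linking_domains("linking_domains.csv")  # placeholder filename
```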
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi There, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages that are competing for the same keywords. For example, I just started to update a core category page - Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/) to focus mainly around the keywords ‘Anaesthetist Jobs’. But I have recognized that there are ongoing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ https://wave.com.au/asa/ We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This then leads me to have to deal with the duplicate pages with either a canonical link (content manageable) or maybe alternatively adding a nofollow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use Google Index in the sitemap to tell Google that these are of less value? What is the best approach? Should I add a canonical link to the landing pages pointing to the category page? Or alternatively, should I use the Google Index? Or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
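As a small illustration of the canonical option discussed in the question above, here is a hedged Python sketch that reports which canonical URL each of the pages mentioned currently declares, so you could verify whether the landing pages point at the core category page. It only inspects the pages; whether a canonical, noindex, or consolidation is the right call is a separate decision.

```python
# Minimal sketch: print the rel="canonical" URL each page declares.
# The URLs are the ones mentioned in the question above; this only reads
# the pages and makes no recommendation about which directive to use.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def get_canonical(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

for page in [
    "https://wave.com.au/job-specialties/anaesthetics/",
    "https://wave.com.au/anaesthetists/",
    "https://wave.com.au/asa/",
]:
    print(page, "->", get_canonical(page))
```
-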
Top hierarchy pages vs footer links vs header links
Hi All, We want to change some of the linking structure on our website. I think we are repeating some non-important pages in the footer menu, so I want to move them to second-level pages and bring some important pages into the footer menu. But I'm not sure which pages get more influence: the top menu, the bottom menu, or normal pages? What is the best place to link non-important pages so that link juice is not diluted by passing through them? And what is the right place for "keyword pages" that need to influence our rankings for those keywords? One thing to note is that we cannot highlight pages created from a keyword perspective in the top menu. Thanks
Intermediate & Advanced SEO | vtmoz
-
Should we host our magazine on a subdomain of our e-commerce site or on its own domain?
We host an online fashion magazine on a subdomain of our e-commerce site. Currently the blog, which is WordPress-based, is hosted on a subdomain, e.g. stylemag.xxxxxxx.com. First question: are all the links from our blog considered internal links? They do not show in the backlinks profile. Also, would it be better to host this on its own domain? Second question: is my main URL getting credit for the unique content published to the blog on the subdomain, and if so, is it helping the overall SEO of my website more than if it and the links were hosted on its own wordpress.com site?
Intermediate & Advanced SEO | kushvision
-
Location.href vs href?
I just got off a Google Hangout with John Mueller and was left a little confused about his response to my question. If I have an internal link implemented as a div with an onclick event that uses location.href, will it have the same SEO impact as a standard anchor link? John said that as you are unable to attribute a nofollow in an onclick event, it would be treated as a naked link and would not pass PageRank but would still be crawled. Can anyone confirm that I understood it correctly? If so, should all my links that have such an onclick event also have an HTML a href in them too? Many times it is more useful for the customer to be able to click on any area of a large div, and not just the link, to get to the intended destination. Clarification on this subject would be very useful; there is nothing easily found online to confirm this. Thanks
Intermediate & Advanced SEO | gazzerman1
-
Broken sitemaps vs no sitemaps at all?
The site I am working on is enormous. We have 71 sitemap files, all linked to from a sitemap index file. The sitemaps are not up to par with "best practices" yet, and realistically it may be another month or so until we get them cleaned up. I'm wondering if, for the time being, we should just remove the sitemaps from Webmaster Tools altogether. They are currently "broken", and I know that sitemaps are not mandatory. Perhaps they're doing more harm than good at this point? According to Webmaster Tools, there are 8,398,082 "warnings" associated with the sitemap, many of which seem to be related to URLs being linked to that are blocked by robots.txt. I was thinking that I could remove them and then keep a close eye on the crawl errors/index status to see if anything changes. Is there any reason why I shouldn't remove these from Webmaster Tools until we get the sitemaps up to par with best practices?
Intermediate & Advanced SEO | edmundsseo
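Since the warnings described above point at sitemap URLs that robots.txt blocks, here is a minimal Python sketch that lists exactly those URLs for a single sitemap file. The sitemap and robots.txt addresses are placeholders rather than the site in the question, and checking one file at a time just keeps the example small.

```python
# Minimal sketch: list sitemap URLs that robots.txt disallows -- the kind of
# mismatch behind "blocked by robots.txt" sitemap warnings in Webmaster Tools.
# The sitemap and robots.txt URLs below are placeholders; swap in your own.
import urllib.robotparser
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap1.xml"  # placeholder
ROBOTS_URL = "https://www.example.com/robots.txt"     # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

robots = urllib.robotparser.RobotFileParser(ROBOTS_URL)
robots.read()

tree = ET.parse(urlopen(SITEMAP_URL))
for loc in tree.findall(".//sm:url/sm:loc", NS):
    url = loc.text.strip()
    if not robots.can_fetch("*", url):
        print("Blocked by robots.txt:", url)
```
-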
.com ranking over other ccTLDs that were created
We had an ecommerce website that used to function as the website for every other locale we had around the world. For example, the French version was Domain.com/fr_FR/ and a German version in English would be Domain.com/en_DE/. Recently we moved all of our larger international locales to their corresponding ccTLDs, so now we have Domain.fr and Domain.de. (This happened about two months ago.) The problem with this is that we are getting hardly any organic traffic and sales on these new TLDs. I am thinking this is because they are new, but I am not positive. If you compare the traffic we used to see on the old domain versus the traffic we see on the new domains, it is a lot less. I am currently going through to make sure that all of the old pages are taken down, and the next thing I want to know is: for the old pages, would it be better to use a 301 redirect or a rel=canonical to the new ccTLD to avoid duplicate content and stop those old pages from outranking our new pages? Also, what are some other causes for our traffic being down so much? It just seems that there is a much bigger problem, but I don't know what it could be.
Intermediate & Advanced SEO | DRSearchEngOpt
-
URL Structure - Keywords vs. Information Architecture/Navigation
I'm creating the URL structure for an ecommerce site and was wondering if it's better to structure my URLs according to the most popular way people word their key phrases or by what makes most sense from a navigation perspective. Let's say I'm selling clothing (I'm not, just an example). I want the site to be open enough so a user can navigate by Person Type (Men's, Women's, Children's), Clothing Type (Shoes, Shirts, Hats), and Brands (Nike, Reebok, adidas). My gut and past experience say to structure the URLs from the least specific to the most specific: mysite.com/mens/shoes/nike But I know "men's Nike shoes" is searched for more than "men's shoes Nike", which would render this URL: mysite.com/mens/nike/shoes I know mysite.com/mens-nike-shoes would be best, but the folder setup is what I have to work with. So which is best for SEO? URLs that play to the structure of the most searched-for key phrases? Or URLs that follow the information architecture/navigation of a site? Nate
Intermediate & Advanced SEO | rball1
-
Geo-targeted homepage for users vs crawlers
Hello there! This is my first post here on SEOmoz. I'll get right into it then... My website is housingblock.com, and the homepage runs entirely off of geo-targeting the user's IP address to display the most relevant results immediately to them. Can potentially save them a search or three. That works great. However, when crawlers frequent the site, they are obviously being geo-targeted for their IP address, too. Google has come to the site via several different IP addresses, resulting in several different locations being displayed for it on the homepage (Mountain View, CA or Clearwater, MI are a couple). Now, this poses an issue because I'm worried that crawlers will not be able to properly index the homepage because the location, and ultimately all the content, keeps changing. And/or, we will be indexed for a specific location when we are in fact a national website (I do not want to have my homepage indexed/ranked under Mountain View, CA, or even worse, Clearwater, MI [no offence to any Clearwaterians out there]). Of course, my initial instinct is to create a separate landing page for the crawlers, but for obvious reasons, I am not going to do that (I did at one point, but quickly reverted back because I figured that was definitely not the route to go, long-term). Any ideas on the best way to approach this, while maintaining the geo-targeted approach for my users? I mean, isn't that what we're supposed to do? Give our users the most relevant content in the least amount of time? Seems that in doing so, I am improperly ranking my website in the eyes of the search engines. Thanks everybody! Marc
Intermediate & Advanced SEO | THB