Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What's the point of an EU site?
-
Buongiorno from 18 degrees C Wetherby UK

On this site http://www.milwaukeetool.eu/ the client wants to hold on to the EU site despite there being multiple standalone country sites, e.g. http://www.milwaukeetool.fr & http://www.milwaukeetool.co.uk
Why would you ever need an EU site? I mean, who ever searches for an EU site? If the client holds on to the .eu site despite my position that it's a waste of time from a search perspective, is the following the best appeasement?
When a user enters the .eu URL, it redirects to the detected country; e.g. I'm in Paris, I enter www.milwaukeetool.eu and it redirects to http://www.milwaukeetool.fr. My feeling is this would be the most pragmatic thing to do?
Any ideas please,
Ciao,
David -
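A minimal sketch of the geo-detection redirect described in the question, assuming a Python/Flask front end answering on the .eu domain and a placeholder country lookup (a real deployment would use a geo-IP database); the country-to-site mapping is illustrative:

from flask import Flask, redirect, request

app = Flask(__name__)

# Illustrative mapping of detected country codes to the standalone country sites.
COUNTRY_SITES = {
    "FR": "http://www.milwaukeetool.fr",
    "GB": "http://www.milwaukeetool.co.uk",
}
DEFAULT_SITE = "http://www.milwaukeetool.co.uk"  # assumed fallback, not from the thread

def lookup_country(ip_address: str) -> str:
    """Placeholder: a real site would resolve this with a geo-IP database."""
    return "FR"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def geo_redirect(path):
    country = lookup_country(request.remote_addr)
    target = COUNTRY_SITES.get(country, DEFAULT_SITE)
    # 302 (temporary) because the destination depends on the visitor, not the URL.
    return redirect(f"{target}/{path}", code=302)

If the client is instead happy to consolidate on one primary domain, as the first answer below suggests, a blanket 301 of every .eu path to the same path on that domain is simpler and easier for search engines to follow than a per-visitor redirect.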
The .eu extension is a generic TLD, hence it is not bound to geo-targeting in Google Webmaster Tools. In that sense, it is an alternative to the .com extension if the .com is not available.
It was created by the European Union as a way to "communicate" that the business owning the domain has a European nature, that it is based in a nation of the E(uropean) U(nion), and that its primary market is the EU.
From an SEO point of view, it doesn't offer any real advantage with respect to any other generic domain name:
-
you can't geo-target multiple countries with a single domain name
-
you can't geo-target political regions (or continents).
Hence, it is good to have it for defending your brand, and to use it if the .com (or .net) has already been taken. But if you have a .com, then it is better to redirect the .eu to it.
-
-
I had that argument with a client once and I managed to persuade them not to go ahead with it, as they had the .com and .co.uk.
.com is international, so I think it made sense not to use .eu. But in your case, if you don't have the .com, then you probably need to look at which countries you want to target. If your client is UK based and they target clients from all around the EU, then it might make sense to use .eu, as you have very little chance of targeting someone in Italy with .co.uk or .fr domains. If you are targeting only the UK and FR, then you don't need .eu. It will just duplicate your work.
-
Hello
I had the same situation with customers recently. The best approach I've found was to tell them:
"You're definitely right, we cannot lose the .eu. So we're going to redirect it to BESTDOMAINE.com as the main website. That way people will still find you, you'll get more customers, and you'll keep the .eu."
Then you can provide some technical arguments explaining that if he sticks to this position he will lose business.
In my case it worked out.

From 50° in Dubai

Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based in subdirectories (of the mainsite.com/site1 kind) where the main site is like a company site, and subsites are focused on brands or projects of that company. There will be links back and forth from the main site and the subsites, as if subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but as the URL structure is the same as for a single site, you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel, that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that if I use the subdirectory-based multisite feature I will be able to map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | | PabloCulebras0 -
What's the best tool to visualize internal link structure and relationships between pages on a single site?
I'd like to review the internal linking structure on my site. Is there a tool that can visualize the relationships between all of the pages within my site?
Web Design | | QBSEO0 -
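For the internal-link visualization question above, a rough sketch of the crawl-and-graph approach most such tools use: fetch each internal page, record every internal link as an edge, then export the graph for a visualizer such as Gephi. It assumes the requests, beautifulsoup4 and networkx packages, and https://example.com/ stands in for the real site:

from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup
import networkx as nx

START = "https://example.com/"          # placeholder for the site being reviewed
DOMAIN = urlparse(START).netloc

graph = nx.DiGraph()
to_visit, seen = [START], {START}

while to_visit and len(seen) < 500:     # small cap for the sketch
    page = to_visit.pop()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != DOMAIN:
            continue                    # keep internal links only
        graph.add_edge(page, link)
        if link not in seen:
            seen.add(link)
            to_visit.append(link)

nx.write_gexf(graph, "internal-links.gexf")  # open the file in Gephi to visualize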
Anyone using CloudFlare on multiple sites?
We are considering using CloudFlare as a CDN for a large group of sites. The fees are $5 to $200 depending on many factors. We tried the free trial on one site and were impressed with the results. I am wondering if any of you have any longer term experience with this and performance metrics, etc.
Web Design | | RobertFisher1 -
Lots of Listing Pages with Thin Content on Real Estate Web Site-Best to Set them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories: 1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) 25 PAGES, low bounce rate. 2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space)
Web Design | | Kingalan1
15 PAGES, low bounce rate. 3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take)
30 PAGES, medium/high bounce rate. 4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space)
3 PAGES, high bounce rate. 5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do)
4 PAGES, high bounce rate. 6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf)
300 PAGES, high bounce rate (65%), thin content. 7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway)
300 PAGES, very high bounce rate (exceeding 75%). Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index, Follow" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index, Follow" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index, Follow"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012 with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April and a disavow file was submitted with Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan0 -
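The listings/buildings part of the question above comes down to emitting a "noindex, follow" robots directive on just the thin pages. A minimal sketch of one way to do that with an HTTP header, assuming a Flask app and that the thin pages share a URL prefix (the /listings/ prefix is illustrative); the equivalent on-page version is a robots meta tag with content="noindex, follow":

from flask import Flask, request

app = Flask(__name__)

THIN_PREFIXES = ("/listings/",)  # hypothetical: URL sections with thin content

@app.after_request
def add_robots_header(response):
    # "noindex, follow" drops the page from the index but still lets Google
    # crawl and pass value through its links; it is not the same as "nofollow".
    if request.path.startswith(THIN_PREFIXES):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response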
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | | danatanseo0 -
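For the unused/duplicate CSS question above, a rough sketch of the underlying idea rather than a substitute for a dedicated tool: pull class selectors out of a stylesheet with a regex, collect the classes actually used across a set of crawled pages, then report selectors that look unused or are defined more than once. File names and URLs are placeholders, and a real audit would use a proper CSS parser plus the full crawl list (e.g. exported from Screaming Frog):

import re
from collections import Counter
import requests
from bs4 import BeautifulSoup

css = open("styles.css", encoding="utf-8").read()            # placeholder stylesheet
defined = Counter(re.findall(r"\.([A-Za-z_-][\w-]*)", css))  # crude ".class" extraction

used = set()
for url in ["https://example.com/", "https://example.com/about"]:  # crawled URLs
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup.find_all(class_=True):
        used.update(tag["class"])

unused = sorted(c for c in defined if c not in used)
duplicates = sorted(c for c, n in defined.items() if n > 1)
print("possibly unused:", unused)
print("defined more than once:", duplicates)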
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: Consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Now let's say some of these exist on just a normal .com page while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | | DRSearchEngOpt0 -
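For the hreflang question above, hand-maintaining 10-15 alternate links per URL quickly gets error-prone, so the block usually ends up generated. A small sketch with an illustrative locale-to-URL map; each page's block emits every locale, including a self-referencing entry for the page's own locale, which is what Google's hreflang guidance expects:

LOCALES = {
    "en-GB": "http://www.example.co.uk/en_GB/",
    "en-AU": "http://www.example.com.AU/en_AU/",
    "en-NZ": "http://www.example.co.NZ/en_NZ/",
}

def url_entry(own_locale: str) -> str:
    # Builds one <url> element for the sitemap of the site serving own_locale.
    lines = ["<url>", f"  <loc>{LOCALES[own_locale]}</loc>"]
    for hreflang, href in LOCALES.items():
        lines.append(
            f'  <xhtml:link rel="alternate" hreflang="{hreflang}" href="{href}" />'
        )
    lines.append("</url>")
    return "\n".join(lines)

print(url_entry("en-GB"))  # paste/write into the .co.uk sitemap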
Footer backlinks for sites I've developed
I link back to my website via my company name in the footers of sites I develop. Lately I've been changing this to my keyword and mixing and matching. This has been done for new sites I create and old sites, and I've not seen any benefit so far after a couple of months. Most of my clients are hosted on the same server as the main site they link back to. 1. Is it a bad idea to link back on the same IP?
2. Are footer backlinks to the main developer going to annoy Google?
3. Should I change my main site's server, and will it help? All my competitors seem to do it, and as far as I can tell they seem to get better results than I do. Is the fact that I'm now changing them the reason I see no benefit? Thanks
Web Design | | sanchez19600
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters - most notably: ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs, EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web However when I simply copy-paste a URL that contains a special character it is automatically translated and encoded. EX: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (When written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone) My first question is, seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even for Spanish/German characters these are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (so do most other competitors). Does the anchor text have to match exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query (or not). But we can't help thinking: if we are doing the right thing, then why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken link check with some link tools (such as Xenu) the URLs that contain the special characters in Spanish are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | | wdziedzic0
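For the special-characters question above: the readable URL and the encoded one are the same resource, because browsers transmit non-ASCII path characters as percent-encoded UTF-8. A quick sketch of both directions, plus the ASCII-only, lowercase slug many sites choose so the question never comes up (the transliteration shown is deliberately minimal):

from urllib.parse import quote, unquote

path = "/expert/Diseño-Web/Diseño-Web"
encoded = quote(path)    # '/' is kept; 'ñ' becomes its UTF-8 bytes, %C3%B1
print(encoded)           # /expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
print(unquote(encoded))  # back to the readable form

# An ASCII-only, lowercase slug sidesteps both encoding and capitalization issues.
ascii_slug = "Diseño-Web".replace("ñ", "n").replace("ó", "o").lower()
print(ascii_slug)        # diseno-web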