Do I need to change my country og:locale to en_AE
-
Hi Moz, I have a site aimed at the English-speaking market of the United Arab Emirates. The language tag is currently set to lang="en-GB" and the og:locale is also set to en_GB.
The domain is a .com and is aimed at the whole world.
Should I be targeting en-AE (and en_AE for Open Graph) with these tags instead of the GB variants?
-
OK, cool. Thanks for the tips, Oleg.
-
You should use og:locale:alternate and set og:locale to the locale you primarily want to present; see the Open Graph protocol documentation for details.
Overall, since it's English in both scenarios, I don't think you need to worry about it.
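For reference, the head markup being discussed would look roughly like this. This is only a sketch, assuming the UAE audience is primary and British English is kept as an alternate; the domain is a placeholder:

```html
<!-- Sketch only: en_AE as the primary Open Graph locale, en_GB as an alternate -->
<html lang="en-AE">
<head>
  <meta property="og:locale" content="en_AE" />
  <meta property="og:locale:alternate" content="en_GB" />
  <!-- hreflang is a separate mechanism, but serves a similar geo/language-targeting purpose -->
  <link rel="alternate" hreflang="en-ae" href="https://www.example.com/" />
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
</head>
```

Note that Open Graph locales use an underscore (en_AE) while the lang attribute and hreflang use a hyphen (en-AE).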
Related Questions
-
Is there a way to "protect" yourself from non-local traffic?
I'll start with the story, but the main question is at the bottom, so feel free to scroll down :-). I've got good news and bad news regarding a client of mine. It's a service-area business that only serves one metropolitan area. We've got a great blog with really valuable content that truly helps people while firmly establishing my client's industry expertise. As a result, local traffic has spiked and the company generates more leads. So that's the good news. The bad (bad-ish?) news is that the client also gets tons of traffic from outside the service area. Not only that, people are calling them all the time who either live in a different state and don't realize that the company isn't local to them, or who are located out of state but are calling for free advice. On one hand, the client gets a kick out of it and thinks it's funny. On the other hand, it's annoying, and they're having to train all their intake people to ask for callers' locations before they chat with them. Some things we're doing to combat this problem:
1. The title tag on our home page specifies the metro area where we're active.
2. Our blog articles frequently include lines like, "Here in [name of our city], we usually take this approach."
3. There are references to our location all over the site.
4. We've got an actual location page with our address; for that matter, the address is listed in the footer on every page.
5. The listed phone number does not begin with 800; rather, it uses the local area code.
6. All of our local business listings, including our Google My Business listing, are up to date.
7. We recently published a "Cities We Serve" area of the site with highly customized/individualized local landing pages for 12 actual municipalities in our metro region. This will take some time to cook, but hopefully it will help. "Cities We Serve" is not a primary navigation item, but the local landing pages are situated as follows: "About Us > Cities We Serve > [individual city page]"
Anyway, here's my main question: in light of all this, is there any other way to shield my client from all this irrelevant traffic and protect them from time-wasting phone calls?
Local Website Optimization | | Greenery0 -
Need a local SEO expert's opinion regarding a client trying to improve their rankings
I'm working with a business right now that wants to improve its rankings in a very competitive legal niche. Are there any local SEO gurus out there who would be willing to explain in a paragraph or two what's going wrong? Let me know if you'd like to help and I'll PM you the domain.
Local Website Optimization | | BrianJGomez0 -
Is this local guide best to follow?
Today I found the guide below. Is it a good guide to follow for website and service-page content and layout design? http://www.ducttapemarketing.com/blog/guide-to-local-seo/
Local Website Optimization | | Michael.Leonard0 -
Title Tag, URL Structure & H1 for Localization
I am working with a local service company. They have one location but offer a number of different services to both residential and commercial verticals. What I have been reading seems to suggest that I put the location in URLs, title tags, and H1s. Isn't it kind of spammy, and possibly an annoying user experience, to see the location on every page?
Portland ME Residential House Painting
Portland ME Commercial Painting
Portland Maine commercial sealcoating
Portland Maine residential sealcoating
etc., etc.
This strikes me as an old-school approach. Isn't Google more adept at recognizing location now, so that I don't need to paste it into H1s all over the site? Thanks in advance. Patrick
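One common middle-ground pattern is to put the location in the title tag and structured data while keeping the visible headings natural. This is a hypothetical sketch (company name and page are placeholders, not from the question):

```html
<!-- Hypothetical service page: location carried by the title tag,
     while the H1 stays natural and reader-friendly -->
<title>Residential House Painting in Portland, ME | ExampleCo</title>
...
<h1>Residential House Painting</h1>
```

This keeps the geo signal where search engines weight it most without repeating "Portland ME" in every on-page heading.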
Local Website Optimization | | hopkinspat0 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.
Say we have a restoration-service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages', which include 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We are using major aggregators to distribute our local citations for our branch offices. We make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, callers' locations, etc. We are testing CallRail to start monitoring the landing pages and keywords that generate our leads.
Parts that I want to change: Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words, all talking about services we provide, although our writers (4 of them) manipulate them so that they aren't duplicate pages. They add about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is tested based only on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen then you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time). They check ranking every day for about a week to determine whether an experiment was a success.
What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dig into call metrics and their sources.
Now I am at a roadblock, because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable. We'd either noindex these or canonicalize them; neither works for testing ranking on the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
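The local business schema markup mentioned above is typically a small JSON-LD block per branch page. This is a minimal sketch; every name, number, and address value is a placeholder, not data from the question:

```html
<!-- Minimal LocalBusiness JSON-LD sketch; all values are hypothetical placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Restoration - Los Angeles",
  "url": "https://restorationcompanylosangeles.com/",
  "telephone": "+1-213-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Los Angeles",
    "addressRegion": "CA",
    "postalCode": "90001",
    "addressCountry": "US"
  }
}
</script>
```

Keeping the name, address, and phone values here identical to the NAP used in citations is what makes the markup reinforce, rather than dilute, local consistency.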
Local Website Optimization | | MilestoneSEO_LA1 -
Drastic changes in keyword rankings on a daily basis
Anybody ever seen keyword rankings for a site change drastically from day to day? I've got a client, a local furniture store, whose local keywords (furniture + city) rank consistently well without much change, but when it comes to broader keyword rankings (like "furniture" or "furniture store") in their zip code, they'll go from ranking at the top of Google one day to not being ranked at all the next (at least according to Raven Tools). My best guess is that it's just a reflection of personalized results from Google, but such a dramatic change day in and day out makes me wonder.
Local Website Optimization | | ChaseMG0 -
Can I use a state's slang term for local search?
Have a business located in Indianapolis, Indiana. The business name will be BusinessName Indy. The URL will be BusinessName-Indy.com Since I am using Indy instead of Indianapolis or Indiana, is Google's algorithm smart enough to match up local results to my site?
Local Website Optimization | | StevenPeavey1 -
Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
Greetings Mozzers, This is a long question, so please bear with me 🙂 We are an IT and management training company that offers over 180 courses on a wide array of topics. Our students can attend these courses in multiple ways, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers, where you can physically go to a particular location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with the students and the instructor as if you're in the classroom. Recently, we've opened 43 AnyWare centers, giving way to excellent localization search opportunities for our website (e.g. "sharepoint training in new york", or whatever city we are located in). Each location has a physical address, a phone number, and an employee working there, so we pass the standards for existence on Google Places (which I've set up).
So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations) we get over 800 new pages that will eventually need to be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have seen great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (44 local pages are live in total right now):
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is: how do we launch the remaining hundreds of pages in a "white hat" manner? I'm a big fan of white-hat techniques and of not angering Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but the courses differ, as do the addresses from location to location. After watching Matt Cutts's video about adding too many pages at once (http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058), I'd prefer to proceed cautiously, even if the example he uses in the video involves tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it 🙂 But I still don't want to be reviewed manually, lol. So, in what intervals should we launch the remaining pages to avoid raising any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research, I need to proceed somehow, the right way. Thanks again, and sorry for the detailed message!
Local Website Optimization | | CSawatzky0