SEO geolocation vs subdirectories vs local search vs traffic
-
My dear community and friends of Moz, today I have a very interesting question for you all.
Although I've got my own opinion, and I'm sure many of you will think the same way, I want to share the following dilemma with you.
I have just joined a company as Online Marketing Manager and I have to make a quick decision about site structure.
The company's site has just undergone a big structural change. They used to have their information divided by country (one subdirectory per country): www.site.com/ar/news, www.site.com/us/news. They have just erased the country subdirectory and started using geolocation instead. So if we go to www.site.com/news, the content is the same for every country (it's a Latin American site; all the countries speak the same language except Brazil), but the navigation links drive you to different pages according to the country where you are located.
They believe that with fewer subdirectories, PA or PR will be higher for each page due to less link juice leaking away. My guess is that if you want an important organic traffic presence you should either A) get a country TLD (ccTLD) for each country you want to target, or B) have a subdirectory or subdomain for each country on your site.
I don't know what local signal a page could give to Google if the URL and HTML don't change between countries. We can't use schema or rich formats either... So, again, I would suggest going back to the previous structure. On the other hand, I've been taking a look at sensacine.com, and although their site targets only Spain,
they have very good rankings for big-volume keywords across all of Latin America. So I just want to quantify this change, since I will be sending the designers and developers a lot of work.
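To make the alternative concrete: if we went back to country subdirectories, each version could declare its country targeting with hreflang annotations, roughly like this (a sketch using the example paths above; the language-country codes are my assumption):

```html
<!-- On www.site.com/ar/news (and mirrored on every country version) -->
<link rel="alternate" hreflang="es-ar" href="http://www.site.com/ar/news" />
<link rel="alternate" hreflang="es-us" href="http://www.site.com/us/news" />
<!-- ...one line per country, plus a generic Spanish fallback -->
<link rel="alternate" hreflang="es" href="http://www.site.com/news" />
<link rel="alternate" hreflang="x-default" href="http://www.site.com/news" />
```

Each country folder could then also be geotargeted separately in Google Webmaster Tools' geotargeting settings.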
-
Hi.
Normally, changing the content but not the URL via scripts is not a good idea in international SEO. A good example of this practice is Dropbox, which geolocates users and serves different languages but never changes the URL of its homepage, which always remains dropbox.com.
Regarding Sensacine.com (a site I know well, because I live in Spain), the success it also has in Spanish-speaking Latin America is almost certainly due to its very good link profile, and surely not to geotargeting meta tags.
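For illustration, the classic geo metas look something like this (hypothetical values, not necessarily Sensacine's exact markup):

```html
<!-- Classic geotargeting metas; values here are hypothetical -->
<meta name="geo.region" content="ES-MD" />
<meta name="geo.placename" content="Madrid" />
<meta name="geo.position" content="40.4168;-3.7038" />
<meta name="ICBM" content="40.4168, -3.7038" />
```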
These kinds of geotargeting metas are not taken into consideration by Google, only by Yahoo! and Bing (and even for them, they are not the most relevant geotargeting signals).
-
I can't quite figure out your specific question, but I do agree that one news page with dynamic, location-based links will make the site more challenging to rank. Obviously, engines are accustomed to sites serving multiple languages and countries. The extra subdirectory is of no real concern to the engines... they "rank pages, not sites," right?
Of course, managing various news pages could be a technical headache, which is probably what prompted this change. That said, I think your approach is going to be easier for search engines (those dynamic links, ouch!). Sadly, we have to balance SEO with business needs...
Related Questions
-
Is there a way to "protect" yourself from non-local traffic?
I'll start with the story, but the main question is at the bottom, so feel free to scroll down. :-) I've got good news and bad news regarding a client of mine. It's a service-area business that only serves one metropolitan area. We've got a great blog with really valuable content that truly helps people while firmly establishing my client's industry expertise. As a result, local traffic has spiked and the company generates more leads. So that's the good news. The bad (bad-ish?) news is that the client also gets tons of traffic from outside the service area. Not only that, people are calling them all the time who either live in a different state and don't realize the company isn't local to them, or who are located out of state and are calling for free advice. On one hand, the client gets a kick out of it and thinks it's funny. On the other hand, it's annoying, and they're having to train all their intake people to ask for callers' locations before chatting with them. Some things we're doing to combat this problem:
1. The title tag on our home page specifies the metro area where we're active.
2. Our blog articles frequently include lines like, "Here in [name of our city], we usually take this approach."
3. There are references to our location all over the site.
4. We've got an actual location page with our address; for that matter, the address is listed in the footer on every page.
5. The listed phone number does not begin with 800; rather, it uses the local area code.
6. All of our local business listings, including our Google My Business listing, are up to date.
7. We recently published a "Cities We Serve" area of the site with highly customized/individualized local landing pages for 12 actual municipalities in our metro region. This will take some time to cook, but hopefully it will help. "Cities We Serve" is not a primary navigation item, but the local landing pages are situated as "About Us > Cities We Serve > [individual city page]".
**Anyway, here's my main question:** In light of all this, is there any other way to somehow shield my client from all this irrelevant traffic and protect them from time-wasting phone calls?
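For context, items 4-6 could also be reinforced with LocalBusiness structured data on the location page; a minimal sketch with placeholder values (name, address, and phone are hypothetical):

```html
<!-- Minimal LocalBusiness markup; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Service Co.",
  "telephone": "+1-555-012-3456",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St.",
    "addressLocality": "Example City",
    "addressRegion": "TX",
    "postalCode": "75001"
  },
  "areaServed": "Example City metro area"
}
</script>
```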
Local Website Optimization | Greenery
-
"spammy structred data" search console message
Hey gang, I want to first say thank you to anybody who tries to help me with this. I'm not quite sure where to start. First, I got the message in Search Console for my locksmith website that it looks like I have some spammy structured data. I remembered that one landing page did have the star-rating shortcode on it and was displaying the stars. Well, I went and looked, and they were indeed no longer showing, so I simply deleted the shortcode. But I wanted to do a thorough check of my landing pages, one by one. Now, I have Project Supremacy on my WordPress site, which I stand by; it's a solid product, and I have been able to make my per-page schema look really good, zero errors. So I went through each page that had errors, fixed them, and sent it all back to Google for reconsideration. BUT today (sorry, this is getting long) I look in my Search Console and I see that ALL of my blog posts have errors on them, something wrong with the hentry. When I test one of the posts in the Structured Data Testing Tool I see 4 errors and 4 warnings, such as the author not displaying, which is not true, and some other things. But I have never tried to add schema to any of my blog posts, and there is ZERO site-wide schema; I already checked. Where is this bad schema living, and could that be the reason for the spammy stuff? Thank you crew!!!
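From what I can tell so far, hentry flags like these usually come from hAtom microformat classes that many WordPress themes print on every post, independent of any schema plugin. A hypothetical sketch of the kind of theme markup involved (the tester wants entry-title, author, and updated inside anything classed hentry):

```html
<!-- Typical WordPress theme output; the hentry class alone triggers hAtom parsing -->
<article class="post hentry">
  <h2 class="entry-title">Post title</h2>
  <!-- If the theme omits these two, the tester reports "missing: author / updated" -->
  <span class="author vcard"><a class="fn" href="/author/jane/">Jane</a></span>
  <time class="updated" datetime="2016-01-15">January 15, 2016</time>
  <div class="entry-content">Post body...</div>
</article>
```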
Local Website Optimization | Meier
-
Question About Local SEO
Hey all, if a business operates in one city but works with associated organizations across multiple regions, how would this impact a local SEO campaign? For example, a transportation company located in Texas services the Northwest and New England by outsourcing to smaller transportation companies in each of those regions. Would it be wise to create pages on their website for each region they service, and then break that down further into specific cities? Also, would it be worth targeting local search terms even though the specific cities are serviced by the associated organizations and not the parent company itself? Thanks in advance, Andrew
Local Website Optimization | mostcg
-
Is there an SEO benefit to using tags in WordPress for my blog posts?
We have locations across the US and are trying to develop content so that we rank well for specific keywords on a local level, for instance "long tail keyword search in state" or "long tail keyword search near 76244". The goal is to develop content pages via blog posts that rank for those keywords. We are using Yoast and will be optimizing each post with that tool. My questions are:
1. Are there any benefits to adding a long list of tags to each post?
2. If yes, do I need to limit the number of tags?
3. Do we need to block indexing of those tag and category archives (via Yoast) to avoid duplicate content issues? (See the sketch below.)
Any insight on the best way to optimize these blog posts with the use of tags or other avenues would be greatly appreciated.
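On question 3, the common approach is a robots meta tag on tag and category archives (Yoast has a setting for this); the rendered output looks roughly like this (a sketch, not necessarily Yoast's exact output):

```html
<!-- Rendered in the head of tag/category archive pages set to noindex -->
<meta name="robots" content="noindex, follow" />
```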
Local Website Optimization | Smart_Start
-
Need Help - Google has picked up an overseas company with the same name and put it in search on the right
Hi all, Google has picked up a competitor's logo from overseas (same name) and is showing it with the Wikipedia excerpt on the right-hand side of search. What the heck can I do to get this removed? It's a serious legal/brand issue. See this URL: http://www.google.com.au/webhp?nord=1&gws_rd=cr&ei=GcMeVuS0CMq-0gSR7Lm4BA#nord=1&q=cfcu Hope someone can help!! Cheers, Dave
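For context, the usual first steps are the feedback link on the knowledge panel itself, plus strengthening your own brand's entity signals, for example with Organization markup on your homepage. A minimal sketch (the domain and file names are placeholders):

```html
<!-- Organization markup to reinforce the brand/logo association; placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "CFCU",
  "url": "https://www.example-cfcu.com.au/",
  "logo": "https://www.example-cfcu.com.au/images/logo.png"
}
</script>
```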
Local Website Optimization | CFCU
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration-service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank; some service over 100 cities. Most franchises also have PPC campaigns, and as part of our strategy we incorporate the location-reach data from AdWords to focus on their high-reach locations first. We have 'power pages', which include five high-reach branch preferences (areas the owners prefer to target) and five non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute local citations for our branch offices, and we make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call-tracking services to monitor calls, callers' locations, etc., and we are testing CallRail to start monitoring the landing pages and keywords that generate our leads.

Parts that I want to change: some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (four of them) manipulate them so that they aren't duplicate pages, but they only add about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains, but this does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen, then you will not get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking ranking every day for about a week to determine whether the experiment was a success.

What I need:
- An internal duplicate-content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content.
- Unique content for 'power pages'. I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up.
- Blog content for non-'power' locations.
- A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing-page analysis, domain authority, etc.
- A deeper dig into call metrics and their sources.

Now I am at a roadblock, because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable; we'd either noindex these or canonicalize them, and both work against testing ranking for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | MilestoneSEO_LA
-
Expert Advice Needed: Single Domain vs Multiple Domains for 2 Different Countries?
Hi Mozzers, we are looking for some advice on whether to have a single TLD (.com) or two separate domains (.ca and .com). Our website will have different products and pricing for US users (.com) and Canadian users (.ca). Since we are targeting different countries and user groups with each domain, we are not concerned about duplicate content. So, does it make more sense to have a single domain to compound our content marketing efforts? Or will it be more beneficial to have separate domains for the geotargeting benefits on Google.ca and Google.com? Looking forward to some great suggestions.
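Worth noting: with the single-domain option, the two country versions can still be distinguished with hreflang annotations between country folders; a sketch assuming hypothetical /us/ and /ca/ paths:

```html
<!-- Placed on both country versions; the paths are hypothetical -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```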
Local Website Optimization | ScorePromotions
-
Local franchise SEO strategy: what could be the best practice?
Hello, what could be the best practice for SEO and website optimization for a franchise company?
Business model: let's say a company (company.com) situated in one country has stores in different cities (more than 2 stores in some) and provides N services depending on each store's location. Physical addresses are available for some stores, and new stores will be launched in the future, but the SEO and website pages are needed for those locations now as well. If I choose subfolders to give each store a URL, this is how it would look:
Country-level pages: company.com, company.com/service1/ ... company.com/serviceN/
City-level pages: company.com/city1/, company.com/city1/location1/, company.com/city1/location2/, company.com/city2/, company.com/city3/
Q1) If I make each service page specific to a store location, e.g. company.com/city1/service1/, it will create a duplicate content issue, because the content of company.com/city1/service1/ and company.com/service1/ will be about 60% the same, except for the **meta title, description, and contact details in the footer.** So the question arises: should I give company.com/city1/service1/ a canonical pointing to the country-level main service page, i.e. company.com/service1/, since it is very hard to write unique content for the same service pages? (See the sketch below.)
Q2) Or do I need to rework my complete website design and SEO strategy?
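A sketch of the canonical option from Q1, using the hypothetical URLs above (keep in mind that a canonicalized city page would then generally drop out of the index, which works against ranking it locally):

```html
<!-- On company.com/city1/service1/, pointing at the country-level service page -->
<link rel="canonical" href="https://company.com/service1/" />
```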
Local Website Optimization | Technians