How to get traffic from a particular geographical region?
-
Our company is based in India and has a website on a .in domain; however, our target customers are in North America and Australia.
The problem is that as much as 70% of our organic traffic comes from India.
This Indian traffic is of little use to us. Possibly because of the ".in" domain, Google is geo-targeting us toward local Indian search.
How can we reverse this situation? In other words, we are looking for more traffic from across the globe, excluding India.
Any suggestions?
P.S. Changing the domain from .in to .com is not an option, as it has been part of our brand advertising for the last 7 years.
-
Add countries or suburbs to your keywords, for example "keyword 1 in Australia" or "keyword 1 in Melbourne". This lets you target people who actually search for these services from the places you want to reach, e.g. "seo services in Melbourne" or "seo services Australia".
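If it helps to operationalize that advice, here is a quick sketch of expanding a seed keyword list with geo-modifiers. The keywords and locations below are placeholders, not real campaign data:

```python
# Sketch: expand seed keywords against target locations to build a
# geo-modified keyword list, as suggested above.
from itertools import product

seed_keywords = ["seo services", "link building"]      # placeholder seeds
target_locations = ["Australia", "Melbourne", "Sydney"]  # placeholder geos

def geo_keywords(keywords, locations):
    """Return every 'keyword in location' and 'keyword location' variant."""
    variants = []
    for kw, loc in product(keywords, locations):
        variants.append(f"{kw} in {loc}")
        variants.append(f"{kw} {loc}")
    return variants

for phrase in geo_keywords(seed_keywords, target_locations):
    print(phrase)
```

The output feeds straight into keyword research or page-title planning for the regions you actually want to rank in.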
Related Questions
-
Dynamic referenced canonical pages based on IP region and link equity question
Hi all, My website uses relative URLs, with PHP that reads a user's IP address and updates the page's referenced canonical tag to a region-specific absolute URL for ranking / search results.

E.g. www.example.com/category/product is the relative URL referenced for internal links / external link building. If a US IP address hits this link, the URL is the same, but the canonical in the source is updated to reference www.example.com/us/category/product, so all ranking considerations are pointed to that page instead.

None of these region-specific pages are actually used internally within the site. This was done so external links / blog content would fit a user no matter where they were coming from.

I'm assuming this is a problem for passing link equity with Googlebot, because it splits the strength between different absolute canonical pages depending on which IP Google crawls from (the relative URL dynamically alters the canonical reference, and the canonical is what ranks in the SERPs).

Any assistance or information, no matter how small, would be invaluable. Thanks!
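For readers following along, the setup described boils down to something like this. The original is PHP; this is a Python sketch, and the region-to-prefix mapping is an invented illustration, not the poster's code:

```python
# Sketch of the IP-dependent canonical logic described above.
BASE = "https://www.example.com"

# Hypothetical region -> URL-prefix mapping; any unlisted region falls
# through to the bare (global) URL.
REGION_PREFIX = {"US": "/us", "AU": "/au", "GB": "/uk"}

def canonical_for(path, visitor_region):
    """Return the canonical URL emitted for a visitor from the given region."""
    prefix = REGION_PREFIX.get(visitor_region, "")
    return f"{BASE}{prefix}{path}"

# The same relative URL yields a different canonical depending on the
# crawling IP -- which is exactly why link equity gets split:
print(canonical_for("/category/product", "US"))  # /us/ variant
print(canonical_for("/category/product", "AU"))  # /au/ variant
```

Googlebot crawling mostly from US IPs would see only one of these canonicals, while equity pointed at the others is effectively stranded.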
Intermediate & Advanced SEO | MattBassos
-
Need Help - Lost 75% Of Traffic Since May 2018
Sorry to go in-depth here, but I want to give all available information. We went live in late April 2018 with our two websites in Shopify (moved from Magento; same admin, different storeviews, which we later found to cause some issues). Both of these websites sell close to the same products (we purchased a competitor about 5 years ago, which is why we have two). The nice thing is that they do almost identical amounts in sales. They have done very well for years, especially the last two.

Then the core algo update around May 22nd-24th, 2018 wiped out about 65% of our Google traffic for one website (MySupplementStore.com), and this latest update wiped out another 20%. I couldn't figure out why this would have happened, because we were very cautious about keeping things separate, unique descriptions etc. So I did some digging, and this is what I found:

- The reviews we migrated over from Magento were somehow combined and added to both websites. This is something I didn't notice. I had it resolved a month ago, so each site's reviews are now only on that site.
- Our blog section was duplicated across both websites during the migration. Again, something I didn't notice, as we have over 1,000 blog posts per site. This was resolved two weeks ago.
- Digging further, I found that over the last 6 months, a person working for us (for 3 years) had been writing descriptions and pasting them on both websites, instead of making them unique to each site. I trusted her for years, but I think she just got lazy. She quit about a month before the migration as well. We are currently working on this, but it's been taking a while because we have over 5,000 products on each site and no idea which ones are duplicates.

I also noticed:

- The site is very slow in site-speed tools. Working on that this week.
- When I search for snippets of our text, it often shows up in omitted results.
- No messages in Google Webmaster Tools.

So the question is: do you think the duplicate content issues caused the drop? Our other site is Best Price Nutrition, which didn't see a big drop at all during that update. If not, any other ideas why?
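For the "5,000 products and no idea which ones are duplicates" part, a rough cross-site comparison could narrow it down. A sketch, assuming both sites can be exported to CSV with Shopify-style `Handle` and `Body (HTML)` columns (adjust filenames and column names to your actual exports):

```python
# Sketch: flag products whose descriptions appear verbatim (after
# whitespace/case normalization) on both sites' CSV exports.
import csv

def load_descriptions(path):
    """Map normalized description text -> list of product handles."""
    by_text = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = " ".join(row["Body (HTML)"].split()).lower()
            if text:
                by_text.setdefault(text, []).append(row["Handle"])
    return by_text

def cross_site_duplicates(site_a_csv, site_b_csv):
    """Return (handles_on_a, handles_on_b) pairs sharing identical text."""
    a = load_descriptions(site_a_csv)
    b = load_descriptions(site_b_csv)
    return [(a[t], b[t]) for t in a.keys() & b.keys()]
```

Exact-match comparison only catches verbatim copies; near-duplicates would need fuzzier matching, but this gets a first worklist quickly.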
Intermediate & Advanced SEO | vetofunk
-
iPad Sales & Traffic Improvement for my Ecommerce site
Do you guys know any tool or software which provides the following for my ecommerce site? Real-time / next-day data for iPad visitors, covering:

- iPad traffic and the URLs visited
- Page rendering load time for each URL separately
- Network load time for each URL separately
- DOM processing time for each URL separately
- Request queuing load time for each URL separately
- Web application load time for each URL separately
- Total load time for each URL
- Timestamp, i.e. the time each URL was accessed by the visitor
- Visitor city and country code
- Visitor duration on each page
- Visitor user agent name (e.g. Chrome, IE, Safari, Firefox)
- Visitor user agent OS (e.g. iPad only)
- User agent version (e.g. iPad 8.0, iPad 6.0, iPad Air, iPad Retina, iPad Mini, etc.)
- A waterfall-style session trace per URL: backend time, DOM processing, page load, waiting on AJAX, visitor interactions, etc.
- Total requests for each page
- JavaScript errors on the page, with the script URL plus the stack trace of each error
- AJAX errors on the page, with the AJAX URL plus the stack trace of each error
- Each request's timing per URL in a waterfall layout
- Funnel visualization tracking
- Transaction tracking

Please note that all the above data also needs day-wise, country-wise, previous-day and previous-month, and model-wise sorting, pagination, etc. Waiting for your reply. Regards, Mit
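Many of the per-URL timings requested above map onto the browser's Navigation Timing timestamps. As a sketch of the arithmetic only (the timestamp values below are invented; in practice they would be collected client-side, e.g. from `performance.timing`, and shipped to whatever analytics backend you choose):

```python
# Sketch: derive the load-time breakdown asked for above from Navigation
# Timing-style timestamps (milliseconds since epoch).
def timing_breakdown(t):
    """Split a page view into network, DOM-processing, and total load time."""
    return {
        "network_ms": t["responseEnd"] - t["requestStart"],       # request + response
        "dom_processing_ms": t["domComplete"] - t["domLoading"],  # parse + render
        "total_ms": t["loadEventEnd"] - t["navigationStart"],     # full page load
    }

sample = {  # invented values for illustration
    "navigationStart": 1000,
    "requestStart": 1050,
    "responseEnd": 1400,
    "domLoading": 1410,
    "domComplete": 2100,
    "loadEventEnd": 2150,
}
print(timing_breakdown(sample))
# {'network_ms': 350, 'dom_processing_ms': 690, 'total_ms': 1150}
```

Segmenting these numbers by device (iPad only), URL, city, and country is then a filtering/grouping problem on whatever store holds the beacons.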
Intermediate & Advanced SEO | mit
-
Getting Rid Of Spammy 301 Links From An Old Site
A relatively new site I'm working on has been hit really hard by Panda, due to over-optimization of external links 301-redirected from an old site, where the anchor text includes exact keyword phrases.

Prior to the Panda update, all of these 301 redirects worked like a charm, but now the 301s from the old URL are killing the new site, because all the hypertext links include exact keyword matches.

A couple of weeks ago, I took the old site completely down and removed the htaccess file, removing the 301s and in effect breaking all of these bad links. Consequently, if you type the old URL, you're directed to the domain registrar, not redirected to the new site. My hope is to eliminate most of the bad links, which are mostly on spammy sites that aren't worth being linked from, and that these links will eventually disappear from Google.

My concern is that this might not work, because Google won't re-index these links; once they're indexed, they'll be there forever. That fear leads me to conclude I should hedge my bets and just disavow these sites using the disavow tool in WMT. IMO, the disavow tool is an action of last resort, because I don't want to call attention to myself, since this site doesn't have a manual penalty inflicted on it.

Any opinions or advice would be greatly appreciated.
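If you do end up disavowing, the file format itself is plain text: one full URL or `domain:` line per line, with `#` for comments. A small sketch that builds such a file from a list of domains (the domains here are placeholders standing in for a real link-audit result):

```python
# Sketch: write a Google disavow file from a list of domains judged spammy.
def build_disavow(domains, note="Spammy links via old-site 301s"):
    """Return disavow-file text: a comment line, then deduped domain: lines."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

# Placeholder domains; duplicates are collapsed automatically.
spammy = ["spam-directory.example", "link-farm.example", "spam-directory.example"]
print(build_disavow(spammy), end="")
```

`domain:` lines disavow every link from a host, which suits the "whole site is spammy" case here better than listing individual URLs.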
Intermediate & Advanced SEO | alrockn
-
What are your best moves if you want to get your traffic and rankings back for a specific keyword?
Hi all, We have been a server and website monitoring company for over 13 years, and I dare say our product has evolved and matured over the years. Our marketing, not so much. Most of our best-converting traffic came from the keyword "ping test" via our ping test tool page, and for the first 10 years we were positioned 1-3 in Google.com, so all was good. For the last two years we held steady at positions 8-9, and since 7-30-13 we have been on the second page. We launched a blog in 2009 at http://www.websitepulse.com/blog and post 2-3 times a week, and we are working on a new website now. My question is: what is your advice in our situation? Aside from providing fresh content and launching a new website, is there anything specific we could do at this stage to improve our position for "ping test"? Thanks, Lily
Intermediate & Advanced SEO | wspwsp
-
Targeting multiple local geographic areas
I have a local client in the automotive service sector. Actually, I deal with several local, service-oriented businesses, so I am hoping to apply the knowledge gained from this question to several cases. And I am not concerned with Google Places or AdWords in this case; I have those dialed in just fine. I am referring to a methodology applicable to organic search results.

It has been simple enough to target one or two local geographic regions (e.g. cities/towns) for specific key phrases related to my client's industry by adding geo-modifiers to the mix. But I need to develop and apply a method to target multiple outlying towns (up to 12 within a 30-mile radius) near my client's place of business, without generating pages of duplicate content (e.g. automotive service town-1, automotive service town-2, automotive service town-3, etc.).

Would someone more experienced in this area be willing to shed some light on my dilemma? Thanks!
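To make the duplicate-content risk concrete: if each town page is the same template with only the town name swapped, the word-level overlap between any two pages stays near 100%, and only genuinely town-specific copy pushes it down. A sketch with invented template text and town names:

```python
# Sketch: measure how similar two templated town pages are. If only the
# town name differs, Jaccard similarity stays high (the duplicate-content
# trap); adding unique local copy lowers it.
def jaccard(a, b):
    """Word-set Jaccard similarity between two page texts (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

TEMPLATE = ("Trusted automotive service in {town}. Our mechanics handle "
            "brakes, transmissions, and diagnostics for drivers in {town}.")

page1 = TEMPLATE.format(town="Springfield")
page2 = TEMPLATE.format(town="Shelbyville")
print(round(jaccard(page1, page2), 2))  # high: only the town name differs

page2_unique = page2 + " Ask about our shuttle from the Shelbyville rail depot."
print(round(jaccard(page1, page2_unique), 2))  # lower after unique local copy
```

The practical takeaway matches the usual advice: each town page needs real local substance (landmarks, testimonials, service specifics), not just a swapped geo-modifier.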
Intermediate & Advanced SEO | SCW
-
How many articles should I write per day & how many backlinks should I get per day to look natural?
Hey, I'm working on a review blog. Every 1 or 2 weeks I post up to 6 articles at once; is that unnatural for SEO? How many articles should I post on the blog per day? Another question: how many backlinks should I build to a single post? I'm using Magic Submitter software to help, but I don't get more than 50 backlinks at a time. What's a realistic number of backlinks to build, and over how much time, to look 100% natural to Google? Any helpful info about backlink techniques is worth hearing. Thanks.
Intermediate & Advanced SEO | akitmane
-
How do you get a news article / post to show up in Google Trends for your keyword?
Does anyone know how Google selects the news articles it displays in Google Trends? E.g.: http://www.google.com/trends?q=glitch+hop%2C+dubstep&ctab=0&geo=all&date=all&sort=0 See how dubstep has a couple of posts that show up when searched in Google Trends? These are different from regular SERPs as far as I can tell. How does Google select them?
Intermediate & Advanced SEO | adriandg