Geotargeting issue
-
Hi,
So I've just started working on a travel website and noticed that the .com site outranks the .com.au in Australian SERPs, even though the .com.au site has been geotargeted (in GWT) for Australia. I also geotargeted the .com site to Canada (the primary place of business).
Is this advisable? Will this affect rankings?
-
Just because the site has been reindexed doesn't guarantee that the rankings will change that quickly. You should watch it and see. One thing to remember: make sure you are using a non-personalized search when checking, or run the rankings in Moz.
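If it helps, here is a minimal sketch of building a non-personalized, Australia-targeted search URL for manual rank checks. The keyword is a hypothetical example, and pws, gl, and hl are standard Google query parameters — worth verifying before relying on them.

```python
# Minimal sketch: build a non-personalized, Australia-targeted Google search URL
# for manually checking which domain ranks. Assumptions: pws=0 turns off
# personalized results, gl=au requests Australian results, hl=en sets English.
from urllib.parse import urlencode

def australian_serp_url(keyword: str) -> str:
    params = {"q": keyword, "pws": "0", "gl": "au", "hl": "en"}
    return "https://www.google.com.au/search?" + urlencode(params)

# Hypothetical keyword for illustration only.
print(australian_serp_url("australia travel packages"))
```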
I would watch it over a few weeks and see if you see movement.
-
So a follow-up on the .com outranking the .com.au for our company: both geotargeting parameters have been set, .com (Canada) and .com.au (Australia). However, nothing has changed in the rankings. I also checked GWT, and both sites have definitely been crawled since.
Any advice on what else to do?
-
I would be surprised if it took more than a few days. Given your site has been up and ranked, it won't take long to reindex.
-
Thanks for the response, Robert.
Any idea how long it takes for Google to react to the changes?
-
Tourman
It is likely that setting the .com to a CA preference in GWT will affect the .com's ranking in AU; that is even more likely if the .com is not hosted on a CA server. Since you want the .com to rank primarily in CA and you have the .com.au site, and assuming you have covered your bases for any other countries you want to rank in, you are going in the correct direction.
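Not something raised above, but a common on-page complement to the GWT country setting is hreflang/alternate annotation, which tells Google the .com and .com.au pages are country-specific versions of each other. A minimal sketch follows; the URLs are hypothetical placeholders, not the poster's actual domains.

```python
# Minimal sketch (hypothetical URLs): emit hreflang link tags so Google treats
# the .com and .com.au pages as country-specific alternates of each other.
ALTERNATES = {
    "en-ca": "https://www.example.com/",      # Canadian audience on the .com
    "en-au": "https://www.example.com.au/",   # Australian audience on the .com.au
}

def hreflang_tags(alternates: dict) -> str:
    lines = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    ]
    # x-default points searchers from other regions at a sensible fallback.
    lines.append(
        f'<link rel="alternate" hreflang="x-default" href="{alternates["en-ca"]}" />'
    )
    return "\n".join(lines)

print(hreflang_tags(ALTERNATES))  # the full set of tags goes in the <head> of both pages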