Duplicated Site Issues?
-
We are launching a new site for the Australian market and the URL will just be siteAU.com.
Currently the tech team (who set it up before we came on board) has it running with almost exactly the same content (including the site CSS, nav, structure, etc.). Some product page content is slightly different, category pages have different product orders, and there are location pages specific to AU, but otherwise it's the same.
The original site, site.ca, has been around for 6+ years, with several thousand pages and solid organic rankings (though these have dropped over the last few months).
Will the new AU site create issues for the original domain? We also have siteUSA.com which follows the same logic and has been live for a while.
-
If you want to rank competitively in the US and AU, then I strongly recommend local domains.
In my experience, subfolders on a single domain are less effective than local domains. The above solution can work, and you have the advantage that your CA site is established and is the "parent" site.
It is more work to maintain separate domains, but in my experience of managing .com sites, we have always faced a challenge: despite the target setting in WMT, Google still struggles in many instances to rank the UK site in the UK over the US site, because it is on a .com and because there is a US version of the site.
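Whichever structure you choose, hreflang annotations can tell Google which country each version of a page targets and tie the equivalent pages together across the three domains. A minimal sketch using the domains from the question — the page path here is an invented placeholder:

```html
<!-- Placed in the <head> of the Canadian page; the same full set of tags
     (including the self-reference) must also appear on the AU and US pages,
     or Google ignores the annotations for lack of return tags. -->
<link rel="alternate" hreflang="en-ca" href="https://site.ca/plumbing-service/" />
<link rel="alternate" hreflang="en-au" href="https://siteAU.com/plumbing-service/" />
<link rel="alternate" hreflang="en-us" href="https://siteUSA.com/plumbing-service/" />
<link rel="alternate" hreflang="x-default" href="https://site.ca/plumbing-service/" />
```

This doesn't replace geotargeting in WMT, but it gives Google an explicit map of which URL to show in which country, which helps when the content is near-identical.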
-
Could work well. We would have some additional content challenges, but everything would still be consolidated. For example, on the same domain we would now have thousands more pages for the city-specific areas, but we would have those anyway on separate domains. My question, though, is whether we would face any additional local ranking challenges. Within Canada we may already rank for 'plumbing service', for example. If everything is consolidated, we wouldn't rank twice for the same search term, but on different domains we could, in the respective countries. Local modifiers like 'City Plumbing Service' could still work.
-
To me this seems like the optimal solution. With this setup you not only eliminate the possible duplicate content issues, but you also keep information for each region on the site.
Or you could use your country specific domains as landing pages, and optimize them for their home country while still hosting the majority of your content on your main domain.
-
Thanks for the input. One alternative we are pondering is keeping everything on the primary domain, with sections for the countries we serve. So the primary market is Canada, and we also serve AU and the USA. There may be some branding issues, along with technical challenges with our order processing, but it would probably allow us to better control content.
Thoughts on this?
-
If you set the geotargeting in Webmaster Tools to AU for the new site, and the others to their corresponding markets, then you are unlikely to face any duplicate content penalties.
Additional measures you need to consider are:
Focus the majority of link building and PR on the respective markets, i.e. try to get a higher ratio of links from local sites.
Put local physical addresses on each site, e.g. on the AU site have an address in Australia.
Generally speaking, you are still going to have some issues with your non-AU sites outranking your AU site on Google AU, as sites in the UK do, but if you take the measures above you are sending a strong signal to the search engines that you are not trying to spam them.
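On the local-address point, you can also make the Australian address machine-readable with LocalBusiness structured data on the AU site. A sketch with placeholder values — the business name, street address, and phone number below are invented examples, not real details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Site AU",
  "url": "https://siteAU.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  },
  "telephone": "+61 2 0000 0000"
}
</script>
```

A visible address in the footer plus this markup reinforces the AU targeting signal on every page it appears on.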