Client has 3 websites for various locations, and duplicate content is a big issue... Is my solution the best?
-
Hi guys,
I have a client who has 3 websites, all for different locations in the same state in Australia.
Obviously this is not best practice, but in our meeting he explained that each area is quite particular about where it does business: people from one area want to deal with a company whose website is clearly from their area.
He has 3 domains, and we have duplicate content issues. We are handling these at the moment with canonical tags; however, they are redesigning the site soon.
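(For context, by canonical tags I just mean a link element in the head of each duplicate page pointing at the version we want indexed. A rough sketch, with made-up URLs standing in for the real domains:)

```html
<!-- On the duplicate page of site 2, in the <head> -->
<!-- Points search engines at the preferred version of this content on site 1 -->
<link rel="canonical" href="https://www.example-site-one.com.au/services/" />
```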
My suggestion is that we move to 1 domain, with subdomains for the other 2 areas. That way, people from each area will still see that the company serves their area.
It also means we have only 1 domain to optimise and build domain authority for.
Has anyone else come across this, and is my solution the best approach?
Thanks!
Jon
-
Hi Sanket,
The poster is not talking about international content, but three locations within the same state.
If you quote from another source, it's helpful to let the community know where the post came from so they can find more information (and it's a courtesy to the blog author as well). The first main paragraph comes from Dr. Pete's article: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
As for spinning content, I would not advise it.
-
Hi,
There are two solutions for local SEO:
If you want to create pages targeting each geographic location, then you have to post unique content for every geographic region. If you aren't willing to make that investment, then don't create the pages; they'll probably backfire. This site is a good example for your problem: http://bandelal.com/
-
I second this. I have dealt with clients that have done the same thing. Have one domain (i.e. companyname.com) and use the footer to display the contact info for each location. Try not to promote any one office more than the others, and there shouldn't be an issue. In my experience, visitors will not care.
Do really awesome local SEO and set up Places pages for each location, etc.
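A related step (not the same thing as the Places listings, so take it as an optional extra) is to mark up each location's page with LocalBusiness structured data so the address and service area are explicit to search engines. A rough sketch with placeholder details:

```html
<!-- On one location's page; each location page gets its own block with its own address and phone -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company - Sydney",
  "url": "https://www.example-company.com.au/sydney/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "addressCountry": "AU"
  },
  "telephone": "+61 2 0000 0000"
}
</script>
```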
-
How would this client feel about a single website with a top banner that proclaimed:
Stores in Sydney, Canberra and Newcastle!
I am willing to bet one month's pay that a single site will be more successful than three separate sites.
Related Questions
-
New SEO manager needs help! Currently only about 15% of our live sitemap (~4 million URL e-commerce site) is actually indexed in Google. What are best practices for sitemaps on big sites with a lot of changing content?
In Google Search Console, 4,218,017 URLs are submitted but only 402,035 URLs are indexed. What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content?
Technical SEO | Hamish_TM
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our website shows we have lots of duplicate content. The website is real estate based, and we load IDX listings from other brokerages into our site. Even though these listings look alike, they are not: each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. They are primarily lots for sale, and it looks like lazy agents with 4 or 5 lots input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load, and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our website, it shows 40 of them are duplicates.
Technical SEO | TIM_DOTCOM
-
Https Duplicate Content
My previous host used shared SSL, so my site also worked over HTTPS, which I didn't notice at the time. Now I have moved to a new server where I don't have any SSL, and my website no longer works over HTTPS. The problem is that I have found Google has indexed one of my blogs, http://www.codefear.com, with the HTTPS version too. My blog traffic is continuously dropping, I think due to this duplicate content: there are now two results, one with the HTTP version and another with the HTTPS version. I searched the internet and found 3 possible solutions:
1. Noindex the HTTPS version
2. Use rel=canonical
3. Redirect the HTTPS version with a 301 redirect
Now I don't know which solution is best for me, as the HTTPS version is no longer working. One more thing: I don't know how to implement any of these solutions. My blog is running on WordPress. Please help me overcome this problem. After solving this duplicate issue, do I need to file a reconsideration request with Google? Thank you
Technical SEO | RaviAhuja
-
How to protect against duplicate content?
I just discovered that my company's 'dev website' (which mirrors our actual website and is where we stage new content before pushing it to the live site) is being indexed by Google. My first thought is that I should add rel=canonical tags pointing to the actual website, so that Google knows this duplicate content from the dev site is to be ignored. Is that the right move? Are there other things I should do? Thanks!
Technical SEO | williammarlow
-
Duplicate Content For Trailing Slashes?
I have several websites in campaigns, and I consistently get flagged for duplicate content and duplicate page titles for the domain and domain/ versions of the sites, even though they are properly redirected. How can I fix this?
Technical SEO | RyanKelly
-
Tags causing Duplicate page content?
I was looking through the 'Duplicate Page Content' and 'Too Many On-Page Links' errors, and they all seem to be linked to the 'Tags' on my blog pages. Is this really a problem, and if so, how should I be using tags properly to get the best SEO rewards?
Technical SEO | zapprabbit
-
Noindex duplicate content penalty?
We know that Google can now penalise a whole site if it finds content it doesn't like or considers duplicate, but has anyone experienced a penalty from duplicate content on their site that they have added noindex to? Would Google still count it against the overall quality of the site, even though it has been told to basically ignore the duplicate part? The reason I ask is that I am looking to add a forum to one of my websites, and no one likes an empty new forum. I have a script which can populate it with thousands of questions and answers pulled directly from Yahoo Answers. Obviously the forum will be 100% duplicate content, but I do not want it to rank anyway, so if I noindex the forum pages, hopefully it will not damage the rest of the site. In time, as the forum grows, all the duplicate posts will be deleted, but it's hard to get people to use an empty forum, so I need to 'trick' them into thinking the section is busy.
Technical SEO | Grumpy_Carl