Client has 3 websites for various locations & duplicate content is a big issue... Is my solution the best?
-
Hi guys,
I have a client who has 3 websites all for different locations in the same state in Australia.
Obviously this is not best practice, but in our meeting he said that each area is quite particular about where they do business: people from one area want to do business with a website from that particular area.
He has 3 domains, so we have duplicate content issues. We are handling these at the moment with the canonical tag; however, they are redesigning the site soon.
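For reference, the interim canonical setup looks roughly like this (all domain names below are placeholders, not the client's actual sites):

```html
<!-- In the <head> of the duplicate page on a secondary location domain,
     pointing search engines at the primary version of the same content.
     Domain names are placeholders. -->
<link rel="canonical" href="https://www.primarysite.com.au/services/" />
```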
My suggestion is that we use 1 domain, with subdomains for the other 2 areas. That way, people from each area will still see that the company is from their area.
Also this way we have 1 domain to optimise and build domain authority for.
Has anyone else come across this, and is my solution the best approach?
Thanks!
Jon
-
Hi Sanket,
The poster is not talking about international content, but three locations within the same state.
If you quote from another source, it's helpful to let the community know where the post came from so they can find more information (it's also a courtesy to the blog author). For reference, the first main paragraph comes from Dr. Pete's article: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
As for spinning content, I would not advise spun content.
-
Hi,
There are two options for local SEO here:
If you want to create pages targeting a geographic location, you have to post unique content for every region. If you aren't willing to make that investment, then don't create the pages; they'll probably backfire. This site is a good example for your problem: http://bandelal.com/
-
I second this. I have dealt with clients who have done the same thing. Have one domain (i.e. companyname.com) and use the footer to display the contact info for each location. Try not to promote any one office more than the others and there shouldn't be an issue. In my experience, visitors will not care.
Do really awesome local SEO and set up places pages for each location, etc.
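If you go the single-domain route, each location page can also carry LocalBusiness structured data to reinforce those local signals. A minimal sketch, with placeholder business details that would need to be swapped for the client's real name, address, and phone:

```html
<!-- Placeholder NAP data for illustration only. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co — Sydney",
  "url": "https://www.example.com.au/sydney/",
  "telephone": "+61-2-5550-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  }
}
</script>
```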
-
How would this client feel about a single website with a top banner that proclaimed:
Stores in Sydney, Canberra and Newcastle!
I am willing to bet one month's pay that a single site will be more successful than three separate sites.
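If the client does consolidate onto one site, the two retired location domains can 301-redirect into the matching location sections of the main site, preserving any links they've earned. A rough Apache sketch, assuming mod_rewrite is available (all domain names and paths are placeholders):

```apache
# In the vhost or .htaccess of a retired location domain (placeholder names).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldlocation\.com\.au$ [NC]
RewriteRule ^(.*)$ https://www.example.com.au/newcastle/$1 [R=301,L]
```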