Best practice for duplicate website content: same root domain name but different extension
-
Hi there
I have a new client who has two websites:
http://www.bayofislandsteambuilding.co.nz
http://www.bayofislandsteambuilding.org.nz
They are the same in every regard apart from the domain extension (.co.nz and .org.nz), which is likely causing them issues with Google rankings given the huge amount of duplicate content.
What is the best-practice approach to fixing this? Normally, if I were starting from scratch, I would set one of the domains as an alias that redirects to the main domain.
Thanks in advance.
Laurie
-
Two URLs serving the same content will be treated as duplicate content. Do the redirection at the domain level, so that every URL on the secondary domain 301-redirects to the corresponding URL on the preferred domain (or to its home page if there is no equivalent).
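To make the domain-level redirect concrete, here is a minimal sketch of the behaviour as a tiny Python (Flask) app, purely for illustration: in practice this would normally be a one-line rewrite rule in Apache or nginx, or a redirect option in the hosting control panel. The preferred domain below is simply the one from the question.

```python
# Minimal sketch only: answer every request that arrives on the secondary
# (.org.nz) domain with a permanent (301) redirect to the preferred domain.
from flask import Flask, redirect, request

app = Flask(__name__)

PREFERRED = "http://www.bayofislandsteambuilding.co.nz"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def redirect_to_preferred(path):
    # Preserve the requested path and query string so each page maps to its
    # equivalent on the preferred domain rather than everything landing on
    # the home page.
    query = request.query_string.decode("utf-8")
    target = f"{PREFERRED}/{path}" + (f"?{query}" if query else "")
    return redirect(target, code=301)
```

Because the rule is applied once for the whole host, no individual page has to be redirected by hand.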
-
Hi - thanks for the response.
Do I still need to set up 301 redirects for every page, or can I do it server-side for the entire domain? And is this approach better than using the rel="canonical" link element in the <head> of each page?
Thanks
L
-
301-redirect one to the other. That's it, that's all.
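For completeness, since rel="canonical" was raised above: it would only be worth considering if the duplicate domain had to keep serving pages for some reason. Below is a hedged sketch, again using a small Flask app purely as an example (in a real CMS the canonical tag usually comes from a template setting or an SEO plugin); a 301 remains the stronger and simpler signal for this case.

```python
# Sketch only: keep serving the page on the secondary domain, but point search
# engines at the preferred domain's URL as the canonical version of the content.
from flask import Flask, make_response

app = Flask(__name__)

PREFERRED = "http://www.bayofislandsteambuilding.co.nz"

@app.route("/<path:path>")
def serve_with_canonical(path):
    canonical_url = f"{PREFERRED}/{path}"
    # The usual place for the hint is a <link rel="canonical"> element in the
    # page <head>; it can also be sent as an HTTP "Link" header, which is the
    # only option for non-HTML resources such as PDFs.
    body = (
        "<html><head>"
        f'<link rel="canonical" href="{canonical_url}">'
        "</head><body>Page content</body></html>"
    )
    resp = make_response(body)
    resp.headers["Link"] = f'<{canonical_url}>; rel="canonical"'
    return resp
```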
Related Questions
-
Duplicate Content Question With New Domain
Hey Everyone, I hope your day is going well. I have a question regarding duplicate content. Let's say that we have Website A and Website B. Website A is a directory for multiple stores & brands. Website B is a new domain that will satisfy the delivery niche for these multiple stores & brands (where they can click on a "Delivery" anchor on Website A and it'll redirect them to Website B). We want Website B to rank organically when someone types in "<brand> delivery" in Google. Website B has NOT been created yet. The Issue: Website B has to be a separate domain from Website A (no getting around this). Website B will also pull all of the content from Website A (menus, reviews, about, etc.). Will we face any duplicate content issues on either Website A or Website B in the future? Should we rel=canonical to the main website even though we want Website B to rank organically?
Intermediate & Advanced SEO | | imjonny0 -
SEO best practices for embedding content in a map
My company is working on creating destination guides for families exploring where to go on their next vacation. We've been creating and promoting content on our blog for quite some time in preparation for the map-based discovery. The UX people in my company are pushing for design/functionality similar to:
Intermediate & Advanced SEO | | Vacatia_SEO
http://sf.eater.com/maps/the-38-essential-san-francisco-restaurants-january-2015 From a user perspective, we all love this, but I'm the SEO guy and I'm having a hard time figuring out the best way to guide my team regarding getting readers to the actual blog article from the left content area. The way they want to do it is to have the content displayed overtop the map when someone clicks on a pin. Great, but there's no way for me to optimize the map for every article. After all, if we have an article about best places to snorkel on Maui, I want Google to direct people to the blog article specific to that search term because that page is the authority on that subject. Additionally, the map page itself will have no original content because it will be pulling all the blog content from other URLS, which will get no visitors if people read on the map. We also want people, when they find an article they like, to be able to copy a URL to share. If the article is housed on the map page, the URL will be ugly and long (not SEO friendly) based on parameters from the filters the visitor used to drill down to that article. So I don't think I can simply optimize the map filtered-URL. Can I? The others on my team do not want visitors to ping pong back and forth between map and article and would prefer people stay on the discovery map. We did have a thought that we'd give people an option to click a link to read the article off the map but I doubt people will do it which means that page will never been visited, thus crushing it's page rank. so questions: How can i pass link juice/SEO love from the map page to the actual blog article while keeping the user on the map? Does google pass that juice if you use Iframes? What about doing ajax calls? Anyone have experience doing this? Am I making a mountain out of a molehill? Should I trust that if I create good content, good UX and allow people to explore how they prefer, Google will give me the love? Help me Rand Fishkin, you're my only hope!1 -
Duplicate Content... Really?
Hi all, My site is www.actronics.eu. Moz reports virtually every product page as duplicate content, flagged as HIGH PRIORITY! I know why: Moz classes a page as duplicate if >95% of the content/code is similar. There's very little I can do about this, as although our products are different, the content is very similar, differing only by a few part numbers and the vehicle make/model. Here's an example:
http://www.actronics.eu/en/shop/audi-a4-8d-b5-1994-2000-abs-ecu-en/bosch-5-3
http://www.actronics.eu/en/shop/bmw-3-series-e36-1990-1998-abs-ecu-en/ate-34-51
Now, multiply this by ~2,000 products x 7 different languages and you'll see we have a big dupe content issue (according to Moz's Crawl Diagnostics report). I say "according to Moz..." as I do not know if this is actually an issue for Google; 90% of our product pages rank, albeit some much better than others. So what is the solution? We're not trying to deceive Google in any way, so it would seem unfair to be hit with a dupe content penalty; this is a legit dilemma where our products differ by as little as a part number. One ugly solution would be to remove the header/sidebar/footer on our product pages, as I've demonstrated here - http://woodberry.me.uk/test-page2-minimal-v2.html - since this removes A LOT of page bloat (code) and would bring the page difference down to 80% duplicate. (This is the tool I'm using for checking: http://www.webconfs.com/similar-page-checker.php) Other "prettier" solutions would be greatly appreciated. I look forward to hearing your thoughts. Thanks,
Woody 🙂
Intermediate & Advanced SEO | seowoody
-
Penalized for Duplicate Page Content?
I have some high-priority notices regarding duplicate page content on my website, www.3000doorhangers.com. Most of the pages listed here are on our sample pages: http://www.3000doorhangers.com/home/door-hanger-pricing/door-hanger-design-samples/ On the left side of our page you can go through the different categories. Most of the category pages have similar text; we mainly just changed the industry on each page. Is this something that Google would penalize us for? Should I go through all the pages and use completely unique text for each page? Any suggestions would be helpful. Thanks! Andrea
Intermediate & Advanced SEO | | JimDirectMailCoach0 -
Duplicate Content and Titles
Hi Mozzers, I saw a considerable amount of duplicate content and duplicate page titles on our client's website. We are just implementing a fix in the CMS to make sure these are all fixed. What changes do you think I could see in terms of rankings?
Intermediate & Advanced SEO | | KarlBantleman0 -
Best practices for a local business move
My client is a chiropractic office that will be moving to the next town over in about 3 months. What are people's best practices for accomplishing this SEO-wise, so as not to lose too much in terms of rankings, current organic traffic, and citation listings?
Intermediate & Advanced SEO | | easystreetint0 -
Duplicate Content in News Section
Our client's site is in the hunting niche. According to Webmaster Tools there are over 32,000 indexed pages. In the news section there are 300-400 news posts where, over the course of about 5 years, they manually copied relevant press releases from different state natural resources websites (e.g. http://gfp.sd.gov/news/default.aspx). This content is relevant to the site's visitors, but it is not unique. We have since begun posting unique news posts, but I am wondering if anything should be done with these old news posts that aren't unique. Should I use the rel="canonical" tag or the noindex tag for each of these pages? Or do you have another suggestion?
Intermediate & Advanced SEO | | rise10 -
What is the best way to allow content to be used on other sites for syndication without risking duplicate content filters?
Cookstr appears to be syndicating content to shape.com and mensfitness.com: a) They integrate their data into partner sites with an attribution back to their site, skinned with the partner's look. b) They link the image back to their image hosted on Cookstr. c) The page does not have microformats or as much data as their own page does, so their own page is better for SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance! Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta Their syndicated content pages:
http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
Intermediate & Advanced SEO | irvingw