Can multiple domains compete with one another if they have the same/similar content?
-
Could an ecommerce site with both a .co.nz and a .com.au domain compete with itself and hurt organic rankings if the content of the pages is the same? All the URLs would be identical (apart from .co.nz or .com.au), and the product descriptions, page titles, etc. would all be identical or similar (our page titles are ever so slightly different). Could this be hurting us?
Thanks in advance, Paul
-
Thanks for the advice.
-
If you are using the appropriate rel="alternate" hreflang="x" annotations, you shouldn't have any problem. For more info you can read here: https://support.google.com/webmasters/answer/189077?hl=en
Most likely, as the pages are the same, they won't show up at the same time; Google will serve the one it considers appropriate for the user. Hence the use of hreflang: it guides Google as to which version it should serve to each user.
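For illustration, the hreflang annotations for a page that exists on both domains might look like this in the <head> of each version (example.co.nz and example.com.au are placeholder domains, not the asker's actual site):

```html
<!-- Placed on BOTH the .co.nz and .com.au versions of the page (placeholder domains). -->
<link rel="alternate" hreflang="en-nz" href="http://www.example.co.nz/product-page/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/product-page/" />
```

The same annotations can alternatively be supplied via the XML sitemap or HTTP headers, as described in the Google help page linked above.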
Related Questions
-
Do search engines see copy/keywords when it appears only at the bottom of a page?
My client is looking to improve their SEO, and to date I've written meta data and made some initial recommendations. Thing is, on some of their pages, the body copy appears at the bottom of the page, past links and big, splashy images. My question is, will search engines even see that copy to crawl it for keywords? Thanks!
Web Design | MarcieHill
-
Problems preventing Wordpress attachment pages from being indexed and from being seen as duplicate content.
Hi. According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content... or is it the Yoast sitemap causing it? I see 2 options in Yoast SEO: (1) Redirect attachment URLs to parent post URL. (2) Media... Meta Robots: noindex, follow. I set it to (1) initially, which didn't resolve the problem. Then I set it to option (2) so that the attachment pages won't be indexed but search engines would still associate those images with their relevant posts and pages. I understand what both options (1) and (2) mean, but because I chose option (2), will that mean all of the images on the website won't stand a chance of being indexed in search engines, Google Images, etc.? As far as duplicate content goes, search engines can get confused when there are 2 ways for them to reach the same page content. When, e.g., Google makes the wrong choice, a portion of traffic drops off (is lost, hence errors), which leaves the searcher frustrated, and this affects the SEO and ranking of the site, which worsens with time. My goal here is: I would like all of the web images to be indexed by Google, and for all of the image attachment pages to not be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap URL which Yoast creates); that sitemap URL has been submitted to the search engines already, and I will resubmit once I can resolve the attachment page issues. Please can you advise. Thanks.
Web Design | SEOguy
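For context, Yoast's "Meta Robots: noindex, follow" setting (option 2 above) effectively puts a robots meta tag like this on each attachment page (a sketch of the typical output, not necessarily the plugin's exact markup):

```html
<!-- Approximate <head> output on a noindexed attachment page. -->
<meta name="robots" content="noindex, follow" />
```

Note that this noindexes the attachment *page*, not the image file itself; images embedded in indexable posts and pages can generally still be picked up by Google Images from those posts.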
Can Image File Names be Masked?
If we "mask" file names for our website but they keep their original names on the server, will Google notice this? The client wants to mask them in order to name them with keywords, but without changing them on the actual server.
Web Design | Atlanta-SMO
-
How Can I Make My Site iPhone Friendly?
I have been looking into making my website iPhone friendly, as my analytics are not great for the iPhone, and I know that when I try to navigate around it on an iPhone it can be tough. I was told that if I make changes to the layout, it would affect my layout across everything, which I did not want to do. So I have two questions: Is this correct regarding the layout? If so, if you did something like m.waikoloavacationrentals.com as the mobile version, how would that possibly affect your rankings with regards to the traffic distribution? Any feedback would be appreciated. Also, if anyone has any experience in doing this, I would be interested in discussing further.
Web Design | RobDalton
-
Duplicate Content? Designing new site, but all content got indexed on developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design and testing. Several thousand products were loaded into the sandbox site. Then Google/Bing crawled and indexed the site (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now? When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue? How can the developer fix this?
Web Design | trafficmotion
-
Does doubling up on domains increase my ranking?
I own two websites. One is older and contains the bulk of my content. The other is a web-based tool that has less written content, but is equally important to my business. For the sake of examples we'll call the older website "oldsite.com" and new website "newsite.com". Would it benefit my old site to direct traffic to the web-based tool using a domain like "newsite.oldsite.com"? What's the best way to integrate the two sites so I am not splitting my traffic?
Web Design | Travis-W
-
URL parameters causing duplicate content errors
My ISP implemented product reviews. In doing so, each page has a possible parameter string of ?wr=1. I am now receiving duplicate page content and duplicate page title errors for all my product URLs. The report shows the base URL and the base URL?wr=1. My ISP says that the search engines won't have a problem with the parameters, and a check of Google Webmaster Tools for my site shows I don't have any errors and recommends against configuring URL parameters. How can I get SEOmoz to stop reporting these errors?
Web Design | NiftySon
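For reference, the standard way to consolidate parameterized duplicates like this is a rel="canonical" link pointing each ?wr=1 URL back to the base URL (a sketch with a placeholder domain, not the asker's actual site):

```html
<!-- In the <head> of both /product and /product?wr=1 (placeholder paths). -->
<link rel="canonical" href="http://www.example.com/product" />
```

With the canonical in place, crawlers (and Moz's crawler) treat the parameterized URL as a duplicate-by-design rather than an error.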
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC

Timeline:
At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s, for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have many H1s, I mean lots of H1s, compared to other notable sites that use HTML5 and only one H1 (some of these are the biggest sites on the web). Yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1-H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch: https://github.com/h5bp/html5-boilerplate/issues/1086. Not sure if this could be driving our indexing issues described below.

Situation:
Since the redesign, when we query an article title, Google lists the homepage, category page, or tag page that the article resides on. Most of the time the homepage ranks for the article query. Linking directly to the article pages from a relevant internal page does not help Google index the correct page. Linking to an article from an external site does not help Google index the correct page either.

Here are some images of example query results for our article titles:
Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
Homepage and uncategorized page listed instead of article for an exact-match article query: http://imgur.com/MddcE
Article aged over 10 days indexing correctly (so yes, it is possible for Google to index our article pages): http://imgur.com/mZhmd

What we have done so far:
-Removed the H1 tag from the site-wide domain link
-Made the article title a link (how it was on the old version, so replicating that)
-Applying the Modernizr patch today to correct the blank caching issue

We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25 H1s) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands headers and subheaders, and that multiple H1s are okay, etc., but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks.
Web Design | mcluna
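For comparison, a conservative heading structure that sidesteps the multiple-H1 question entirely keeps a single H1 for the page and demotes nested article titles to H2 (an illustrative sketch, not the site's actual markup):

```html
<!-- Illustrative homepage sketch: one H1 for the page, H2s for nested articles. -->
<body>
  <header>
    <h1>Site or Page Title</h1>
  </header>
  <main>
    <article>
      <h2><a href="/article-one/">Article One Title</a></h2>
      <p>Article teaser text...</p>
    </article>
    <article>
      <h2><a href="/article-two/">Article Two Title</a></h2>
      <p>Article teaser text...</p>
    </article>
  </main>
</body>
```

This makes the most prominent heading signal on the homepage the site title rather than dozens of article titles, which may help search engines distinguish the homepage from the article pages it links to.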