Duplicate content - multiple sites hosted on same server with same IP address
-
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use the rel=canonical tag on the websites, those tags will be duplicated too, as the websites are mirrored versions of the sites served at the IP address, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case?
Many thanks!
-
It's early for me and I haven't had enough coffee yet, so I may be misreading things. Why do you have three sites on the same IP address with three separate mirrored WWW versions?
As far as I can see, there's no reason you can't use rel=canonical to fix that issue. If the code is going to be the same because of how they are mirrored, then you'll have one site canonical'd to the right version and the right one canonical'd to itself, which shouldn't cause any issues. I can't guarantee that it will work in every instance, though. Canonicals are a suggestion, not a directive, so the bots will try to respect your tag, but if they feel it is incorrect, improper, deceptive, etc., they can always choose not to.
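If you can, a server-level 301 redirect is the cleaner fix on top of the canonical tag, because it removes the IP-based duplicates from circulation entirely instead of just hinting at the preferred version. Below is a minimal sketch, assuming Apache with mod_rewrite and that requests to the bare IP land on this site's document root; the hostname comes from the example in the question and the https scheme is an assumption, so adjust both to your setup:

```
# Hypothetical .htaccess for the site canonically served at www.domainname.com.
# Any request arriving with a Host header other than the canonical hostname
# (for example the bare IP 23.34.45.99) gets a permanent redirect to the same
# path on the canonical hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domainname\.com$ [NC]
RewriteRule ^(.*)$ https://www.domainname.com/$1 [R=301,L]
```

Whether or not you add the redirect, the canonical tag on every copy of a page should point at the single www URL you want indexed, e.g. <link rel="canonical" href="https://www.domainname.com/product-page"> on both www.domainname.com/product-page and 23.34.45.99/product-page; the tags being identical across the mirrors is fine, since they all nominate the same URL. Repeat the same pattern for each of the three sites with its own domain.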
Related Questions
-
Duplicate content warning: Same page but different URLs?
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html; another example is http://www.yourdigitalfile.com/ versus http://yourdigitalfile.com. Nearly every page on the site seems to have another version at a different URL. Any ideas why the dev would do this? Also, the pages that have received the warnings are not redirected to the newer pages; you can go to either one. Thanks very much.
White Hat / Black Hat SEO
-
Will including a global-site link in all 100 local sites' footers be considered spammy?
If I am a car manufacturer brand site (global), and I request that all my location-specific domains include a link to the global site in their footers, would this trigger a red flag for Google? There are roughly 100 location-specific sites, but I would like to come up with a long-term solution, so this number could be larger in the future. Is it best practice to only follow the footer link on each location-specific site's homepage and nofollow the rest of the footer links on each site? Is it best to only include one followed link to the manufacturer brand (global) site on each location-specific domain? Is it best to not put this global link in the footer, but rather towards the top of the page, and only on the homepage?
White Hat / Black Hat SEO
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content keeps changing automatically as new coupons are added and expired deals are removed. I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static and hence can be included in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap for getting expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO
-
Keyword Duplication in the title
Hello, I read in this great SEO Blueprint article that you don't want to duplicate any words in the title tag, not even once. But what if your branding and keywords both contain the same word? For example, making the title like this: NLP Training and Certification Center | NLP and Coaching Institute, which is 66 characters, by the way. Your thoughts on the duplicate word "NLP"?
White Hat / Black Hat SEO
-
Is this duplicated content?
I have an e-commerce website and a separate blog hosted on different domains. I post an article on the blog weekly, copy the first paragraph of the article (sometimes only part of it, when it's too long) to my home page and a sub-catalog page, and then append the anchor text "...more", which links to the article. 1. Is that digest (the first paragraph) on my e-commerce site deemed duplicated content by Google? Any suggestions? 2. If I move the blog under the e-commerce website in the future, would that make any difference with regard to this issue? Thanks for your help!
White Hat / Black Hat SEO
-
HELP - Site architecture of E-Commerce Mega Menu - Linkjuice flow
Hi everyone, I hope you have a couple of minutes to give me your opinion. The ecommerce site has around 2000 products, in English and Spanish, and only around 70 hits per day, if that. We have done a lot of optimisation on the site - page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts. Once someone arrives on site they are language-detected and 302 redirected to either domain.com/EN or domain.com/ES depending on their preferred language. Then on the homepage we have the big MEGA MENU, structured like:
CAT 1
SubCat 1
SubsubCat 1
SubsubCat 2
SubsubCat 3
Overall, there are 145 "categories", plus links to some CMS pages, like Home, Delivery terms, etc. Each main category contains the products of everything related to that category - so for example:
KITCHENWARE
COOKWARE, BAKINGWARE
SAUCEPANS, BOWLS, FRYING PANS
Kitchenware contains ALL PRODUCTS OF THE SUBCATS BELOW - so cookware items, saucepans, frying pans, bakingware, etc. - plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above. So once the bots hit the site, this is the structure they immediately have to deal with. Here is what the stats look like:
Domain Authority: 18
www.domain.com/EN/ - PA: 27, mR: 3.99, mT: 4.90
www.domain.com/EN/CAT 1 - PA: 15, mR: 3.05, mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1 - PA: 15, mR: 3.05, mT: 4.54
Product pages themselves have a PA of 1 and no mR or mT. I really need some other opinions here. I am thinking of:
1. Removing links in the nav menu so it only contains CAT1 and SUBCAT1 but deletes the SUBSUBCATS, which represent around 80 links.
2. Removing products within the CAT1 page - e.g., CAT 1 would "tile" graphical links to subcategories but not display products themselves, so products are only available right at the lowest part of the chain (which will be shortened).
But I am willing to hear any other ideas please - maybe another alternative is to start building links to boost DA and link juice? Thanks all, Ben
White Hat / Black Hat SEO
-
How do I report a set of duplicated websites designed to manipulate the SERPs?
Ok, so within one of my client's sectors it has become clear that someone is trying to manipulate the SERPs by registering tons of domains that are all keyword-targeted. All of the websites are simply duplications of one another and are merely set up to dominate the SERP listings, which, at the moment, they are beginning to do. None of the sites have any real authority (in some cases a PA and DA of 1) and yet they're ranking above much more established websites. The only backlinks they have are dodgy-looking forum ones. It's all a bit crazy and it shouldn't be happening. Anyway, all of the domains have been registered by the same person, within a two-month period of each other. What do you guys think is the best step to take to report these particular websites to Google?
White Hat / Black Hat SEO
-
How to recover my site from a -50 penalty
One of my sites was hit after Google confirmed its Panda 3.2 update. The site ranked very well for many heavy-traffic keywords in my niche, but all of a sudden 80% of the keywords which previously ranked high dropped 50 places in the SERPs. I know it is a -50 penalty, but I do not know how to recover from it. The link building campaign is almost the same as before and all of the articles are unique. BTW, I have two image ads in the sidebar and 7 affiliate links at the bottom of the page. Any input will be greatly appreciated!
White Hat / Black Hat SEO