Multiple domains with the same content?
-
I have multiple websites with the same content, such as http://www.example.org and so on. My primary URL is http://www.infoniagara.com, and I have also placed a 301 on the .org.
Is that enough to keep my example.org site from being indexed by Google and other search engines?
The example.org site also has lots of links to my old HTML pages (now removed). Should I change those links too, or will the 301 redirect solve all such issues (page not found / crawl errors) for my old web pages?
I would welcome good SEO practices for maintaining multiple domains.
Thanks and regards
-
You want your redirect rules on the server, not the client side. In Apache you can do this with mod_rewrite and the .htaccess file, like so.
To add the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

To remove the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

In IIS there is a rewrite module for this too. I've not used it myself, but this should help: http://www.petermoss.com/post/How-to-redirect-non-www-domain-to-www-domain-requests-in-IIS-7.aspx
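For the cross-domain case described in the question (a secondary .org that should hand everything to the primary .com), a host-to-host rule in the same style would look roughly like the sketch below. This is an illustrative .htaccess example only, assuming the .org domain is served by Apache; adjust the host names to your own setup.

RewriteEngine On
# Illustrative sketch: send any request for the .org host (with or
# without www) to the same path on the primary www.infoniagara.com site.
RewriteCond %{HTTP_HOST} ^(www\.)?infoniagara\.org$ [NC]
RewriteRule ^(.*)$ http://www.infoniagara.com/$1 [R=301,L]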
-
Anyway, I am lucky to be with a group of average teammates.
-
I would recommend either asking a new question in Q&A about how IIS redirects work, or checking Google. I lack experience working in that environment; I was spoiled by working with a very talented team who performed all those changes, so I never needed to learn any aspects of IIS.
-
Hi,
Thanks. As you suggested, I was looking at server-side redirection. My server is IIS 7. I tried to set up a server-side redirect but couldn't, so I found a JavaScript redirect and set that up instead. See the page: http://www.infoniagara.com/d-bed-roses.html
I think this is not a correct redirect either, is it?
Thanks
-
Glad to be of help. You are always free to reach out here at the SEOmoz Q&A. If you feel the need to reach me specifically, my contact information is in my user profile.
-
Oh, thanks so much, Ryan. Let me learn server-side redirection and the other aspects related to it.
Thanks once again for your consideration and time. I hope I can approach you in the future too.
Best regards
-
The JavaScript code you shared is not a proper redirect.
A proper redirect happens on the server. Instead of loading the original target page, the server immediately serves the new page along with a 301 header response code, which tells search engines the content has permanently moved to a new URL.
If you use JavaScript in the manner you shared, the original page loads with a 200 "all OK" header code, and then three seconds later the JavaScript triggers and loads the new page, also with a 200 header code. All the backlinks will still be credited to the original page and not the destination page.
The exact method of performing a redirect varies based on your server setup. If you have a LAMP server with cPanel, there is a Redirect tool you can use.
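If you were on a LAMP/cPanel setup, that tool essentially writes a one-line rule like the hypothetical example below into .htaccess; on IIS 7 the same idea is configured through the server's own redirect/rewrite features instead. The destination URL here is invented purely for illustration; point it at whichever page actually replaced the old one.

# Hypothetical sketch of a page-level 301 in .htaccess: the old path
# keeps its backlink value because the server answers with a 301 and
# the new location, instead of a 200 plus JavaScript.
Redirect 301 /d-bed-roses.html http://www.infoniagara.com/new-page.html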
-
Hi,
That's really helpful, and now I understand it.
I know you are one of the true masters of SEO, and I think you can clarify one more doubt. It is also about redirection: I want to know whether a script I use for redirecting old pages to new pages is right or not.
The script is working properly, but I want to know what type of redirection it is and whether it is a proper redirect that passes the backlink juice. (I add this script to the body, just after the header.)
Thanks in advance
-
A 301 redirect is the proper solution and superior to the canonical tag. It is fine to have the canonical too, but add the redirect.
When you ask "should I do it for each page", understand that a single redirect can forward all non-www traffic on your site to its www equivalent. If you are unsure how to perform the redirect, simply ask your host. Most sites are on managed hosting, and it is a very common and easy request.
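For reference, on an Apache server that single site-wide rule is typically just a small .htaccess block like the sketch below, shown with your domain purely for illustration; on your IIS 7 server your host would set up the equivalent rule in its rewrite configuration.

RewriteEngine On
# One rule covers every page: any request to the bare domain is
# 301-redirected to the same path on the www host.
RewriteCond %{HTTP_HOST} ^infoniagara\.com$ [NC]
RewriteRule ^(.*)$ http://www.infoniagara.com/$1 [R=301,L]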
-
Hi Ryan,
Thanks so much for your time and valuable suggestions. As you said, I am aware of this problem, which is why I added a canonical URL to the homepage. Should I make a 301 from the non-www to the www URL? Should I do it for each page?
Thanks & regards
-
In short, you should not use duplicate content across multiple domains. Doing so will likely hurt rankings for one site or the other. Try a search that would naturally return a duplicated web page: you will likely find one copy ranks well while the other ranks significantly lower due to the duplication.
I checked your .org site, and its pages are properly 301 redirected to the .com site. This change will cause any valid pages listed on the .org site to disappear from Google's index. It may take a month or so from the date the 301 was implemented for Google to crawl and update the entire site.
One point I would add: perform a Google site:http://www.infoniagara.org search. Notice you still have a lot of search results for the .org site. Those pages are properly redirected to the .com site, but they return 404 errors there. If the pages are really gone and there is no equivalent, that is fine, and these results should disappear from Google's index over time. If there are similar pages on your site, you should 301 redirect these old URLs to them.
Another issue: your .com site resolves in both the www and non-www forms. If you take a URL and remove the "www", the page loads normally under the non-www URL. This needs to be fixed, as it is splitting your backlink juice. Pick one version of your URL, www or non-www, and 301 the other version to it.