Multiple domains with the same content?
-
I have multiple websites with the same content, such as
http://www.example.org and so on. My primary URL is http://www.infoniagara.com, and I also placed a 301 on the .org.
Is that enough to keep my example.org site from being indexed by Google and other search engines?
The example.org site also has lots of links to my old HTML pages (now removed). Should I change those links too, or will the 301 redirect solve all such issues (page not found/crawl errors) with my old web pages?
I would welcome good SEO practices for maintaining multiple domains.
thanks and regards
-
You want your redirect rules on the server, not the client side. In Apache you can do this with mod_rewrite and the .htaccess file like so.
To add the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
To remove the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
IIS has a rewrite module too. I've not used it myself, but this should help: http://www.petermoss.com/post/How-to-redirect-non-www-domain-to-www-domain-requests-in-IIS-7.aspx
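For IIS 7, the equivalent rule goes in web.config via the URL Rewrite module. I haven't run this exact snippet myself; it's a sketch that assumes the URL Rewrite module is installed, with example.com standing in for your domain:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Redirect non-www requests to the www host with a 301 -->
        <rule name="Add www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

`redirectType="Permanent"` is what makes IIS send the 301 status code rather than a temporary 302.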
-
Anyway, I am lucky to be working with a great group of teammates.
-
I would recommend either posting a new question asking how an IIS redirect works, or checking Google. I lack experience working in that environment. I was spoiled by working with a very talented team who performed all those changes, so I never needed to learn the details of IIS.
-
Hi,
Thanks. As you said, I looked into server-side redirection. My site's server is IIS 7. I tried to set up a server-side redirect but couldn't. I found a JavaScript redirect and set one up. See the page: http://www.infoniagara.com/d-bed-roses.html
I think this too is not a correct redirect, is it?
thanks
-
Glad to be of help. You are always free to reach out here at the SEOmoz Q&A. If you feel a need to reach me specifically my contact information is in my user profile.
-
Oh, thanks so much Ryan. Let me learn about server-side redirection and the other aspects related to it.
Thanks once again for your consideration and time. I hope I can approach you in the future too.
best regards
-
The JavaScript code you shared is not a proper redirect.
A proper redirect happens on the server. Instead of loading the original page, the server instantly serves the destination page along with a 301 header response code, which tells search engines the content has moved to a new URL.
If you use JavaScript in the manner you shared, the original page loads with a 200 "all OK" header code, and then three seconds later the JavaScript triggers and loads the new page, also with a 200 header code. All the backlinks will still be credited to the original page, not the destination page.
The exact method of performing a redirect varies based on your server setup. If you have a LAMP server with cPanel, there is a Redirect tool you can use.
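To make the header behavior concrete, here is a small, self-contained Python sketch (standard library only; the server and paths are made up for illustration). It serves a 301 at one URL and shows that a client following it ends up at the new URL, just as a browser or crawler would:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Toy server: /old-page answers with a 301 to /new-page; /new-page answers 200."""

    def do_GET(self):
        if self.path == "/old-page":
            # A proper redirect: the 301 status plus a Location header.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            body = b"content lives here now"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically, like a search engine crawler would.
resp = urllib.request.urlopen("http://127.0.0.1:%d/old-page" % server.server_port)
final_status = resp.status
final_url = resp.url
server.shutdown()
print(final_status, final_url)
```

The client never "sees" the old page's content; it lands directly on the new URL, which is why the 301 consolidates link equity where a JavaScript hop does not.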
-
hi,
It's really helpful, and now I understand it.
I know that you are one of the true masters of SEO, and I think you can clarify one more doubt. It is also about redirection. I want to know whether a script I use for redirecting old pages to new pages is right or not.
The script is working, and I want to know what type of redirect it is and whether it is a proper redirect that passes backlink juice. (I add this script to the body just after the header.)
thanks in advance
-
A 301 redirect is the proper solution and superior to the canonical tag. It is fine to keep the canonical too, but add the redirect.
When you ask "should I do it for each page," understand that a single redirect rule can forward all non-www traffic on your site to its www equivalent. If you are unsure how to perform the redirect, simply ask your host. Most sites are on managed hosting, and it is a very common and easy request.
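For reference, the canonical tag mentioned above goes in the page's head section; a minimal sketch, with example.com as a placeholder for your own www URL:

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="http://www.example.com/" />
</head>
```

Unlike a 301, the canonical is only a hint to search engines; visitors who load the non-www URL still stay on it, which is why the redirect remains the primary fix.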
-
hi dear Ryan,
Thanks so much for your time and valuable suggestions. As you said, I am aware of this problem, and that is why I added a canonical URL to the homepage. Should I make a 301 from the non-www to the www URL? Should I do it for each page?
thanks & regards
-
In short, you should not use duplicate content across multiple domains. Doing so will likely hurt rankings for both sites. Try a search that would naturally return one of the duplicated pages: you will likely find one page ranks well while the other ranks significantly lower due to the duplication.
I checked your .org site, and its pages are properly 301 redirected to the .com site. This change will cause any valid pages listed for the .org site to disappear from Google's index. It may take a month from the date the 301 was implemented for Google to crawl and update the entire site.
One point I would add: perform a site:www.infoniagara.org search on Google. Notice you still have a lot of results for the .org site. Those pages redirect to the .com site, but the destination URLs return 404 errors. If the pages are really gone and there is no equivalent, that is fine, and those results should drop out of Google's index over time. If there are similar pages on your site, you should 301 redirect the old URLs to them.
Another issue: your .com site appears in both the www and non-www form. If you take a URL and remove the "www", the page loads normally under the non-www URL. This needs to be fixed, as it is splitting your backlink juice. Pick one version of your URL, www or non-www, and 301 redirect the other version to it.
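As a sketch of the per-page fixups described above, individual removed pages can be forwarded to their closest live equivalents with Apache's Redirect directive in .htaccess (the file names here are hypothetical, for illustration only):

```apache
# Hypothetical paths: forward each removed page to its nearest equivalent
Redirect 301 /old-attractions.html http://www.infoniagara.com/attractions.html
Redirect 301 /old-hotels.html http://www.infoniagara.com/hotels.html
```

One line per retired URL is enough; the site-wide www/non-www rule and these page-level rules can coexist in the same .htaccess file.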