What is the difference between the two rewrite rules in .htaccess?
-
Force www. prefix in URLs and redirect non-www to www:

RewriteCond %{HTTP_HOST} !^www.domain.com.ph
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

Force www. prefix in URLs and redirect non-www to www - 2nd option:

RewriteCond %{HTTP_HOST} ^domain.com.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]
-
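As an aside, a slightly tightened sketch of the first rule (not from the thread): the unescaped dots in both patterns match any character, so escaping them makes the host check strict, and adding [NC] catches mixed-case hosts; HTTPS handling is omitted here.

```apache
# Sketch only: same www-forcing redirect with the regex dots escaped
# (an unescaped "." in the original patterns matches any character).
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domain\.com\.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]
```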
Thanks Peter!
-
Yes, the result is identical.
Peter
-
So in theory they are identical?
Just different ways of achieving the same result
-
Hi, as mentioned in the other answer I gave here: http://moz.com/community/q/where-is-the-rule-here-that-force-www-in-urls#reply_202351
the first checks that the www is absent (the !^www condition matches only when the host being tested does not start with www), while the second checks for a host that starts with just the bare domain.
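The distinction can be sketched with Python's re module (the helper names and test hosts below are illustrative, not from the thread): the two conditions agree on the bare and www hosts discussed here, but they diverge on other hosts, e.g. a subdomain, which only the negated first pattern catches.

```python
import re

# Illustrative models of the two RewriteCond patterns from the question.
WWW_PATTERN = re.compile(r"^www\.domain\.com\.ph")             # pattern 1 (negated in the rule)
BARE_PATTERN = re.compile(r"^domain\.com\.ph", re.IGNORECASE)  # pattern 2, with [NC]

def rule1_redirects(host: str) -> bool:
    """First rule: redirect whenever the host does NOT start with www.domain.com.ph."""
    return WWW_PATTERN.match(host) is None

def rule2_redirects(host: str) -> bool:
    """Second rule: redirect whenever the host starts with domain.com.ph (case-insensitive)."""
    return BARE_PATTERN.match(host) is not None

# The two rules agree on the hosts discussed in the thread...
print(rule1_redirects("domain.com.ph"), rule2_redirects("domain.com.ph"))          # True True
print(rule1_redirects("www.domain.com.ph"), rule2_redirects("www.domain.com.ph"))  # False False
# ...but diverge on other hosts, e.g. a subdomain, which only rule 1 redirects:
print(rule1_redirects("sub.domain.com.ph"), rule2_redirects("sub.domain.com.ph"))  # True False
```

So "identical" holds for the non-www and www hosts; the first rule is broader because it fires for anything that is not exactly the www host.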
Peter