What is the difference between the two rewrite rules in .htaccess?
-
Force www. prefix in URLs and redirect non-www to www - 1st option

RewriteCond %{HTTP_HOST} !^www.domain.com.ph
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

Force www. prefix in URLs and redirect non-www to www - 2nd option

RewriteCond %{HTTP_HOST} ^domain.com.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]
Thanks Peter!
-
Yes, the result is identical.
Peter
-
So in theory they are identical?
Just different ways of achieving the same result
-
Hi, as mentioned in the other answer I gave here: http://moz.com/community/q/where-is-the-rule-here-that-force-www-in-urls#reply_202351
the first checks for non-inclusion of the www in the hostname (the !^www checks that www is not at the start of the host being tested, %{HTTP_HOST}), while the second checks for a hostname that starts with just the bare domain.
Peter
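For reference, a minimal sketch (not from the thread) of the same redirect with the dots escaped and the hostname anchored, so the condition only matches the exact example domain; the unescaped versions above still work in practice because an unescaped dot simply matches any character:

# Sketch only - same 301 as above, with the hostname pattern escaped and anchored.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domain\.com\.ph$ [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]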
Related Questions
-
"Google chose different canonical than user" Issue Can Anyone help?
Our site https://www.travelyaari.com/ , some page are showing this error ("Google chose different canonical than user") on google webmasters. status message "Excluded from search results". Affected on our route page urls mainly. https://www.travelyaari.com/popular-routes-listing Our canonical tags are fine, rel alternate tags are fine. Can anyone help us regarding why it is happening?
White Hat / Black Hat SEO | RobinJA -
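One way to reinforce the preferred URL, sketched below, is to also send rel=canonical in an HTTP Link header; this assumes Apache 2.4 (for the If block) and mod_headers, and it only restates the canonical already on the page rather than explaining why Google chose a different one:

# Sketch only (Apache 2.4 + mod_headers) - send the canonical as a Link header
# for the affected route-listing page in addition to the in-page tag.
<If "%{REQUEST_URI} == '/popular-routes-listing'">
    Header set Link "<https://www.travelyaari.com/popular-routes-listing>; rel=canonical"
</If>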
Duplicate content warning: same page but different URLs?
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html; another example is http://www.yourdigitalfile.com/ versus http://yourdigitalfile.com. Nearly every page on the site has another version at a different URL. Any ideas why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages - you can go to either one. Thanks very much.
White Hat / Black Hat SEO | ydf -
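This looks like the www/non-www duplication discussed in the thread above; a sketch of one fix, assuming the non-www host is the preferred one (reverse the condition and target if the www version should win):

# Sketch only - 301 every www request for this site to the non-www host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.yourdigitalfile\.com$ [NC]
RewriteRule (.*) http://yourdigitalfile.com/$1 [R=301,L]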
What is the difference between using the .htaccess file and httpd.conf when implementing thousands of 301 redirections?
What is the best solution in terms of website loading time or server load? Thanks in advance!
White Hat / Black Hat SEO | esiow2013 -
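A practical difference is that httpd.conf is parsed once when Apache starts, while .htaccess files are re-read and re-evaluated on every request, so thousands of redirect lines in .htaccess add per-request overhead. A common alternative, sketched below with an assumed map name and file path, is a RewriteMap in the server config (RewriteMap is not allowed in .htaccess):

# Sketch only - goes in httpd.conf or a vhost, not .htaccess.
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirect-map.txt"
RewriteCond ${redirects:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]

# redirect-map.txt holds one "old-path target-url" pair per line, e.g.:
#   /old-page-1    http://www.example.com/new-page-1

For very large maps, the text map can be converted to a dbm map with httxt2dbm for faster lookups.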
Are multiple domains spammy if they're similar but different?
A client currently has a domain of johnsmith.com (not the actual site name, of course). I'm considering splitting this site into multiple domains, each of which will include the brand name plus a keyword, such as:
Johnsmithlandclearing.com
Johnsmithdirtwork.com
Johnsmithdemolition.com
Johnsmithtimercompany.com
Johnsmithhydroseeding.com
johnsmithtreeservice.com
Each business is unique enough and will cross-link to the others. My questions are: 1) will Google consider the cross-linking spammy? 2) what happens to johnsmith.com - should it redirect to the new site with the largest market share, or should it become an umbrella for all? 3) any pitfalls foreseen? I've done a fair amount of due diligence and feel these separate domains are legit, but am paranoid that Google will not see it that way, or may change direction in the future.
White Hat / Black Hat SEO | SteveMauldin -
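On question 2, if the redirect route is chosen rather than the umbrella, a sketch of the domain-level 301 (the target is picked arbitrarily from the list above; in practice it would be whichever site has the largest market share):

# Sketch only, in johnsmith.com's .htaccess - 301 the old domain, preserving paths.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?johnsmith\.com$ [NC]
RewriteRule (.*) http://www.johnsmithlandclearing.com/$1 [R=301,L]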
Cross-Site Links with different Country Code Domains
I have a question about the Penguin update. I know Google is really cracking down on "spam" links, and that they want you to shift from linking keywords to the brand name unless it makes sense in a sentence. We have five sites for one company; in the header they have little flag images that link to the different country domains, and these domains all have essentially the same name apart from the country code. My question is: does linking these sites back and forth to each other in this way hurt you under Penguin? I know they want you to push your identity, but does this cross-site scheme hurt you? In the header of these sites we have something like this. I am assuming the best strategy would probably be to treat them like separate entities, or just focus on one domain. They also have some sites with links in the footer set up like "For product visit Domain.com". Should nofollows be added to these footer links as well? I am not sure if Penguin finds them spammy too.
White Hat / Black Hat SEO | AlliedComputer -
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation:
1. Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site?
2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community?
This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant -
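A sketch of the kind of referrer block described above (the blocked domain is a placeholder); note that it only affects requests that actually send that Referer header, and a crawler following the link typically does not, which is part of why it is unlikely to help:

# Sketch only - spammy-example.com is a placeholder for a referring domain to block.
RewriteEngine On
RewriteCond %{HTTP_REFERER} spammy-example\.com [NC]
RewriteRule .* - [F]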
What is the difference between advertising and a paid link?
I have been told that Google frowns on paid links, yet I see many sites charging for advertising where the advertising consists of an anchor text link. What is the difference between a paid link and this type of advertising?
White Hat / Black Hat SEO | casper434 -
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Hi All, in relation to this thread http://www.seomoz.org/q/what-happend-to-my-ranks-began-dec-22-detailed-info-inside I'm still getting whipped hard by Google; this week, for some reason, all rankings have gone for the past few days. What I was wondering is this: when Google asks "Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?", I assume my site hits the nail on the head. [removed links at request of author] As you can see, I target LG Optimus 3D Sim Free, LG Optimus 3D Contract and LG Optimus 3D Deals. Based on what Google has said, I now think there needs to be one page that covers it all instead of three. What I'm wondering is the best way to deal with the situation. I think it should be something like this, but please correct me along the way 🙂
1. Pick the strongest page out of the 3
2. Merge the content from the 2 weaker pages into the strongest
3. Update the title/meta info of the strongest page to include the keyword variations of all 3, e.g. LG Optimus 3D Contract Deals And Sim Free Pricing
4. Then scatter contract, deals and sim free throughout the text naturally
5. Then delete the weaker 2 pages and 301 redirect them to the strongest page
6. Submit URL removal via Webmaster Tools for the 2 weaker pages
What would you do to correct this situation? Am I on the right track?
White Hat / Black Hat SEO | mwoody
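For step 5, the redirects themselves are short; a sketch with hypothetical paths and domain (the real URLs were removed from the question), pointing the two weaker pages at the strongest one:

# Hypothetical paths and domain for illustration - 301 the two merged pages to the survivor.
Redirect 301 /lg-optimus-3d-contract http://www.example.com/lg-optimus-3d-sim-free
Redirect 301 /lg-optimus-3d-deals http://www.example.com/lg-optimus-3d-sim-free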