What is the difference between the two rewrite rules in .htaccess?
-
Force www. prefix in URLs and redirect non-www to www:

RewriteCond %{HTTP_HOST} !^www.domain.com.ph
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

Force www. prefix in URLs and redirect non-www to www - 2nd option:

RewriteCond %{HTTP_HOST} ^domain.com.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]
-
Thanks Peter!
-
Yes, the result is identical.
Peter
-
So in theory they are identical?
Just different ways of achieving the same result
-
Hi, as mentioned in the other answer I gave here: http://moz.com/community/q/where-is-the-rule-here-that-force-www-in-urls#reply_202351
the first checks that the host does not start with www (the !^www negates the match at the start of the host being tested), while the second matches a host that starts with the bare domain. Note also that the second carries the [NC] flag, so it matches case-insensitively, whereas the first, as written, is case-sensitive.
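For reference, a cleaned-up sketch of the same redirect (assuming Apache mod_rewrite, with domain.com.ph standing in as a placeholder domain). In the original patterns the dots are unescaped, so in regex terms each one matches any character; escaping them makes the match exact, and adding [NC] to the first form gives it the same case-insensitive behaviour as the second:

```apache
RewriteEngine On
# Redirect any request whose host is not exactly www.domain.com.ph
# (case-insensitive). Dots are escaped so "." matches a literal dot
# rather than any character; the trailing $ anchors the end of the host.
RewriteCond %{HTTP_HOST} !^www\.domain\.com\.ph$ [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]
```

This single-condition form covers both cases discussed above: a bare domain.com.ph host fails the condition and is redirected, while www.domain.com.ph passes through untouched.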
Peter