How to force the www. prefix in all URLs using .htaccess?
-
We're using an Apache Tomcat server. Thanks in advance!
-
Thanks Peter!
-
Hi
Please see my answer to this in your other Q&A forum question:
http://moz.com/community/q/where-is-the-rule-here-that-force-www-in-urls
Peter
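For reference, a minimal .htaccess sketch of the usual approach, assuming mod_rewrite is enabled and using www.example.com as a placeholder for your canonical host (not a name from this thread):

```apache
RewriteEngine On
# Match any request whose Host header is not the canonical www host...
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
# ...and 301-redirect it to the same path on the www host.
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

One caveat for this setup: .htaccess only takes effect for requests that actually pass through Apache (e.g. fronting Tomcat via mod_jk or mod_proxy). If requests hit Tomcat directly, the equivalent redirect would need to live on the Tomcat side instead.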
Related Questions
-
Removing duplicated content using only NOINDEX at large scale (80% of the website).
Hi everyone, I am taking care of a large news website (500k pages), which took a massive hit from Panda because of duplicated content (70% was syndicated content). I recommended that all syndicated content should be removed and that the website should focus on original, high-quality content. However, this was implemented only partially: all syndicated content is set to NOINDEX (they think it is good for users to see standard news plus original HQ content). Of course, it didn't help at all; no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider this site "cheating" and not worthy for the user. What do you think about this "theory"? What would you do? Thank you for your help!
White Hat / Black Hat SEO | Lukas_TheCurious
-
Best URL structure for SEO for a Malaysian/Singapore site on a .com.au domain
Hi there, I know ideally I need a .my or .sg domain; however, I don't have time to do this in the interim, so what would be the best way to host Malaysian content on a www.domainname.com.au website?
www.domainname.com.au/en-MY
www.domainname.com.au/MY
domainname.com.au/malaysia
malaysia.domainname.com.au
my.domainname.com.au
I'm assuming this can't make the .com.au site look spammy, but thought I'd ask just to be safe? Thanks in advance! 🙂
White Hat / Black Hat SEO | IsaCleanse
-
301 domain name URL variants for canonicalization question in htaccess?
#1
RewriteCond %{HTTP_HOST} ^xyz.com [NC]
RewriteRule ^(.*)$ http://www.xyz.com/$1 [L,R=301]

What I want to do here is to redirect URLs that have omitted the "www." prefix to the full "www.xyz.com" home page URL. That means the home page URL http://xyz.com will not resolve on its own, but instead will redirect to http://www.xyz.com (without trailing slash).

#2
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*(default|index).(html|php|htm)\ HTTP/ [NC]
RewriteRule ^(([^/]+/)*)(default|main|index).(html|php|htm)$ http://www.xyz.com/$1 [L,R=301]

What I want to do here is to ensure that any home page URL that includes several versions of explicit page name references, such as default.htm or index.html, will be redirected to the canonical home page URL, http://www.xyz.com (without trailing slash).

Are the rewrite rules correct? Thanks in advance!
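The rules above are broadly as intended, but two details are worth checking: the literal dots in patterns like `(default|index).(html|php|htm)` are unescaped (an unescaped `.` matches any character), and the RewriteCond lists `(default|index)` while the RewriteRule lists `(default|main|index)`. A corrected sketch, keeping the xyz.com host from the question:

```apache
RewriteEngine On

# 1) Redirect the bare host to the www host.
RewriteCond %{HTTP_HOST} ^xyz\.com$ [NC]
RewriteRule ^(.*)$ http://www.xyz.com/$1 [L,R=301]

# 2) Strip explicit index/default page names from any directory URL,
#    with literal dots escaped and the alternation lists kept in sync.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*(default|main|index)\.(html|php|htm)\ HTTP/ [NC]
RewriteRule ^(([^/]+/)*)(default|main|index)\.(html|php|htm)$ http://www.xyz.com/$1 [L,R=301]
```

This is a sketch, not a definitive review; test against both http://xyz.com/ and deep URLs like /dir/index.html before deploying.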
White Hat / Black Hat SEO | esiow2013
-
How to remove trailing slashes in URLs using .htaccess (Apache)?
I want my URLs to look like these:
http://www.domain.com/buy
http://www.domain.com/buy/shoes
http://www.domain.com/buy/shoes/red
Thanks in advance!
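A minimal sketch of the common mod_rewrite approach, using the www.domain.com host from the question; the directory check is important so that real directories (which need their trailing slash) are left alone:

```apache
RewriteEngine On
# Skip requests that resolve to an actual directory on disk.
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect any URL ending in a slash to the same URL without it,
# e.g. /buy/shoes/ -> /buy/shoes.
RewriteRule ^(.*)/$ http://www.domain.com/$1 [L,R=301]
```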
White Hat / Black Hat SEO | esiow2013
-
Suggestions on redirecting an old URL to a new URL with parentheses ()?
What should I use in .htaccess if I will redirect an old URL with parentheses to a new URL like the ones below?
RedirectMatch 301 http://www.olddomain.com/buy/nike-shoes/kobe(7)/red http://www.newdomain.com/buy/nike-shoes/kobe(7)/red
Or
RedirectMatch 301 http://www.olddomain.com/buy/nike-shoes/kobe(7)/red http://www.newdomain.com/buy/nike-shoes/kobe(7)/red
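Two points worth noting for this case: RedirectMatch treats its first argument as a regex matched against the URL path only (so the pattern cannot include the old hostname; the rule goes in the old domain's .htaccess), and literal parentheses must be backslash-escaped in the pattern. A hedged sketch using the URLs from the question:

```apache
# In the OLD domain's .htaccess. The pattern is a regex over the path,
# so the parentheses in kobe(7) must be escaped to match literally.
RedirectMatch 301 ^/buy/nike-shoes/kobe\(7\)/red$ http://www.newdomain.com/buy/nike-shoes/kobe(7)/red
```

Parentheses in the target URL need no escaping, since the target is a plain string, not a regex.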
White Hat / Black Hat SEO | esiow2013
-
Best use of domains with keywords
I own a domain with just the company name in it (no keywords) that I use as my main domain. I also own some other domains with keywords in them that I currently redirect to the main domain with 301 redirects. What is the best use for these domains? Should I use them when I do link building, or is it better to use just the main domain? Can they be useful to increase the main domain's link juice/PageRank? If yes, how? Thanks
White Hat / Black Hat SEO | darkanweb
-
Using Redirects To Avoid Penalties
A quick question, born out of frustration! If a webpage has been penalised for unnatural links, what would be the effects of moving that page to a new URL and setting up a 301 redirect from the old penalised page to the new page? Will Google treat the new page as ‘non-penalised’ and restore your rankings? It really shouldn’t work, but I’m convinced (although not certain) that our client’s competitor has done this, with great effect! I suppose you could also achieve this using canonicalisation! Many thanks in advance, Lee.
White Hat / Black Hat SEO | Webpresence
-
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why.

Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed.

After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link.

I think there are two parts to the conversation:

1. Would this work? Google would still see the link on the offending domain, but by blocking that domain, are you preventing any strength or penalty associated with that domain from impacting your site?
2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high-quality sites.

Thoughts?
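For concreteness, blocking by referrer in .htaccess would look something like the sketch below (spammydomain.com is a placeholder, not a site from this thread). Note that this only blocks visitors whose browser sends a Referer header from that site; a crawler fetching your URL directly typically sends no such header, so it says nothing to Google about the link itself:

```apache
RewriteEngine On
# Match requests whose Referer header contains the spammy domain.
RewriteCond %{HTTP_REFERER} spammydomain\.com [NC]
# F = return 403 Forbidden, L = stop processing further rules.
RewriteRule .* - [F,L]
```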
White Hat / Black Hat SEO | highlyrelevant