How to force the www. prefix in all URLs using .htaccess?
-
We're using a Tomcat/Apache server. Thanks in advance!
-
Thanks Peter!
-
Hi
Please see my answer to this in your other Q&A forum question:
http://moz.com/community/q/where-is-the-rule-here-that-force-www-in-urls
Peter
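For reference, here is a minimal .htaccess sketch of the usual non-www to www redirect. This assumes Apache httpd with mod_rewrite enabled is handling requests in front of Tomcat; if requests go straight to Tomcat, .htaccess is not read and a servlet-level rewrite (for example a filter such as UrlRewriteFilter) would be needed instead:

RewriteEngine On
# Only act on hosts that do not already start with "www."
RewriteCond %{HTTP_HOST} !^www\. [NC]
# 301-redirect to the same path on the www host (use https:// if the site runs over TLS)
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]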
Related Questions
-
Changing Links to Spans with Robots.txt Blocked Redirects using Linkify/jQuery
Hi, I was recently penalized, most likely because Google started following JavaScript links to bad neighborhoods that were not nofollowed. The first thing I did was remove the Linkify plugin from my site so that all those links would disappear, but now I think I have a solution that works with Linkify without creating crawlable links. I did the following:
1. I blocked access to the Linkify scripts using robots.txt so that Google won't execute the scripts that create the links. This has worked for me in the past with banner ads linking to other sites of mine; at least it appears to work, because those sites did not get links from pages running those banners in Search Console.
2. I created a /redirect/ directory that redirects all offsite URLs, and I put a robots.txt block on this directory.
3. I configured the Linkify plugin to parse URLs into span elements instead of a elements and to add nofollow attributes. They still have an href attribute, but the URLs in the href now point to the redirect directory, and the span's onclick event redirects the user.
I have implemented this solution on another site of mine, and I am hoping this will make it impossible for Google to categorize my pages as linking to any neighborhoods, good or bad. Most of the content is UGC, so this should discourage link spam while giving users clickable URLs and still letting people post complaints about people that have profiles on adult websites. Here is a page where the solution has been implemented: https://cyberbullyingreport.com/bully/predators-watch-owner-scott-breitenstein-of-dayton-ohio-5463.aspx, the Linkify plugin can be found at https://soapbox.github.io/linkifyjs/, and the custom jQuery is as follows:

jQuery(document).ready(function ($) {
    // Turn plain-text URLs inside <p> elements into nofollowed <span> "links"
    $('p').linkify({
        tagName: 'span',
        attributes: { rel: 'nofollow' },
        // Route every href through the robots.txt-blocked /redirect/ directory
        formatHref: function (href) {
            href = 'https://cyberbullyingreport.com/redirect/?url=' + href;
            return href;
        },
        events: {
            // Send the user to the redirect URL on click
            click: function (e) {
                var href = $(this).attr('href');
                window.location.href = href;
            }
        }
    });
});
White Hat / Black Hat SEO | STDCarriers
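For illustration only, a minimal robots.txt sketch of the blocking described in the question above. The /redirect/ path comes from the question itself, while the script path is a hypothetical placeholder and would need to match wherever the Linkify files actually live:

User-agent: *
# Block the directory that proxies all offsite URLs (per the question)
Disallow: /redirect/
# Block the Linkify script files so crawlers do not fetch and execute them (hypothetical path)
Disallow: /scripts/linkify/
-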
The use of a ghost site for SEO purposes
Hi guys, we have just taken on a new client (.co.uk domain) and during our research have identified that they also have a .com domain which is a replica of the existing site, but all links lead to the .co.uk domain. As a result of this, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said that they could get the .com site ranking for local search queries and then push all that traffic to the .co.uk. From analytics we can see that very little referrer traffic is coming from the .com. It sounds remarkably dodgy to us - surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed as being created purely for SEO gain? Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO | SEOBirmingham -
What EMD Meta Title should we use and what about getting links to the same C-Block IP?
Situation: Recently I encountered two problems with both internal and external SEO for my company's websites. This Dutch company has four websites on one server: three closely related EMD (Exact Match Domain) websites and one overarching website (holiday home rental websites).
Vakantiehuizen-Verhuur.nl (overarching)
Vakantiehuizen-Frankrijk.nl (EMD)
Vakantiehuizen-Italie.nl (EMD)
Vakantiehuizen-Spanje.nl (EMD)
Question 1: What would be a preferable meta title for the EMD websites (homepage/subpages)? Keep in mind that the domains are EMDs. The homepage will target the most important keywords and should not compete with the subpages.
Options for the homepage:
1. Vakantiehuizen Frankrijk | Alle vakantiehuizen in Frankrijk op een rij!
2. Vakantiehuizen Frankrijk | Vakantiehuizen-Frankrijk.nl onderdeel van Vakantiehuizen-Verhuur.nl
3. Suggestions?
Options for the subpages:
1. Vakantiehuis Normandie | Vakantiehuizen Frankrijk
2. Vakantiehuis Normandie | Vakantiehuizen-Frankrijk.nl
3. Suggestions?
And concerning the keywords at the beginning: is it wise to use both plural and singular terms in the meta title? For example: Hotel New York. Best hotels in New York | Company Name
Question 2: Many SEOs state that getting (too many) links from the same C-Block IP is bad practice and should be avoided. Is this also applicable if one website links out to different websites on the same C-Block IP? That is, websites A, B and C (on the same server) linking to website D (on a different server) could be seen as spam, but is the same true when website D links to websites A, B and C?
White Hat / Black Hat SEO | TT_Vakantiehuizen -
Moving content to a clean URL
Greetings, my site was seriously punished in the recent Penguin update. I foolishly got some bad outsourced spammy links built and I am now paying for it 😞 I am now thinking it best to start fresh on a new URL, but I am wondering if I can use the content from the flagged site on the new URL. Would this be flagged as duplicate content, even if I took the old site down? Your help is greatly appreciated. Silas
White Hat / Black Hat SEO | Silasrose -
Include placename in URL, or not?
Hi Mozzers, I'm wondering whether to put the place name in the URL or not. This is for a hotel, so it's very focused on the county. I have loads of subpages along the lines of www.hotelname.com/short-breaks-somerset, www.hotelname.com/eat-out-somerset and so on, but I was wondering whether that place name element would help or hinder. For example, we may want to rank for short breaks in other searches (not just those seeking short breaks in Somerset), and I was wondering whether the Somerset bit may actually hinder this in the future. I also noticed Somerset is mentioned in nearly all of the page URLs throughout the site. Perhaps this is a bit spammy and just not necessary. I can include the address of the hotel on every page anyway. What do you think? Thanks in advance for your help 🙂 Luke
White Hat / Black Hat SEO | McTaggart -
Has anyone used tribepro.com
Does that concept really work? Any experience? I've registered, and so far I think it's hard to measure whether the shares are spam or genuine. Would love to see if it works for someone. Thanks
White Hat / Black Hat SEO | LauraHT -
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since the Penguin update a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation:
1. Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site?
2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites.
Thoughts?
White Hat / Black Hat SEO | highlyrelevant
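For concreteness, a minimal .htaccess sketch of the kind of referrer-based block being discussed (the domains are placeholders; whether blocking these referrers changes how Google treats the links is exactly the open question above):

RewriteEngine On
# Match visits whose Referer header contains one of the spammy linking domains (placeholder names)
RewriteCond %{HTTP_REFERER} spammy-linker-one\.example [NC,OR]
RewriteCond %{HTTP_REFERER} spammy-linker-two\.example [NC]
# Serve those visitors a 403 Forbidden instead of the page
RewriteRule .* - [F]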