301 Redirect using rewrite rule in .htaccess
-
Hi guys,
I have URLs in the format below that are seen as duplicate content:
http://www.mysite.com/index.php?a=11&b=15&d=3&c=1
I want to permanently redirect them to my homepage. Is this possible in .htaccess using rewrite conditions?
Thanks in advance...
-
This is a solution, but it's an ugly one. Does anyone really want a home URL of http://www.mysite.com/index.php?a=11&b=16&c=5&d=1&page=2? You then have the problem of people linking to that page.
I believe Michael said in a previous post that they were produced by his CMS; the best idea would be to get rid of them rather than deal with them, if possible.
-
From memory, I believe Michael has these URLs produced by his CMS and they are unnecessary; I could be getting him mixed up with someone else.
Also, doing this in Google does not help other search engines. You would need to do it in every search engine, for every possible parameter combination on every page, which can become unmanageable.
-
I have to say I agree with Sha on this one.
If you are not confident in using .htaccess then I wouldn't bother. I think there is a much easier solution:
1- As Sha said, use Webmaster Tools to tell Google how to handle these parameters; this should slowly start to take them out of the index.
2- Add rel=canonical to all your pages. That way, even if parameters are added, the rel=canonical will always point back to the original and remove any risk of duplicate content (a minimal example is below).
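For example, each page's <head> would carry a tag like this (just an illustration; the href is a placeholder for whatever the preferred version of that page is):
<link rel="canonical" href="http://www.mysite.com/preferred-page.html" />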
I hope this helps.
Craig
-
Hi Michael,
You do not need to make any changes to your .htaccess file. Actually, if you 301 these URLs you will break your search so that it no longer works.
The solution I would use is to go into Google Webmaster Tools and tell Googlebot to ignore the parameters you are concerned about.
In your code, the ? says "here come some parameters" and the & separates those parameters. So, in the case you have quoted, the parameters are a, b, c, d.
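To make that concrete with the URL you posted (purely an annotated breakdown, nothing you need to change):
http://www.mysite.com/index.php   <- the script that actually runs
?a=11&b=15&d=3&c=1                <- the query string: parameters a, b, d and c, joined by &
Those four names are exactly what you would list in the parameter-handling settings in Webmaster Tools.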
Be aware, of course, that Roger will still see these URLs as duplicates, since he doesn't know about your private conversations with Google. This means they will still appear in your SEOmoz report, but as long as you make a note of them so you know they can be ignored, that shouldn't be a problem.
Hope that helps,
Sha
-
I disagree more with the level of apprehension than with the premise itself. Anyhow, I'm off to bed.
-
Alan, we will just have to disagree on this topic.
I too have studied Computer Science in college. I too have a wall filled with MS certifications. I too have been programming since before the internet and even before hard drives existed. I am only 40 but the first PC I used was an Atari 800 and the command to save my work was "csave" which stood for "cassette save". This was before even floppy disks were popular and data was saved to cassette tapes.
I certainly am not forbidding anyone from taking whatever action they deem fit. It is indeed up to Michael or any reader to assess what changes they are comfortable making for their site.
The point I am making is that many people grow very comfortable making changes to their website, especially SEO-related changes. It is relatively safe to do so. If you make a mistake, your site may not rank as well, may not load as fast, may not appear correctly in all browsers, and so forth. The consequences are relatively low.
Making changes in an .htaccess file is a completely different ballgame. One character out of place and your site can instantly be taken offline. Even that is actually not so bad compared to other problems that can be created. A character out of place can disable your site's security, and the person making the change would likely not realize the problem until their site was hacked. A character out of place can cause other functionality of your site to stop working correctly. It can also cause the fix being implemented to work in some but not all instances.
I highly encourage users to make most changes to their sites according to their comfort level. Htaccess modifications are a clear exception. A user can easily be misled into believing their site is working fine, only to later realize there is a major problem with it. There are countless instances where a site was exploited due to a vulnerability in the htaccess file. I therefore strongly recommend that users never touch their htaccess file unless they are extremely confident in the changes they are making. Many websites offer code snippets that can give users a false sense of security and lead them to experiment. It is a bad idea to do that with the htaccess file.
-
I have been programming since before the internet came to be, I have studied Computer Science at university, and I have passed numerous Microsoft certifications. While I would not discount study, in my experience I have never met a great programmer who did not learn by trial and error; after all, this is how you become experienced. There is no danger in using a backup. RegEx does not work sometimes and not others; it is not dynamic, it is a static piece of code. You will not excel at SEO unless you learn these things. I am sure Michael is capable of deciding if he wants to do it himself; he seems to have got a long way already. It would seem to me he is learning quite quickly. You may suggest that you would not try, but I don't think it is correct to forbid others.
-
I'm sorry but the idea of advising users without expertise to modify their htaccess file is completely reckless. The trial & error approach can easily lead to circumstances where the rule works some of the time but not always. Worse, it can negatively impact other rules and site security causing major problems.
Without knowing the details of the site involved, I tend to make the safe assumption that the site is important and that there are one or more people whose livelihoods depend on it. Having worked with clients who have recovered from the damage caused by errors in htaccess files, I will firmly share my experience that no one other than a qualified expert should ever touch the file. The potential for damage is very high.
-
All he needs to do is keep a backup, and he can have as many tries as he wants. He simply has to replace the file with his backup if he goes wrong.
There is little danger here.
-
htaccess rewrite rules are based on Regex expressions. Your current Regex rewrite rules can be modified to adjust for the specific URLs. You need to locate an experienced programmer to write the expressions for you.
-
Hi Michael.
Yes, you can use htaccess to rewrite or redirect the URL.
Where do these URLs presently lead? If they are duplicates of pages on your site, I would suggest using a 301 redirect to send the traffic to the proper URL rather than your home page.
If your server uses cPanel, there is a Redirect tool you can use. This tool makes the process of adding a redirect easier and safer than modifying your htaccess file. Your htaccess file controls various aspects of your site's security, accessibility and SEO. The slightest error can cause your site to instantly become inaccessible. I would not recommend that anyone other than an experienced programmer make changes to it. Even using the correct code in the wrong order can lead to problems.
-
I work on Microsoft servers, so I don't use .htaccess,
but this is the rule I would write to fix all URLs starting with index.php, no matter what the query string:
<rule name="DefaultRule" stopProcessing="true"><match url="^index\.php" /><action type="Redirect" url="/" appendQueryString="false" redirectType="Permanent" /></rule>
But try this and let me know if it works; I have a few other ideas:
RewriteRule ^/?index\.php$ /? [R=301,L]
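If that rule works but you need to be sure it does not catch the URLs your other rewrite rules generate internally, here is a slightly fuller sketch. It is untested and assumes Apache mod_rewrite in a per-directory .htaccess like the ones Michael posted; the a=11 pattern is just taken from his example URL, so adjust it to the parameters you actually use:
# %{THE_REQUEST} holds the original request line sent by the browser, so URLs that are
# reached only through an internal RewriteRule are ignored and the rest of the site keeps working.
RewriteCond %{THE_REQUEST} \s/index\.php\?a=11 [NC]
RewriteRule ^index\.php$ http://www.mysite.com/? [R=301,L]
The trailing ? on the target drops the old query string from the redirect. The pair would normally sit above the existing rewrite rules, and keep a backup of the file before trying it, as others here have suggested.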
-
Hi Alan,
I think it's now clear to me that they should be rewritten. Thanks for pointing me in the right direction.
I have a classifieds site, and in my .htaccess I have these rewrite rules by default:
RewriteRule ^/?(new)/(1_day)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=4 [L] ##category newest 1day
RewriteRule ^/?(new)/(1_week)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=1 [L] ##category newest 1week
RewriteRule ^/?(new)/(2_weeks)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=2 [L] ##category newest 2weeks
RewriteRule ^/?(new)/(3_weeks)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=3 [L] ##category newest 3weeks
RewriteRule ^/?(new)/(1_day)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=4&page=$5 [L] ##category newest 1day pages
RewriteRule ^/?(new)/(1_week)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=1&page=$5 [L] ##category newest 1week pages
RewriteRule ^/?(new)/(2_weeks)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=2&page=$5 [L] ##category newest 2weeks pages
RewriteRule ^/?(new)/(3_weeks)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=3&page=$5 [L] ##category newest 3weeks pages
Unfortunately, these rules cannot handle all URLs of the same format with different variables, like the ones below:
http://www.mysite.com/index.php?a=11&b=15&d=3&c=1
http://www.mysite.com/index.php?a=11&b=15&d=3&c=2
http://www.mysite.com/index.php?a=11&b=16&c=5&d=1
http://www.mysite.com/index.php?a=11&b=16&c=5&d=1&page=2
http://www.mysite.com/index.php?a=11&c=5&d=1&b=230
Any idea on how I can solve this problem to avoid duplicate content?
Thanks in advance...
-
Rewrite and redirect are not the same thing. You want to 301 them, but better still, why do you have them at all?
Do you have a WordPress site? If these errors were found by a crawler, it means you have the links on your site somewhere. The best thing to do is correct the links. 301s leak link juice, so you want to limit their number.
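To illustrate the difference with a minimal sketch (made-up paths, assuming Apache mod_rewrite):
# Internal rewrite - no R flag: the visitor's address bar keeps the friendly URL,
# and the server quietly serves the script behind the scenes.
RewriteRule ^friendly-page\.html$ index.php?a=11&b=15 [L]
# External redirect - R=301: the browser is sent to a different URL with a permanent
# redirect, which is what actually moves visitors and most of the link equity.
RewriteRule ^old-page\.html$ http://www.mysite.com/ [R=301,L]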