301 Redirect using rewrite rule in .htaccess
-
Hi guys,
I have URLs in the format below that are seen as duplicate content:
http://www.mysite.com/index.php?a=11&b=15&d=3&c=1
I want to permanently redirect them to my homepage. Is this possible in .htaccess using rewrite conditions?
Thanks in advance...
-
This is a solution, but it's an ugly one. Does anyone really want a home URL of http://www.mysite.com/index.php?a=11&b=16&c=5&d=1&page=2? You then have the problem of people linking to that page.
I believe Michael said in a previous post that they were produced by his CMS. The best idea would be to get rid of them rather than deal with them, if possible.
-
From memory, I believe Michael has these URLs produced by his CMS and they are unnecessary; I could be getting him mixed up with someone else.
Also, doing this in Google does not help other search engines. You would need to do it in every search engine for every possible parameter combination on each page, which can quickly become unmanageable.
-
I have to say I agree with Sha on this one.
If you are not confident in using .htaccess then I wouldn't bother. I think there is a much easier solution:
1- As Sha said, use webmaster tools to tell Google how to handle these parameters, this should slowly start to take them out of the index.
2- Add rel=canonical to all your pages, this way even if parameters are added, the rel=canonical will always point back to the original and remove any risk of duplicate content.
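As an illustration (the URLs here are made up from the example in the question, not necessarily the right canonical targets for Michael's site), the tag goes in each page's <head> and points at the one version of that page you want indexed:

```html
<!-- in the <head> of every parameter variant of the same page, e.g. -->
<!-- /index.php?a=11&b=16&c=5&d=1 and /index.php?a=11&c=5&d=1&b=16 -->
<link rel="canonical" href="http://www.mysite.com/index.php?a=11&b=16&c=5&d=1" />
```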
I hope this helps.
Craig
-
Hi Michael,
You do not need to make any changes to your .htaccess file. In fact, if you 301 these URLs you will break your site's search so that it no longer works.
The solution I would use is to go into Google Webmaster Tools and tell Googlebot to ignore the parameters you are concerned about.
In your code, the ? says "here come some parameters" and the & separates those parameters. So, in the case you have quoted, the parameters are a, b, c, d.
Be aware, of course, that Roger will still see these URLs as duplicates, since he doesn't know about your private conversations with Google.
This means that they will still appear in your SEOmoz report, but as long as you make a note of them so you know they can be ignored that shouldn't be a problem.
Hope that helps,
Sha
-
I disagree more with the level of apprehension, rather than the premise itself. Anyhow I’m off to bed.
-
Alan, we will just have to disagree on this topic.
I too have studied Computer Science in college. I too have a wall filled with MS certifications. I too have been programming since before the internet and even before hard drives existed. I am only 40 but the first PC I used was an Atari 800 and the command to save my work was "csave" which stood for "cassette save". This was before even floppy disks were popular and data was saved to cassette tapes.
I certainly am not forbidding anyone from taking whatever action they deem fit. It is indeed up to Michael or any reader to assess what changes they are comfortable making for their site.
The point I am making is that many people grow very comfortable making changes to their website, especially SEO-related changes. Doing so is relatively safe. If you make a mistake, your site may not rank as well, may not load as fast, may not appear correctly in all browsers, and so forth. The consequences are relatively low.
Making changes in an htaccess file is a completely different ballgame. One character out of place and your site can instantly be taken offline. If that happened, it would actually not be so bad compared to other problems that can be created. A character out of place can disable your site's security, and the person making the change would likely not realize the problem until their site was hacked. A character out of place can cause other functionality of your site to stop working correctly. It can also cause the fix being implemented to work in some but not all instances.
I highly encourage users to make most changes to their sites according to their comfort level. Htaccess modifications are a clear exception. A user can easily be misled into believing their site is working fine, only to later realize there is a major problem. There are countless instances where a site was exploited due to a vulnerability in the htaccess file. I therefore strongly recommend that users never touch their htaccess file unless they are extremely confident in the changes they are making. Many websites offer code snippets which can give users a false sense of security and lead them to experiment. It is a bad idea to do that with the htaccess file.
-
I have been programming since before the internet came to be, I have studied Computer Science at university and passed numerous Microsoft certifications, and while I would not discount study, in my experience I have never met a great programmer who did not learn by trial and error; after all, this is how you become experienced. There is no danger if you keep a backup. RegEx does not work sometimes and fail at others; it is not dynamic, it is a static piece of code. You will not excel at SEO unless you learn these things. I am sure Michael is capable of deciding whether he wants to do it himself; he seems to have come a long way already, and it would seem to me he is learning quite quickly. You may suggest that you would not try it yourself, but I don't think it is correct to forbid others.
-
I'm sorry but the idea of advising users without expertise to modify their htaccess file is completely reckless. The trial & error approach can easily lead to circumstances where the rule works some of the time but not always. Worse, it can negatively impact other rules and site security causing major problems.
Without knowing the details of the site involved, I tend to make the safe assumption that the site is important and that one or more people's livelihoods depend on it. Having worked with clients who have recovered from the damage caused by errors in htaccess files, I will firmly share my experience that no one other than a qualified expert should ever touch the file. The potential for damage is very high.
-
All he needs to do is keep a backup, and he can have as many tries as he wants. He simply has to replace the file with his backup if he goes wrong.
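A rough sketch of that workflow from a shell (the demo directory and file contents here are stand-ins, not Michael's real site; in practice you would run this in your site's document root):

```shell
# Backup-first editing of .htaccess: take a copy, edit, roll back on failure.
demo="$(mktemp -d)" && cd "$demo"        # stand-in for the web root
echo "RewriteEngine On" > .htaccess      # stand-in for the real file
cp .htaccess .htaccess.bak               # 1. take the backup first
echo "RewriteRule broken(" >> .htaccess  # 2. a bad edit breaks the site
cp .htaccess.bak .htaccess               # 3. roll back to the known-good copy
cat .htaccess                            # prints: RewriteEngine On
```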
There is little danger here.
-
htaccess rewrite rules are based on regular expressions. Your current rewrite rules can be modified to handle these specific URLs. You need to locate an experienced programmer to write the expressions for you.
-
Hi Michael,
Yes, you can use htaccess to rewrite or redirect the URL.
Where do these URLs presently lead? If they are duplicates of pages on your site, I would suggest using a 301 redirect to send the traffic to the proper URL rather than your home page.
If your server uses cPanel, there is a Redirect tool you can use. It makes the process of adding a redirect easier and safer than modifying your htaccess file. Your htaccess file controls various aspects of your site's security, accessibility, and SEO. The slightest error can cause your site to instantly become inaccessible. I would not recommend making any changes to your htaccess file except by an experienced programmer. Even the correct code in the wrong order can lead to problems.
-
I work on Microsoft servers, so I don't use .htaccess, but this is the rule I would write to fix all URLs starting with index.php, no matter what the query string:
<rule name="DefaultRule" stopProcessing="true"><match url="^index\.php" /><action type="Redirect" url="/" appendQueryString="false" redirectType="Permanent" /></rule>
But try this and let me know if it works; I have a few other ideas:
RewriteRule ^index\.php$ /? [R=301,L]
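One caveat, since your pretty URLs are presumably rewritten internally to index.php: a bare rule on index.php can catch those internal rewrites too and loop. A sketch (untested against your setup) that only redirects URLs requested directly by the browser:

```apache
RewriteEngine On
# THE_REQUEST holds the browser's original request line, so internal
# rewrites from pretty URLs to index.php never match this condition
RewriteCond %{THE_REQUEST} \s/index\.php\? [NC]
# The trailing "?" in the target drops the old query string from the redirect
RewriteRule ^index\.php$ /? [R=301,L]
```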
-
Hi Alan,
I think it's now clear to me that they should be rewritten. Thanks for pointing me to the right direction.
I have a classified site and in my .htaccess I have these rewrite rules by default
RewriteRule ^/?(new)/(1_day)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=4 [L] ##category newest 1day
RewriteRule ^/?(new)/(1_week)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=1 [L] ##category newest 1week
RewriteRule ^/?(new)/(2_weeks)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=2 [L] ##category newest 2weeks
RewriteRule ^/?(new)/(3_weeks)/([0-9]+)/([^./\"'?#]+).html$ index.php?a=11&b=$3&c=65&d=3 [L] ##category newest 3weeks
RewriteRule ^/?(new)/(1_day)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=4&page=$5 [L] ##category newest 1day pages
RewriteRule ^/?(new)/(1_week)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=1&page=$5 [L] ##category newest 1week pages
RewriteRule ^/?(new)/(2_weeks)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=2&page=$5 [L] ##category newest 2weeks pages
RewriteRule ^/?(new)/(3_weeks)/([0-9]+)/([^./\"'?#]+)/([0-9]+).html$ index.php?a=11&b=$3&c=65&d=3&page=$5 [L] ##category newest 3weeks pages
Unfortunately, these rules cannot handle all URLs of the same format with different variables, like the following:
http://www.mysite.com/index.php?a=11&b=15&d=3&c=1
http://www.mysite.com/index.php?a=11&b=15&d=3&c=2
http://www.mysite.com/index.php?a=11&b=16&c=5&d=1
http://www.mysite.com/index.php?a=11&b=16&c=5&d=1&page=2
http://www.mysite.com/index.php?a=11&c=5&d=1&b=230
Any idea on how I can solve this problem to avoid duplicate content?
Thanks in advance...
-
Rewrite and redirect are not the same thing. You want to 301 them, but better still, why do you have them at all?
Do you have a WordPress site? If these errors were found by a crawler, it means you have the links somewhere on your site. The best thing to do is correct the links; 301s leak link juice, so you want to limit their number.