301: Dynamic URL to Static Page
-
I've been going around in circles trying to get this dynamic URL to redirect in the .htaccess file. I know I'm missing something but can't figure it out.
Code:
RewriteEngine on
RewriteCond %{QUERY_STRING} ^/dynamic-url.php?id=43$
RewriteRule ^$ http://static/page/url/inserted/here? [R=301,L]
Suggestions?
-
Hi there,
If it's only that URL (not a group of URLs with a pattern to follow) that you need to 301-redirect then you can directly use:
RedirectMatch 301 /dynamic-url.php?id=43 http://www.yourdomain.com/static/page/url/inserted/here
If this is not the situation please let me know!
Thanks,
Aleyda
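One caveat worth flagging with the mod_alias approach: RedirectMatch only tests its pattern against the URL path, never the query string, so a rule keyed on ?id=43 will not fire. For a URL that is distinguished only by its query string, mod_rewrite is usually needed. A minimal sketch, assuming the rules live in the site-root .htaccess and using a placeholder destination:
RewriteEngine on
# QUERY_STRING holds only what follows the ? (here "id=43") - no path, no leading ?
RewriteCond %{QUERY_STRING} ^id=43$
# In .htaccess the leading slash is dropped from the tested path; the trailing ? on the target discards the old query string
RewriteRule ^dynamic-url\.php$ http://www.yourdomain.com/static/page/url/inserted/here? [R=301,L]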
-
In that case try my rewrite, like this, and escape your metacharacters (., /, -, etc.):
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.domain.com$
RewriteRule ^dynamic-url.php?id=43$ "http://static/page/url/inserted/here" [R=301,L]
Edited: I forgot to escape the - and . in dynamic-url.php.
-
It is the first one. Unfortunately, it doesn't seem to be working for me.
-
Ah yes, I know what you mean that way; I meant programmatically. Are you going to go through all the URLs and manually assign them a redirect, or are you trying to do one rewrite for multiple URLs?
If it's the first one, you could just use the example I gave you and modify it.
Also, I've only just noticed that you haven't escaped your metacharacters. This is an example from mine (not using dynamic URLs):
RewriteCond %{HTTP_HOST} ^domain.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.domain.com$
RewriteRule ^Old-Page.html$ "http://www.domain.com/New-Page.html" [R=301,L]
-
I'm trying to take a dynamic URL and redirect it (301) to a static URL:
dynamic: /dynamic-url.php?id=43
-
Not sure exactly what you are trying to do...
This is something similar (from Stack Overflow); does it help?
RewriteCond %{QUERY_STRING} ^route=product/category&path=35&page=([0-9]+)$
RewriteRule ^index.php$ /product/category/35/page_%1? [R=301,L]
Adam
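Adapting that Stack Overflow pattern to the URL in this thread, a sketch along these lines should cover every id at once (the /static/page/%1 destination is only an assumption about how the static URLs are structured, so adjust it to the real targets):
RewriteEngine on
# Capture the numeric id from the query string; the RewriteRule pattern itself never sees anything after the ?
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
# %1 is the group captured in the RewriteCond above; the trailing ? strips the old query string from the new URL
RewriteRule ^dynamic-url\.php$ /static/page/%1? [R=301,L]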
Related Questions
-
Consolidating 301 Redirects to Decrease Page Load Times - Major Concerns?
Hello, I am being pushed to consolidate the 6k+ redirects that have accumulated over the course of 4 years. These redirects are one of the many factors causing extensive load times for our website. Many to most are over a year old, have not been used, or simply redirect back to the home page. Other than looking to keep the pages that have external links (also looking for recommendations/tools), are there other best practices from an SEO standpoint to ensure there are no major hits to our website? A little more info: I am looking to pare the 6K down by removing all redirects that have not been used, removing all redirects that are over a year old, and removing all redirects that point simply to the home page or to a smaller big-bucket subfolder.
Technical SEO | Owner_Account
This should take the number from 6K to around 300. Are there any major concerns?
Pat
-
Robots.txt Syntax for Dynamic URLs
I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page=. Which is the proper syntax?
Technical SEO | btreloar
Disallow: ?Page=
Disallow: ?Page=*
Disallow: ?Page=
Or something else?
-
Product Pages Outranking Category Pages
Hi, We are noticing an issue where some product pages are outranking our relevant category pages for certain keywords. For a made-up example, a "heavy duty widgets" product page might rank for the keyword phrase Heavy Duty Widgets, instead of our Heavy Duty Widgets category page appearing in the SERPs. We've noticed this happening primarily in cases where the name of the product page contains at least a partial match for the desired keyword phrase we want the category page to rank for. However, we've also found isolated cases where the specified keyword points to a completely irrelevant page instead of the relevant category page. Has anyone encountered a similar issue before, or have any ideas as to what may cause this to happen? Let me know if more clarification of the question is needed. Thanks!
Technical SEO | ShawnHerrick
-
What is the best URL design for a product page?
Should a product page URL include the category name and subcategory name in it? Most ecommerce platforms, it seems, are designed to have the category and sub-category names included in the URL, followed by the product name. If that is the case, and the same product is listed in more than 1 category and sub-category, then will that product have 2 unique URLs and as a result be treated as 2 different product pages by Google? And then, since it is the same product in two places on the site, won't Google treat those 2 pages as having duplicate content? So is it best not to have the category and sub-category names in the URL of a product page? And lastly, is there a preferred character limit that a URL should stay under? Thanks!
Technical SEO | gallreddy
-
How to find original URLs after the hosting company added canonical URLs, URL rewrites and duplicate content.
We recently changed hosting companies for our ecommerce website. The hosting company added some functionality such that duplicate content and/or mirrored pages appear in the search engines. To fix this problem, the hosting company created both canonical URLs and URL rewrites. Now, we have page A (which is the original page with all the link juice) and page B (which is the new page with no link juice or SEO value). Both pages have the same content, with different URLs. I understand that a canonical URL is the way to tell the search engines which page is the preferred page in cases of duplicate content and mirrored pages. I also understand that canonical URLs tell the search engine that page B is a copy of page A, but page A is the preferred page to index. The problem we now face is that the hosting company made page A a copy of page B, rather than the other way around. But page A is the original page with the SEO value and link juice, while page B is the new page with no value. As a result, the search engines are now prioritizing the newly created page over the original one. I believe the solution is to reverse this and make it so that page B (the new page) is a copy of page A (the original page). Now, I would simply need to put the original URL as the canonical URL for the duplicate pages. The problem is, with all the rewrites and changes in functionality, I no longer know which URLs have the backlinks that are creating this SEO value. I figure if I can find the backlinks to the original page, then I can find out the original web address of the original pages. My question is: how can I search for backlinks on the web in such a way that I can figure out the URL that all of these backlinks are pointing to, in order to make that URL the canonical URL for all the new, duplicate pages?
Technical SEO | CABLES
-
I am trying to correct an error report of duplicate page content. However, I am unable to find, in over 100 blogs, the page which contains similar content to the page SEOmoz reported as having similar content. Is my only option to just delete the blog page?
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page of duplicate content. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings?
Brooke
Technical SEO | wianno168
-
Mitigating duplicate page content on dynamic sites such as social networks and blogs.
Hello, I recently did an SEOmoz crawl for a client site. As is typical, the most common errors were duplicate page titles and duplicate content. The client site is a custom social network for researchers. Most of the pages showing as duplicates are simple variations of each user's profile, such as comment sections, friends pages, and events. So my question is: how can we limit duplicate content errors for a complex site like this? I already know about the rel canonical tag and rel next tag, but I'm not sure if either of these will do the job. Also, I don't want to lose potential links/link juice for good pages. Are there ways of using the "noindex" tag in batches? For instance: noindex all URLs containing this character? Or do most CMSs allow this to be done systematically? Anyone with experience doing SEO for a custom social network or forum, please advise. Thanks!
Technical SEO | BPIAnalytics
-
Drupal URL Aliases vs 301 Redirects + Do URL Aliases create duplicates?
Hi all! I have just begun work on a Drupal site which heavily uses the URL Aliases feature. I fear that it is creating duplicate links. For example: we have http://www.URL.com/index.php and http://www.URL.com/. In addition, we are about to switch a lot of links and want to keep the search engine benefit. Am I right in thinking that URL aliases change the URL while leaving the old URL live, without creating search-engine-friendly redirects such as 301s? Thanks for any help!
Christian
Technical SEO | ChristianMKTG