URL rewriting "how to" with .htaccess
-
Hi,
I would need some advice (links, tips, maybe a generator tool?) regarding URL rewriting through .htaccess (I'm a newbie at it).
It's a website "refurbishing" case: the domain doesn't change, but the CMS does!
I've got a list of 800 URLs I don't want to lose rankings on.
Here is the old URL syntax:
http://www.mydomain.com/home/newscontent.asp?id=1133
Here is what the new URL type would be:
http://www.mydomain.com/name-of-the-article
and/or
http://www.mydomain.com/category/Page-2
Thanks a lot...
-
You should get all the URLs of the old site with Xenu's Link Sleuth, then create a PHP array of oldUrl => newUrl and put it in your redirect script.
So in the .htaccess you have:
RewriteCond %{REQUEST_URI} ^/home/newscontent.asp
RewriteCond %{QUERY_STRING} id=([0-9]+)
RewriteRule ^(.*)$ redirect.php?id=%1 [L]
In the redirect.php file, you have:
// Map each old request URI to its new path -- repeat for all 800 URLs
$redirect = array("/home/newscontent.asp?id=1133" => "/name-of-the-article");

if (isset($redirect[$_SERVER['REQUEST_URI']])) {
    // Permanent (301) redirect to the new URL on the same domain
    header("Location: http://www.mydomain.com" . $redirect[$_SERVER['REQUEST_URI']], true, 301);
    exit();
}

// Send a 404 if you don't have a redirect for this URL
header("HTTP/1.0 404 Not Found");
-
Hi, I was thinking about the whole picture of Baptiste's solution. You say:
"Baptiste: On the new linux hosting set up an .htaccess file in the root of the site directory that redirects all id=xxxx requests to a redirect.php file on your server. The redirect.php file will need to interrogate a database with a table of the mappings and automatically redirect to the correct page via php scripting."
Does that mean that without any credentials or database access, as long as you have the URLs of the site you need to move to, you can redirect any site's URLs to another one!?
Hmm... I think I'm missing something...
-
Good idea... I'll do it that way and use Excel functions... thanks.
-
Many thanks for all these explanations.
So, in fact, lazily speaking, I'd say the .htaccess-only solution means less work (no redirection script) and seems quite easy to set up (apart from the syntax inside .htaccess), so I'll go for Damien's... but I need credentials to install it.
Otherwise, if I don't have them, I'd go for Baptiste's...
Thanks a lot...
-
As you only have 800 URLs, I agree with Damien: you should generate an associative array in pure PHP, associating every ID with its new URL.
The redirect script only has to test whether the ID is an array key; if it is, you 301 to the new URL. Otherwise, display a 404 page.
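A minimal sketch of that ID-keyed version, assuming the .htaccess rule above passes the ID to the script as ?id= (the paths in the array are just example values):
// redirect.php -- ID-keyed version (array values here are hypothetical examples)
$redirect = array(
    1133 => "/name-of-the-article",
    1134 => "/category/page-2",
    // ... one entry per old article ID, 800 in total
);

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

if (isset($redirect[$id])) {
    // Permanent redirect to the new URL
    header("Location: http://www.mydomain.com" . $redirect[$id], true, 301);
} else {
    // No mapping found: answer with a 404
    header("HTTP/1.0 404 Not Found");
}
exit();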
-
OK, in that case it simplifies things a bit.
In order to do any redirection from id=1136 to unique-article-name you will have to create the mappings entirely manually.
The two solutions provided are:
Baptiste: On the new Linux hosting, set up an .htaccess file in the root of the site directory that redirects all id=xxxx requests to a redirect.php file on your server. The redirect.php file will need to interrogate a database with a table of the mappings and automatically redirect to the correct page via PHP scripting.
Mine: essentially the same as Baptiste's proposal, except that you don't interrogate the database; all the redirections are done using the .htaccess file, which contains all the mappings.
Either way you will need to manually create the mappings yourself, either in the database or in the .htaccess file.
EDIT: Just had a thought, are the page titles of the articles the same between the new site and the old? If they are, then you could crawl both sites with Xenu and then use VLOOKUPs in Excel (or similar) to semi-automatically create your mapping of id = unique-article-name.
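If Excel gets unwieldy, the same join can be scripted. A rough sketch, assuming each crawl is exported to a CSV with url,title lines (the file names and column order are just assumptions):
// join_crawls.php -- hypothetical sketch: match old and new URLs by page title
$newByTitle = array();
foreach (array_map('str_getcsv', file('new.csv')) as $row) {
    $newByTitle[trim($row[1])] = trim($row[0]);
}

$out = fopen('mapping.csv', 'w');
foreach (array_map('str_getcsv', file('old.csv')) as $row) {
    $title = trim($row[1]);
    // Keep only old URLs whose title also exists on the new site and which carry an id= parameter
    if (isset($newByTitle[$title]) && preg_match('/id=([0-9]+)/', $row[0], $m)) {
        fputcsv($out, array($m[1], $newByTitle[$title]));
    }
}
fclose($out);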
-
I'd say yes for the first one and for sure no for the second one...:)
-
To be honest, this is the solution I'd go for.
Mozollo, was your old site database driven?
Are you using the old article titles as the new page names?
If the answer is no to either of these, then the end result is you will have to manually map the ID to the page name for each of the 800 pages you want to keep.
-
Thanks again, so (sorry to repeat):
-
Your solution: one .htaccess + redirect.php, located at the root of the Windows platform
-
Damien's: one .htaccess, located at the root of the Windows platform
Is that correct?
-
1. .htaccess won't exist on the Windows platform unless you have installed a rewrite mod on the Windows server. If you did, then the .htaccess will be in the root folder of the website (usually); you should check the documentation of the rewrite mod to confirm that.
2. If you have a Windows PC then Xenu's Link Sleuth should be able to crawl the old site; you can then extract the information from the files that Xenu can export.
3/4. If every unique ID needs to get mapped to a unique URL then yes, 800 times it is. If you have multiple IDs that go to the same page you could do:
RewriteCond %{QUERY_STRING} ^id=113[3-8]$ [NC]
RewriteRule ^home/newscontent\.asp$ /name-of-the-article? [L,R=301]
All IDs from 1133 to 1138 will now redirect to the same page; you'll have to work out the regexes though.
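And for a handful of non-consecutive IDs, an alternation works too (a sketch, same assumptions as the rule above):
RewriteCond %{QUERY_STRING} ^id=(1133|1134|1197)$ [NC]
RewriteRule ^home/newscontent\.asp$ /name-of-the-article? [L,R=301]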
-
To be clear about the different roles of the files in my solution: the .htaccess file will redirect every old URL (whatever the ID is) to a redirect script written in PHP.
This script will get the old URL's ID, load the article (to get the article name) and then 301 redirect to the new URL. The database can only be accessed from PHP, not from .htaccess.
Damien gave another solution, based only on .htaccess. You have to write (or generate with code / software) 800 redirect directives for the .htaccess file.
-
Thanks to you both, Baptiste Placé and Damien Phillips.
What do you mean when you say:
"The redirect.php file will load the article (or category as I understood) and do a 301 to the new url."
Is it an .htaccess file to create, or a dedicated .php file, or both (redirect.php)?
Yes, I'll have to transfer each old article and I'll give them a unique URL per article... hope that answers your question!
-
Can you be a bit more precise about the new URL? Does every old article with an ID have to 301 to a page with a unique name?
-
Hi,
Thanks to you both, Damien Phillips and Baptiste Placé.
But it's a bit confusing for me, for two reasons: language + technical knowledge!
I confirm that I'll move from a Windows platform to a Linux one. So, if I understand:
1/ - .htaccess is possible, but where will it be located? I assume at the root of the old platform (Windows here...).
2/ - I'll have to crawl each article in order to get each ID (by the way, do you have any crawler tool to recommend?)
3/ - For each of these URLs I'll have to write syntax like this:
RewriteCond %{QUERY_STRING} ^id=1133$ [NC]
RewriteRule ^home/newscontent\.asp$ /name-of-the-article? [L,R=301]
4/ ... 800 times? Or is there a way to do it on one line, like:
RewriteCond %{QUERY_STRING} ^id=1133$ + ^id=1134$ + ^id=1197$ ... [NC]
Thanks a lot again
-
I'll return the favour if it turns out he has moved from IIS
-
That's right, but .htaccess was what was asked about. I thumbed up your answer so it goes first.
-
But only if he's moved from Windows IIS hosting to Linux or Windows + PHP!
-
True! The correct syntax is:
RewriteCond %{REQUEST_URI} ^/home/newscontent.asp
RewriteCond %{QUERY_STRING} id=([0-9]+)
RewriteRule ^(.*)$ redirect.php?id=%1 [L]
-
He'll need to add [L,R=301] at the end instead of just [L]. IIRC default behaviour is a 302 redirect.
You also can't reference a querystring in the RewriteRule, you have to use RewriteCond.
-
Hi,
From the .asp in the sample URLs I'm guessing you're hosted on Windows; if that's the case you'll need to get a rewrite mod for IIS such as ISAPI Rewrite 3. We've been using it for about 5 years now and it performs well. Their site has documentation that shows how it works.
You'll need to learn about regular expressions, and a tool like Regex Buddy might be helpful.
I'm not aware of any tools that can automate the generation, and I think that in your case you're going to need to do some manual work to set it up.
First you'll need a way of linking the old URLs to the new ones. Given the information you've provided, it's not clear how you'll be able to do this, so I'll make an assumption.
Assuming that name-of-the-article is the same as the title of newscontent.asp?id=1133, you'll need to generate a list, in Excel for example, that lists the old content ID and the title of that document. You can then use formulae/macros to generate the rewrite rules, which you would enter in the .htaccess file.
If you don't have a record of the id = title relationship in your old CMS database (assumption!) then you might be able to do it by crawling the old site with a crawling program, exporting the data and then manipulating it. Otherwise you'll have to do it all by hand.
Rewrite rules generally take the form:
RewriteRule oldpageaddress newpageaddress [flags]
You'll also need to use the RewriteCond in order to base the rule on the querystring.
So for your example:
RewriteCond %{QUERY_STRING} ^id=1133$ [NC]
# The trailing "?" drops the old ?id= query string from the new URL
RewriteRule ^home/newscontent\.asp$ /name-of-the-article? [L,R=301]
You'd then need to repeat those two statements for each page you want to redirect.
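If you build the id-to-name mapping in a spreadsheet anyway, a small script can churn out those 800 pairs for you. A rough sketch, assuming a mapping.csv with lines of the form id,new-path (the file name and format are just assumptions):
// generate_rules.php -- hypothetical sketch: turn a CSV mapping into rewrite directives
foreach (array_map('str_getcsv', file('mapping.csv')) as $row) {
    list($id, $newPath) = $row;
    echo 'RewriteCond %{QUERY_STRING} ^id=' . (int) $id . '$ [NC]' . "\n";
    echo 'RewriteRule ^home/newscontent\.asp$ /' . ltrim(trim($newPath), '/') . '? [L,R=301]' . "\n";
}
Run it once (php generate_rules.php > rules.txt) and paste the output into the .htaccess file.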
-
Hi mozllo,
You won't be able to do this with a .htaccess rule alone for such URLs, because the original URL only has the ID of the article, and you want the name of the article in the new URL. This requires database access to know the new URL.
I would suggest putting this in your .htaccess file:
RewriteRule ^home/newscontent.asp?id=([0-9]+) redirect.php?id=$1 [L]
Edit: see the correct rule below.
The redirect.php file will load the article (or the category, as I understood) and do a 301 to the new URL.
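For what it's worth, here is a rough sketch of what that database-driven redirect.php could look like; the table and column names (url_mapping, old_id, new_path) and the connection details are assumptions to adapt to your own CMS schema:
// redirect.php -- hypothetical sketch of the database-driven variant
$pdo = new PDO('mysql:host=localhost;dbname=newcms;charset=utf8', 'user', 'password');

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// Look up the new path for this old article ID (assumed mapping table)
$stmt = $pdo->prepare('SELECT new_path FROM url_mapping WHERE old_id = ?');
$stmt->execute(array($id));
$newPath = $stmt->fetchColumn();

if ($newPath) {
    // Permanent redirect to the new URL
    header('Location: http://www.mydomain.com/' . ltrim($newPath, '/'), true, 301);
} else {
    // No mapping found: send a 404
    header('HTTP/1.0 404 Not Found');
}
exit();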