Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
301 redirects from several sub-pages to one sub-page
Hi!
I have 14 sub-pages that I deleted earlier today. But of course Google can still find them, and everyone who tries them gets a 404 error. I have come to the understanding that this will hurt the rest of my site, at least as long as Google has them indexed.
These sub-pages sit in 3 different folders, and I want to redirect them all to one sub-page in a fourth folder.
I already have an .htaccess file, but I simply can't get it to work! It is the same file I use for redirecting traffic from mydomain.no to www.mydomain.no, and I have tried every variation I can think of for the sub-pages.
Has anyone had the same problem before, or otherwise knows the solution, and can help me with how to compose the .htaccess file?

You'll have to excuse me if I'm using the wrong terms, missing something I should have spotted even underwater with a blindfold on, or misspelling anything. I am not very experienced with SEO or anything else internet-related, nor am I from an English-speaking country.
Hope someone here can light up my path.
That's at least something you can say in Norwegian...

OMG you just made my day! Thank you so much

Hi,
Use:
Redirect 301 /olddirectory/example.html http://www.example.com/newdirectorylocation.html
Keep in mind that the old path is relative to the document root, and if you have spaces in any of these paths you will need to put quotation marks around them, like so:
Redirect 301 "/olddirectory/example 2 of 2.html" http://www.example.com/newdirectorylocation.html
Hope this helps!
w00t!
If you want to do some more research: http://www.webweaver.nu/html-tips/web-redirection.shtml
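To make this concrete for the situation in the original question, here is a minimal sketch of how the whole file might look. The folder and file names (folder1, folder2, folder3, folder4/new-page.html) are placeholders, and it assumes the existing non-www redirect is done with mod_rewrite:

# Existing rule: send mydomain.no traffic to www.mydomain.no
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.no$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.no/$1 [R=301,L]

# Deleted sub-pages: one Redirect line per old URL, all pointing at the same new page
Redirect 301 /folder1/old-page-1.html http://www.mydomain.no/folder4/new-page.html
Redirect 301 /folder1/old-page-2.html http://www.mydomain.no/folder4/new-page.html
Redirect 301 /folder2/old-page-3.html http://www.mydomain.no/folder4/new-page.html
Redirect 301 /folder3/old-page-4.html http://www.mydomain.no/folder4/new-page.html
# ...and so on for the remaining deleted pages

Redirect (mod_alias) and RewriteRule (mod_rewrite) come from different Apache modules, so both kinds of rules can live in the same file; a request to a deleted page on the non-www host may simply be redirected twice, first to the www host and then to the new page.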
Related Questions
My site auto redirects http to https. This is causing redirect chains. What can I do?
I noticed that Moz flags a lot of redirect chain issues on my site. I realized that this is mostly because the site automatically redirects http to https, and when I create a new URL (when a URL changes, for example) it is automatically flagged as a chain. Example:
http://www.example-link
auto redirects to:
https://www.example-link
which is then redirected to:
https://www.example-link-changed (when the address actually changes)
I don't seem to have any control over changing where the initial http redirect goes. Any advice on fixing this problem?
On-Page Optimization | baystatemarketing
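If the site runs on Apache and the http-to-https rule is editable in .htaccess, one way to break the chain described above is to put a specific rule for each renamed URL before the generic https rule, so the old address reaches its final destination in one hop. A minimal sketch with placeholder paths (old-page, new-page) and the www.example.com hostname standing in for the URLs in the question:

RewriteEngine On

# Specific rule first: the renamed page goes straight to its final https URL
RewriteRule ^old-page/?$ https://www.example.com/new-page [R=301,L]

# Generic rule second: everything else gets the plain http-to-https redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Updating internal links so they point at the final https URL also helps, since the crawler only reports a chain when a link it follows needs more than one hop to resolve.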
301 Redirects - Large .htaccess file question
We are moving about 5000 pages from the root into different folders. We need to individually 301 each page because they are sitting at root level now: mysite.com/page.html. We want to move them to: mysite.com/folder/page.html etc. I don't think RedirectMatch can work because of the different file names and folders they are being moved into. Will 5000 entries in .htaccess slow site loading? Any other suggestions on how to handle this?
On-Page Optimization | leadforms
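5000 individual Redirect 301 lines do work, but .htaccess is read and parsed on every request, so at that size it can be worth moving the mapping into the main server or virtual-host config instead. A sketch, assuming Apache, access to the virtual-host config (RewriteMap is not allowed inside .htaccess) and a hypothetical map file path:

# In the virtual-host or server config, not in .htaccess
RewriteEngine On

# redirect-map.txt holds one whitespace-separated "old-path new-path" pair per line, e.g.
#   /page.html          /folder/page.html
#   /another-page.html  /other-folder/another-page.html
RewriteMap oldtonew txt:/etc/apache2/redirect-map.txt

# If the requested path is in the map, send a 301 to the mapped location
RewriteCond ${oldtonew:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${oldtonew:%{REQUEST_URI}} [R=301,L]

For very large maps the text file can also be converted with httxt2dbm to a dbm: map, which is faster to look up than a plain txt: file.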
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to stay on the http domain, because we didn't want to change too much, too quickly by moving to https. Only our shopping cart is using the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks,
On-Page Optimization | Aquatell
How long should I leave an existing web page up after a 301 redirect?
I've been reading through a few blog posts here on Moz and can't seem to find the answer to these two questions: How long should I leave an existing page up after a 301 redirect? The old page is no longer needed but has pretty high page authority. If I take the old page down (the one that I'm redirecting from) immediately after I set up the 301 redirect, will link juice still be passed to the new page? My second question is: right now, on my index.html page I have both a 301 redirect and a rel canonical tag in the head. They were both put in place to redirect and pass link equity respectively. I did this a couple of years back after someone recommended that I do both just to be safe, but from what I've gathered reading the articles here on Moz, you're supposed to pick one or the other depending on whether or not it's permanent. Should I remove the rel canonical tag, or would it be better to just leave it be?
On-Page Optimization | ScottMcPherson
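For the second part of the question above, a common single-mechanism setup is to let one server-side rule handle the index.html-to-root redirect. A sketch in .htaccess, assuming Apache:

RewriteEngine On

# Only redirect when the visitor explicitly requested /index.html;
# checking THE_REQUEST avoids a loop when Apache serves index.html internally for "/"
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+index\.html [NC]
RewriteRule ^index\.html$ / [R=301,L]

With a redirect like this in place, /index.html no longer serves any content of its own, so a rel canonical tag on that page is not doing much work beyond self-referencing the root.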
How much juice do you lose in a 301 redirect?
Our site has a number of, shall we say, unoptimized URLs. I would like to change the URLs to be more relevant; if a page is about red widgets, the URL should be www.domain.com/red-widgets.html, right? I'm getting resistance on this, however, based on the belief that you lose something significant when you 301 an old URL to a new one. Now, I know that if you have a long chain of redirects, the spiders will stop following at some point, and that is a huge problem. That wouldn't apply if there's only one step in the chain, however. I've also heard that you lose some link juice in a 301, but I'm unsure how serious that problem actually is. Is it small enough that we'd win out in the long run with better-optimized URLs?
On-Page Optimization | CMC-SD
Missing meta descriptions on indexed pages, portfolio, tags, author and archive pages. I am using SEO all in one, any advice?
I am having a few problems that I can't seem to work out. I am fairly new to this, so any help would be greatly appreciated 🙂
1. I am missing a lot of meta description tags. I have installed "All in One SEO", but there seems to be no option to add meta descriptions in portfolio posts. I have also written meta descriptions for tags, and whilst I can see them in WP they don't seem to be activated.
2. The blog has pages indexed by WP called Part 2 (/page/2), Part 3 (/page/3) etc. How do I solve this issue of meta descriptions and indexed pages?
3. There is also a page for myself, the author, that has multiple indexes for all the blog posts I have written, and I can't edit these archives to add meta descriptions. This also applies to the month archives for the blog.
4. Also, SEOmoz tells me that I have too many links on my blog page (also indexed) and its tags. This also applies to the author pages (myself). How do I fix this?
Thanks for your help 🙂
Regards
Nadia
On-Page Optimization | PHDAustralia68
301 redirect and then keywords in URL
Hi, Matt Cutts says that 301 redirects, including the ones on internal pages, cause the loss of a little bit of link juice. But I also know that keywords in the URL are very important. On our site, we've got unoptimized URLs (few keywords) on the internal pages. Is it worth doing a 301 redirect in order to optimize the URLs for each main page? 301 redirects are the only way we can do it on our pre-made cart. For example (just an example), say one of the 4 main keywords for the page is "brown shoes". I'm wondering if I should redirect something like shoes.com/shoecolors.html to shoes.com/brown-shoes.html. In other words, with the loss of juice, would we come out ahead? In what instances would we come out ahead?
On-Page Optimization | BobGW
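Using the (hypothetical) example from the question, each renamed URL becomes one line in .htaccess, in the same Redirect format shown in the answer near the top of this thread:

Redirect 301 /shoecolors.html http://shoes.com/brown-shoes.html

Updating the cart's internal links to the new URLs at the same time means most visitors and crawlers reach the new address directly and never pass through the redirect at all.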
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmasters to tell Google to ignore them
Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie