Htaccess file help
-
Hi, thanks for looking
I am trying (and failing) to write an .htaccess file for the following scenario:
http://www.gardening-services-edinburgh.com/index.html
http://www.gardening-services-edinburgh.com
http://www.gardening-services-edinburgh.com/
so that all of these URLs go to the one resource
any ideas?
thanks
andy
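For reference, a minimal .htaccess sketch for this scenario (a sketch only, assuming Apache with mod_rewrite enabled; host names are taken from the question):

```apache
RewriteEngine On
# 301 the /index.html variant to the bare root
RewriteRule ^index\.html$ / [R=301,L]
# 301 the non-www host to the www host (assumes www is the canonical form)
RewriteCond %{HTTP_HOST} ^gardening-services-edinburgh\.com$ [NC]
RewriteRule ^(.*)$ http://www.gardening-services-edinburgh.com/$1 [R=301,L]
```

Note that http://www.gardening-services-edinburgh.com and http://www.gardening-services-edinburgh.com/ are already the same request at the HTTP level (an empty path is sent as /), so only the /index.html variant needs a rule.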
-
When I try the URL mobile.gardening-services-edinburgh.com/index.html, it seems to be redirected to http://mobile.gardening-services-edinburgh.com/Mobile/, which is generating a 302 to the lost page.
Could you try taking out the last two lines (the ones adding the trailing slash)? They don't seem to work, and I fear they are generating this issue.
Like most people on this forum I am not really an expert in regex. You could always try putting your question on a forum like Stack Overflow, which is much more technically oriented.
Dirk
-
Hi, tried that.
Here's the code for the .htaccess file.
The problem is that when you go to mobile.gardening-services-edinburgh.com/index.html it comes up with a missing webpage.
However, when you go to mobile.gardening-services-edinburgh.com without the /index.html it comes up with the correct webpage.
Any ideas?
Thanks
#######################################################################
# This .htaccess was created by the STRATO web server manager
#######################################################################
<IfModule mod_expires.c>
ExpiresActive On
# Expires after 1 day
ExpiresDefault A86400
ExpiresByType image/gif A86400
ExpiresByType image/png A86400
ExpiresByType image/jpg A86400
ExpiresByType image/x-icon A86400
ExpiresByType application/pdf A86400
ExpiresByType application/x-javascript A86400
# Expires after 1 day
ExpiresByType text/plain A86400
ExpiresByType text/css A86400
</IfModule>
mod_gzip_on Yes
<IfModule mod_deflate.c>
<FilesMatch "\.(js|css|html|jpg|png|php)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^gardening-services-edinburgh.com [NC]
RewriteRule ^(.*)$ http://www.gardening-services-edinburgh.com/$1 [R=301,L]
# Removes the index
RewriteRule ^index.html$ / [R=301,L]
RewriteRule ^(.*)/index.html$ /$1/ [R=301,L]
# Adds a trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
ErrorDocument 401 http://www.gardening-services-edinburgh.com/Error-Lost-Page.html
ErrorDocument 403 http://www.gardening-services-edinburgh.com/Error-Lost-Page.html
ErrorDocument 404 http://www.gardening-services-edinburgh.com/Error-Lost-Page.html
ErrorDocument 500 http://www.gardening-services-edinburgh.com/Error-Lost-Page.html
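If Dirk's suspicion about the last two lines is right, one hedged sketch of a fix is to make the trailing-slash block skip anything that looks like a filename, so a request for /index.html can never be rewritten to /index.html/:

```apache
# Sketch only: append a slash exclusively to paths without a dot,
# so real filenames (index.html, style.css, ...) are never touched.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^.]+[^/])$ /$1/ [L,R=301]
```

The pattern ^([^.]+[^/])$ matches /about (redirecting to /about/) but not /about/ or /index.html, because any path containing a dot before its final character fails the [^.]+ part.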
-
Hi,
Normally these rules should do the trick:

RewriteEngine On
# Removes the index (source: http://dense13.com/blog/2012/10/29/removing-index-html-with-mod_rewrite-in-htaccess/)
RewriteRule ^index.html$ / [R=301,L]
RewriteRule ^(.*)/index.html$ /$1/ [R=301,L]
# Adds a trailing slash (source: http://stackoverflow.com/questions/21417263/htaccess-add-remove-trailing-slash-from-url)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]

Dirk
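If the same .htaccess also serves the mobile subdomain, another hedged option is to scope the index-stripping rules to the www host so the subdomain is unaffected (a sketch; host names are taken from the thread, and note that a RewriteCond applies only to the single RewriteRule that follows it):

```apache
# Only strip index.html on the www host (sketch; repeat the
# condition because RewriteCond binds to one rule at a time)
RewriteCond %{HTTP_HOST} ^www\.gardening-services-edinburgh\.com$ [NC]
RewriteRule ^index\.html$ / [R=301,L]
RewriteCond %{HTTP_HOST} ^www\.gardening-services-edinburgh\.com$ [NC]
RewriteRule ^(.*)/index\.html$ /$1/ [R=301,L]
```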
Related Questions
-
Large Domain Authority Drop - Please Help
Hey there all, we are having the toughest time trying to figure out why our domain authority went from 12 to 3, with a search visibility score of literally zero. Back in Feb, when the D.A.s were all updated, we went down while all our competitors went up. We've been stuck at 3 for a few months, we can't understand why, and we aren't sure if we are dealing with a penalty of some sort. www.skycraftstudios.com is our site. We have a total of 60-some-odd links in Search Console (some are garbage that we have disavowed, others are quality) but none of them are getting picked up in the newer Moz index. We have added a few quality links lately, sped up our site quite a bit in conjunction with standard best-practice optimizations, and even added an SSL cert, yet we're stuck at a terrible D.A. of 3 and aren't even able to get into the top 10 pages for our main targeted term, which seems incredibly odd to us. The site has been up for almost 2 years. Could this simply be a matter of not enough quality inbound links in the index? Any insight here would be appreciated.
Technical SEO | | SkycraftNate1 -
Help with onpage keyword optimization, site architecture, and how those aspects affect the SERPs.
Hey guys, I've made a post or two before, but my story is that I've been learning SEO for a while now and have only recently (in the last four months) had the opportunity to actually apply what I've been reading about. What I've learned while trying to put these things into practice is that it can be pretty tough sledding, even when it comes to basic elements like keywords and search results.

Anyway, to the good stuff. I've been helping my brother's startup company in my spare time because I want them to do well. They're on the last legs of their series A funding and have no money to put towards SEO, content marketing or social, so I'm helping when and where I can for free. The company is Maluuba, a Siri-like personal assistant app for Android with a ton of different domains. They launched at TechCrunch Disrupt and actually have a lot of traction and a fair amount of publicity, so I'm not exactly working with scraps, but I don't work with them in their offices and only really communicate with my brother, who is having a really hard time getting buy-in for some of the stuff I want them to do.

Their initial website was pretty terrible, so my brother got the okay to redesign the site and together, we worked with a designer to implement the site I linked to. Because they have so many domains (search, social, organization), I thought creating specific pages along with one homepage would be a good way to optimize for different things and funnel a wider audience to convert on the one macro goal of the site: getting people to download the app.

The results haven't been exactly what I expected, and I fear I didn't correctly implement what I still think is a good plan. I've only tried to optimize the pages for a few keywords to start. The main keyword for the homepage, and indeed the brand, is 'personal assistant app', which is a fairly competitive keyword that I now have them ranking second for on Google CA. I used 'siri alternative' as a secondary keyword, since that's how they label themselves in the Play Store. For the three other main pages (search, social, organization), I used 'personal assistant app' as a secondary keyword and tried to optimize each page for 'search app', 'social app' and 'organizer app', respectively.

While I'm really quite proud that I managed to get a page ranking in the top three for our main keyword, I'm just as disappointed that it's the search page and not the homepage, mainly because I have no idea why it's happening. So, all of that to ask a few questions: Did I make a mistake by trying to add funnels to the site? Or did I just go about optimizing the pages incorrectly? Why does the search page rank really, really well for 'personal assistant app' while the other pages, including the one I intended to rank the highest for that term, lag behind? I'd guess that Google is indexing this page alone as the main representative of 'personal assistant app', but that wasn't my intention. I'm also not using any rel=canonical tags, if that matters. Also, this page has been flipping around in the 1-3 range in the SERPs for about a month, but I still haven't noticed any traffic from 'personal assistant app'.

Alright, this is getting way too long. I'd very much appreciate any and all insights as to what I'm doing wrong or what I'm missing. It could be really obvious and thus make this post silly, but I really have read and tried to learn a lot. I just can't see what's going on here because I don't have any experience to compare it to. Thanks in advance for any help. Cheers, JD
Technical SEO | | JDMcNamara1 -
.htaccess: Multiple URLs catches filename
Hi, I have the following line in my .htaccess:

RewriteRule privacy stdpage.php?slug=privacy [L]

So if you go to www.mysite.com/privacy it takes the stdpage.php with the argument above. But if you go to www.mysite.com/privacysssssssss it catches the same file. How can I prevent this? It will give me multiple URLs with the exact same content. I have a 404 page which I would like to show instead when the match is not 100%. -Rasmus
Technical SEO | | rasmusbang0 -
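The prefix match happens because an unanchored RewriteRule pattern matches anywhere in the path. Anchoring the pattern restricts the rule to the exact URL (a sketch, keeping the same target):

```apache
# ^ and $ anchor the match, so /privacysssssssss no longer matches
# and falls through to the normal 404 handling.
RewriteRule ^privacy/?$ stdpage.php?slug=privacy [L]
```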
Getting a link removed from brand search - please help!
Hello all you Mozzers! I've just come into work with an established company who have one major problem: when you Google "palicomp", the second link that comes up is to consumeractiongroup, with a thread that has been damaging the business for over 2 years. This thread is absolutely not representative of the business today. Strangely, stronger links in search have better authority, yet Google has ranked this post as highly relevant to the business. Does anybody know of any strategies we can use to get this removed? We have contacted consumeractiongroup directly but they are not prepared to move it. Does anyone have any removal ideas, or know what else we can do? It's crippling our business, and we can't work out why it's ranking so well! Chris
Technical SEO | | palicomp0 -
Help needed please with 301 redirects in htaccess file.
In summary, we're currently having issues with our .htaccess file. 301 redirects are going through to the new URL as described, but the new URL is followed by a ? and the old URL. How can we get rid of the ? and the previous URL so they don't appear as an ending? None of the examples we've found online about this issue appear to work. Can anyone please offer some advice? Can we use a RewriteRule to stop this happening? Here's a summary of the .htaccess file:

# REDIRECT CODE BEGINS HERE
# (long list of redirects, which appear to be set up perfectly fine)
# REDIRECT CODE ENDS

DirectoryIndex index.php
<IfModule mod_rewrite.c>
RewriteEngine On
Options +FollowSymLinks
RewriteCond $1 !^(images|system|themes|pdf|favicon.ico|robots.txt|index.php) [NC]
RewriteRule ^.htaccess$ - [F]
RewriteRule ^favicon.ico - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ /index.php?/$1 [L]
</IfModule>

Technical SEO | | petersommertravels0 -
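The trailing ?old-url usually appears when mod_alias Redirect lines are mixed with a mod_rewrite catch-all: the request is internally rewritten to /index.php?/old-page, and the Redirect then carries that query string along to the new URL. One common sketch of a fix is to express the 301s as RewriteRules placed before the catch-all, so only one module handles the request (the paths here are placeholders):

```apache
RewriteEngine On
# Placeholder example: put the 301s before the index.php catch-all,
# using mod_rewrite instead of mod_alias Redirect lines
RewriteRule ^old-page$ /new-page/ [R=301,L]
```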
Internal file extension canonicalization
OK, no doubt this is straightforward, however I'm finding it hard to find a simple answer: our websites' internal pages have the extension .html. Trying to navigate to an internal URL without the .html extension results in a 404. The question is: should a 301 be used to redirect to the extension-less URL to future-proof? And should internal links point to the extension-less URL for the same reason? Hopefully that makes sense, and apologies for what I believe is a straightforward answer.
Technical SEO | | jg1000 -
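A 301 is the right redirect for this. A common sketch is an external 301 from the .html form to the extension-less form, plus an internal rewrite back (assumes the .html files exist on disk):

```apache
RewriteEngine On
# External 301: /page.html -> /page (matches against THE_REQUEST,
# the raw request line, so the internal rewrite below cannot loop)
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]
# Internal rewrite: /page -> page.html when that file exists
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```

Internal links should then point to the extension-less URLs so users never hit the redirect at all.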
Limits to 301 in htaccess?
I'm about to launch a redesign of my company's main website, and we've updated most of the URLs to be more user-friendly and SEO-optimized. I've just finished editing my spreadsheet and see that I'll need to implement 244 redirects. My question is: are there performance issues with loading your .htaccess file up with almost 250 301-redirect commands? I've heard a bloated .htaccess file can really slow down Apache; should I be approaching this a different way, maybe with PHP?
Technical SEO | | AdoptionHelp0
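For roughly 250 redirects, the per-request cost of .htaccess is usually tolerable, but the cleaner pattern is a RewriteMap, which collapses the whole list into one rule plus a lookup file. Note that RewriteMap must live in the server or virtual-host config, not in .htaccess (a sketch; the map file path is a placeholder):

```apache
# In the vhost config, not .htaccess. The map file contains one
# "old-path new-path" pair per line; lookups that miss return the
# default value NONE, so unmatched URLs are left alone.
RewriteEngine On
RewriteMap redirects txt:/etc/apache2/redirects.map
RewriteCond ${redirects:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]
```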