Adding Something to htaccess File
-
When I did a Google search for site:kisswedding.com (my website) I noticed that Google is indexing all of the https versions of my site.
First of all, I don't get it, because I don't have an SSL certificate. Then, last night I did what my host (Bluehost) told me to do and added the below to my .htaccess file.
Below rule because google is indexing https version of site - https://my.bluehost.com/cgi/help/758
RewriteEngine On
RewriteCond %{HTTP_HOST} ^kisswedding.com$ [OR]
RewriteCond %{HTTP_HOST} ^kisswedding.com$
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]
Tonight, when I did a Google search for site:kisswedding.com, all of those https pages were being redirected to my home page - not the actual page they're supposed to be redirecting to.
I went back to Bluehost and they said a 301 redirect shouldn't work because I don't have an SSL certificate. BUT, I figure since it's sorta working, I just need to add something to that .htaccess rule to make sure each page is redirected to the right page.
Someone in the Google Webmaster Tools forums told me to do the below, but I don't really get it:
"to 301 redirect from /~kisswedd/ to the proper root folder you can put this in the root folder .htaccess file as well:
Redirect 301 /~kisswedd/ http://www.kisswedding.com/"
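For reference, Apache's Redirect directive matches the URL by prefix and carries whatever follows that prefix over to the target, so a rule like the one quoted sends each old page to its equivalent new URL rather than dumping everything on the home page. A rough sketch of that behaviour (the example path in the comment is purely illustrative):

# Prefix match: anything under /~kisswedd/ keeps its remaining path on the new root,
# e.g. /~kisswedd/blog/post.html -> http://www.kisswedding.com/blog/post.html
Redirect 301 /~kisswedd/ http://www.kisswedding.com/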
Any help/advice would be HUGELY appreciated. I'm a bit at a loss.
-
Hi Susanna,
Just wanted to post a quick update on this question since I understand the problem was resolved with the suggestions that we made after looking at the content of your .htaccess file.
The major issue with the original redirect was that the $1 had been omitted from the RewriteRule, so the problem actually was not related to the secure protocol - just a coding error.
Example code for the secure to non-secure redirect you needed to implement:
# redirect away from https to http
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
Hope that's helpful for anyone who may be looking for help with the same issue in the future.
Sha
-
Hi Susanna,
If you have trouble getting someone to fix the .htaccess for you, feel free to private message the file and we'll be able to sort it out for you.
It is never advisable to treat rules in the .htaccess file in isolation, as the order in which they appear in the file will determine how things work, as Ryan explained. There are also other things that will influence whether the redirects function the way you want them to, such as rules added to the file or overwritten by standard installations such as WordPress, Joomla, etc.
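To illustrate the ordering point, here is a rough sketch (placeholder domain, abridged stock WordPress block, not the contents of any specific file) of why a secure-to-non-secure redirect generally belongs above the rules WordPress writes to the file, so it is evaluated first and the original path is preserved:

RewriteEngine On

# Site-specific redirect first, so it runs before the WordPress rules catch the request
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]

# BEGIN WordPress (abridged)
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress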
Hope your provider can get it sorted out for you. If not, just let us know and we'll be happy to help.
Sha
-
Thanks Ryan. I appreciate that. I had no idea that copy and pasting could cause so many problems. I'll try and see if I can speak with a supervisor.
Have a great night/day!
-
Hi Susanna.
I will share two items regarding your situation. First, the modifications to your .htaccess file involve a pattern-matching language called regex (regular expressions). The code you are copying and pasting is made up of rewrite rules built on regex patterns. Basically they say, "when a URL meets this condition, rewrite the URL as follows...". In your case, the original rule was not correctly written (a common occurrence), so you did not get the desired effect.
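A small annotated sketch of that condition-then-rewrite pattern (with a placeholder domain), just to show which piece does what:

# Condition: only apply the next rule when the request arrived on port 443 (https)
RewriteCond %{SERVER_PORT} ^443$
# Rule: ^(.*)$ captures the requested path, and $1 pastes that capture into the target,
# which is what keeps a visitor on the page they asked for instead of the home page
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]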
The changes need to be made in your .htaccess file. The .htaccess file controls all access to your website. A single character out of place can mean your entire site is unavailable, URLs are improperly redirected, or security holes are opened on your site. On the one hand, you can copy and paste code into your .htaccess file and it may work. On the other hand, damage can be done. It also makes a difference where you locate the code within the file. Varying the location alters the logic and can lead to different results.
Based on the above, my recommendation is that your host make the changes. If the tech you reach is unwilling to help, I would recommend asking to be assigned to another tech or a supervisor; most hosts are very helpful in this area. If the host still will not help, perhaps you can ask your web developer to make the change.
If you decide to make the change yourself, I recommend doing some online research into how the .htaccess file works. You do not want to fly blind in this file.
Suggested reading: http://net.tutsplus.com/tutorials/other/the-ultimate-guide-to-htaccess-files/
Related Questions
-
URL Crawl Reports providing drastic differences: Is there something wrong?
A bit at a loss here. I ran a URL crawl report at the end of January on a website ( https://www.welchforbes.com/ ). There were no major critical issues at the time. No updates were made on the website (that I'm aware of), but after running another crawl on March 14, the report was short about 90 pages on the site and suddenly had a ton of 403 errors. I ran a crawl again on March 15 to check if there was perhaps a discrepancy, and the report crawled even fewer pages and had completely different results again. Is there a reason the results are differing from report to report? Is there something about the reports that I'm not understanding, or is there a serious issue within the website that needs to be addressed? (Screenshots of the Jan. 28, March 14, and March 15 crawl results were attached.)
Reporting & Analytics | OliviaKantyka
-
Losing referrer data on http link that redirects to an https site when on an https site. Is this typical or is something else going on here?
I am trying to resolve a referral data issue. Our client noticed that their referrals from one of their sites to another had dropped to almost nothing after being their top referrer. The referring site, SiteA, which is an HTTPS site, held a link to SiteB, which is also an HTTPS site, so there should be no loss; however, the link to SiteB on SiteA used the HTTP protocol. When we changed the link to the HTTPS protocol, the referrals started flowing in. Is this typical? If the 301 redirect is properly in place for SiteB, why would we lose the referral data?
Reporting & Analytics | Velir
-
Best way to block spambots in htaccess
I would like to block Russian Federation, China and Ukraine spam, as well as semalt and buttonsforwebsite. I have come up with the following code, what do you think? For the countries:
# BLOCK COUNTRY DOMAINS
RewriteCond %{HTTP_REFERER} .(ru|cn|ua)(/|$) [NC]
RewriteRule .* - [F]
And for buttons-for-website.com and semalt-semalt.com:
# BLOCK REFERERS
RewriteCond %{HTTP_REFERER} (semalt|buttons) [NC]
RewriteRule .* - [F]
or should it be:
# BLOCK USER AGENTS
RewriteCond %{HTTP_USER_AGENT} (semalt|buttons) [NC]
RewriteRule .* - [F]
Could I add (semalt|buttons|o-o-6-o-o|bestwebsitesawards|humanorightswatch) or is that too many?
Reporting & Analytics | ijb
-
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings MOZ Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851 pages. No new pages had been added to the domain in the previous month. The number of pages blocked by robots increased at that time from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still increased to 851. The following changes occurred between June 5th and June 15th:
- A new redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The design platform was and is WordPress.
- Google GTM code was added to the site.
- An exception was made by our hosting company to ModSecurity on our server (for i-frames) to allow GTM to function.
In the last ten days my web traffic has declined about 15%, however the quality of traffic has declined enormously and the number of new inquiries we get is off by around 65%. Click-through rates have declined from about 2.55 pages to about 2 pages. Obviously this is not a good situation. My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They are noticing an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time. Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Anyone have experience with GTM causing issues with this? Thanks everyone!!!
Alan
Reporting & Analytics | Kingalan1
-
Robots.txt file issue.
Hi, it's my third thread here and I have created many like it on many webmaster communities. I know many pros are here, so I badly need help. Robots.txt blocked 2k important URLs of my blogging site http://Muslim-academy.com/, especially of my blog area, which were bringing a good number of visitors daily. My organic traffic declined from 1k daily to 350. I have removed the robots.txt file, resubmitted the existing sitemap, and used all Fetch to index options and the 50 URL submission option in Bing Webmaster Tools. What can I do now to have these blocked URLs back in Google's index? 1. Create a NEW sitemap and submit it again in Google Webmaster and Bing Webmaster Tools? 2. Bookmark, link build, or share the URLs. I did a lot of bookmarking for the blocked URLs. I fetched the list of blocked URLs using BING WEBMASTER TOOLS.
Reporting & Analytics | csfarnsworth
-
What is the SEO Impact of Adding a Directory to URL
I would like to add a new directory named "products" for all of the product detail pages on my site. Instead of having the URL for a product be "mysite.com/product-details-page.aspx", we would like to change it to "mysite.com/products/product-detail-page.aspx". I want to do this to enable us to add product pages to our traffic funnel analysis by filtering visits to the "products" directory - right now we can't track visits to product pages in the funnel because they are just one-off the main site. I know this change will require redirects for every single product. Is there anything else that needs to be done? My main question is, will this change negatively impact the SEO value of the product pages? We have several product pages ranking in the SERPs, and I don't know if pushing them one directory further will change that. Thanks for your input!
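As a side note for anyone planning the same move: assuming the product URLs all follow one pattern and map one-to-one into the new directory, the per-product redirects can usually be collapsed into a single pattern rule in .htaccess rather than one line per product. A hedged sketch with a hypothetical domain and pattern - adjust it to the real URL structure and make sure it only matches product pages:

# Send any root-level .aspx product page to the same filename under /products/
RedirectMatch 301 ^/([a-zA-Z0-9-]+\.aspx)$ http://mysite.com/products/$1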
Reporting & Analytics | pbhatt
-
Something strange going on with new client's site...
Please forgive my stupidity if there is something obvious here which I have missed (I keep assuming that must be the case), but any advice on this would be much appreciated. We've just acquired a new client. Despite having a site for plenty of time now they did not previously have analytics with their last company (I know, a crime!). They've been with us for about a month now and we've managed to get them some great rankings already. To be fair, the rankings weren't bad before us either. Anyway. They have multiple position one rankings for well searched terms both locally and nationally. One would assume therefore that a lot of their traffic would come from Google right? Not according to their analytics. In fact, very little of it does... instead, 70% of their average 3,000 visits per month comes from just one referring site. A framed version of their site which is through reachlocal, which itself doesn't rank for any of their terms. I don't get it... The URL of the site is: www.namgrass.co.uk (ignore there being a .com too, that's a portal as they cover other countries). The referring site causing me all this confusion is: http://namgrass.rtrk.co.uk/ (see source code at the bottom for the reachlocal thing). Now I know reach local certainly isn't sending them all that traffic, so why does GA say it is... and what is this reachlocal thing anyway?? I mean, I know what reachlocal is, but what gives here with regards to it? Any ideas, please??
Reporting & Analytics | SteveOllington