Adding Something to the .htaccess File
-
When I did a Google search for site:kisswedding.com (my website), I noticed that Google is indexing all of the https versions of my site.
First of all, I don't get it because I don't have an SSL certificate. Then, last night I did what my host (Bluehost) told me to do: I added the below to my .htaccess file.
# Below rule because Google is indexing https version of site - https://my.bluehost.com/cgi/help/758
RewriteEngine On
RewriteCond %{HTTP_HOST} ^kisswedding.com$ [OR]
RewriteCond %{HTTP_HOST} ^kisswedding.com$
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]
Tonight, when I did a Google search for site:kisswedding.com, all of those https pages were being redirected to my home page - not the actual pages they're supposed to redirect to.
I went back to Bluehost and they said a 301 redirect shouldn't work because I don't have an SSL certificate. BUT, I figure since it's sorta working, I just need to add something to that .htaccess rule to make sure it redirects to the right page.
Someone in the Google Webmaster Tools forums told me to do the below, but I don't really get it:
_"to 301 redirect from /~kisswedd/ to the proper root folder you can put this in the root folder .htaccess file as well:_Redirect 301 /~kisswedd/ http://www.kisswedding.com/"
Any help/advice would be HUGELY appreciated. I'm a bit at a loss.
-
Hi Susanna,
Just wanted to post a quick update on this question, since I understand the problem was resolved with the suggestions we made after looking at the content of your .htaccess file.
The major issue with the original redirect was that the $1 backreference had been omitted from the RewriteRule target, so every matched URL was sent to the bare domain (the home page) instead of to the equivalent page. So the problem actually was not related to the secure protocol at all - it was just a coding error.
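To show the coding error concretely, here is the rule as it stood, next to the corrected form (the only change is the $1 backreference, which carries the requested path over to the target):

# broken: every matched URL is redirected to the bare domain (the home page)
RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]
# fixed: the captured path is re-attached via the $1 backreference
RewriteRule ^(.*)$ http://www.kisswedding.com/$1 [R=301,L]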
Example code for the secure to non-secure redirect you needed to implement:
# redirect away from https to http
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
Hope that's helpful for anyone who may be looking for help with the same issue in the future.
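A note for anyone adapting this snippet: on some setups (behind a proxy or load balancer, for instance) the port test may not behave as expected, and checking the HTTPS server variable is a common alternative. A minimal sketch under that assumption, again with www.yourdomain.com as a placeholder:

# same redirect, keyed off the HTTPS variable instead of the port
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]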
Sha
-
Hi Susanna,
If you have trouble getting someone to fix the .htaccess for you, feel free to private message the file and we'll be able to sort it out for you.
It is never advisable to treat rules in the .htaccess file in isolation, because the order in which they appear in the file determines how things work, as Ryan explained. There are also other things that will influence whether the redirects function the way you want them to, such as rules added to the file, or overwritten, by standard installations such as WordPress, Joomla, etc.
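To illustrate the ordering point with the WordPress case: a custom redirect generally needs to sit above the standard WordPress rules so it runs before their catch-all sends every request to index.php. A minimal sketch (the redirect and the domain are placeholders; the WordPress block is the stock one):

# custom redirect first, so it applies before the catch-all below
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress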
Hope your provider can get it sorted out for you. If not, just let us know and we'll be happy to help.
Sha
-
Thanks Ryan. I appreciate that. I had no idea that copying and pasting could cause so many problems. I'll try to see if I can speak with a supervisor.
Have a great night/day!
-
Hi Susanna.
I will share two items regarding your situation. First, the modifications to your .htaccess file are written with a pattern-matching language called regular expressions (regex), used by Apache's rewrite module. The code you are copying and pasting is built from regex expressions. Basically they say "when a URL meets this condition, rewrite the URL as follows...". In your case, the original regex expression was not correctly written (a common occurrence), so you did not receive the desired effect.
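To make that concrete, here is a minimal annotated sketch of the kind of rule in question (www.example.com is a placeholder):

# condition: the request arrived on port 443, i.e. over https
RewriteCond %{SERVER_PORT} ^443$
# action: capture the whole requested path with ^(.*)$ and resend the
# visitor to the http URL, re-attaching the captured path via $1
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]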
Second, the changes need to be made in your .htaccess file. The .htaccess file controls access to your website, so a single character out of place can mean your entire site is unavailable, URLs are improperly redirected, or security holes are opened on your site. You can copy and paste code into your .htaccess file and it may work; on the other hand, damage can be done. It also makes a difference where you locate the code within the file: varying the location alters the logic and can lead to different results.
Based on the above, my recommendation is that your host make the changes. If your host is unwilling to help, I would recommend asking to be assigned to another tech, or to a supervisor. Most hosts are very helpful in this area. If the host is not willing to help, perhaps you can ask your web developer to make the change.
If you decide to make the change yourself, I recommend doing some online research into how the .htaccess file works first. You do not want to fly blind in this file.
Suggested reading: http://net.tutsplus.com/tutorials/other/the-ultimate-guide-to-htaccess-files/
Related Questions
-
Unsolved URL Crawl Reports providing drastic differences: Is there something wrong?
A bit at a loss here. I ran a URL crawl report at the end of January on a website (https://www.welchforbes.com/). There were no major critical issues at the time. No updates were made to the website (that I'm aware of), but after running another crawl on March 14, the report was short about 90 pages and suddenly had a ton of 403 errors. I ran a crawl again on March 15 to check if there was perhaps a discrepancy, and that report crawled even fewer pages and again had completely different results. Is there a reason the results differ from report to report? Is there something about the reports that I'm not understanding, or is there a serious issue within the website that needs to be addressed? [screenshots attached: Jan. 28, March 14, and March 15 results]
Reporting & Analytics | OliviaKantyka
-
Transitioning to HTTPS: Do I have to submit another disavow file?
Hi Mozzers! So we're finally ready to move from http to https, and Penguin is finally showing us some love due to the recent algorithm updates. I just added the https version to Google Search Console, and before 301 redirecting the whole site to a secure environment... do I upload the same disavow file to the https version? Moreover, is it best to have both the http and https versions in Google Search Console? Or should I add the disavow file to the https version and delete the http version in a month? And what about Bing? Help.
Reporting & Analytics | Shawn124
-
How to FILTER an ad campaign from LinkedIn in Google Analytics?
Hi Mozzers, we are setting up a LinkedIn ad campaign for our agency and want to track its traffic and conversions. The LinkedIn ad will carry UTMs for each link. For tracking this campaign accurately, I thought about creating a new GA view with a specific filter. So my question is about the filtering: should I use INCLUDE, REFERRAL with the pattern LINKEDIN.COM (see image)? If not, what would be the best way to track this campaign? My other concern is that we are also running another job ad on LinkedIn, and I feel those hits will be tracked as well. Is there a way to separate the 2 campaigns? Thanks guys! [image attached]
Reporting & Analytics | Ideas-Money-Art
-
Best way to block spambots in .htaccess
I would like to block Russian Federation, China, and Ukraine spam, as well as semalt and buttonsforwebsite. I have come up with the following code; what do you think? For the countries:
# BLOCK COUNTRY DOMAINS
RewriteCond %{HTTP_REFERER} \.(ru|cn|ua)(/|$) [NC]
RewriteRule .* - [F]
And for buttons-for-website.com and semalt-semalt.com:
# BLOCK REFERERS
RewriteCond %{HTTP_REFERER} (semalt|buttons) [NC]
RewriteRule .* - [F]
or should it be:
# BLOCK USER AGENTS
RewriteCond %{HTTP_USER_AGENT} (semalt|buttons) [NC]
RewriteRule .* - [F]
Could I add (semalt|buttons|o-o-6-o-o|bestwebsitesawards|humanorightswatch) or is that too many?
Reporting & Analytics | ijb
-
Google Analytics - Adding a sub-domain
Hi, I have a Google Analytics query. I have a main site with a Google Analytics tag, and I have 2 forms that sit on a subdomain with a different GA code. As I would like to measure end-to-end tracking, I would like the same GA code on the subdomain. What is the best way for me to implement this? Would I need to make some changes to the GA code that sits on the main site, or can I add the GA code from the main site onto the subdomain? Thanks
Reporting & Analytics | Niki_1
-
Longevity of robots.txt files on Google rankings
This may be a difficult question to answer without a ton more information, but I'm curious if there's any general thinking that could shed some light on the following scenario I've recently heard about, so that I can offer some sound advice: An extremely reputable non-profit site with excellent rankings went through a redesign and changeover to WordPress. A crawler-blocking robots.txt file was used during development on the dev server. Two months later, it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed and the new site (same content, same nav) had gone live with it in place. It was removed and a site index forced. How long might it take for the site to reappear and regain past standing in the SERPs if rankings have been damaged? What would the expected recovery time be?
Reporting & Analytics | gfiedel
-
Robots.txt file issue.
Hi, it's my third thread here, and I have created many like it on other webmaster communities. I know many pros are here, so I badly need help. Robots.txt blocked 2k important URLs of my blogging site http://Muslim-academy.com/, especially in my blog area, which brings a good number of visitors daily. My organic traffic declined from 1k daily to 350. I have removed the robots.txt file, resubmitted the existing sitemap, and used all the Fetch-to-index options and the 50-URL submission option in Bing Webmaster Tools. What can I do now to have these blocked URLs back in Google's index? 1. Create a NEW sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools? 2. Bookmark, link-build, or share the URLs? I did a lot of bookmarking for the blocked URLs. I fetched the list of blocked URLs using Bing Webmaster Tools.
Reporting & Analytics | csfarnsworth
-
How to get a list of URLs blocked by robots.txt
This is my site: http://muslim-academy.com/. It's in WordPress. I just want to know whether there is any way I can get the list of URLs blocked by robots.txt. Google Webmaster Tools is not showing the list, just the number of blocked URLs. Is there any plugin or software to extract the list of blocked URLs?
Reporting & Analytics | csfarnsworth