Moving to HTTPS with a bunch of redirects my programmer can't handle
-
Hi Mozzers,
I referred a client of mine (mentioned in my last post) to a programmer who could transition their site from HTTP to HTTPS. They run a WordPress site and currently use the EPS Redirects plugin to 301-redirect about 400 pages. The way EPS Redirects is set up (as shown in the attachment) is simple:
On the left side you enter your old URL, and on the right side you enter the new URL it should 301 to. But here's the issue: since my client made the transition to HTTPS, the whole WordPress backend is set up that way as well. This means that if my client finds another old HTTP URL he wants to redirect, the plugin only allows him to redirect HTTPS to HTTPS.
As of now, all the old HTTP-to-HTTPS redirects STILL work, even though the left side of the plugin switched all URLs to HTTPS by default. But my client is worried that with the next plugin update he will lose all of the HTTP-to-HTTPS redirects. When we asked our programmer to add all 400 redirects to .htaccess, he said that's too many redirects and could slow down the website. We don't want to lose all 400 301s and jeopardize our SEO.
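For context, the .htaccess approach wouldn't actually need 400 separate entries: a single pattern-based rule can force HTTPS for every request, leaving the 400 path-level redirects in the plugin untouched. A minimal sketch, assuming Apache with mod_rewrite enabled (in WordPress it would sit above the standard # BEGIN WordPress block):

```apache
# Force HTTPS for every request with one pattern rule,
# instead of listing all 400 old URLs individually.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With this in place, any old HTTP URL is first bounced to its HTTPS equivalent, and the plugin's HTTPS-to-HTTPS entries then handle the page-level mapping.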
Question: what does everyone suggest as an alternative solution or plugin to redirect old HTTP URLs to HTTPS, as well as future HTTPS-to-HTTPS URLs?
Thank you all!
-
Have you reached out to the plugin's support team?
-
Thank you for the reply, @sys_admin.
We are using Cloudflare as well and use its HTTPS redirection. I didn't mention Cloudflare earlier because I don't want to rely on a third party to handle my redirects.
-
Because your CMS and SSL setup aren't playing nicely together right now, as you explained in detail.
My suggestion: ditch the crap plugin. Get yourself Cloudflare SSL, set up your HTTPS and page rules properly, and ball like a pro. You can try Cloudflare for free just to play around with it and see what I mean; that will take care of all this. You can keep your redirects as HTTP > HTTP, and Cloudflare can handle sending all HTTP requests to HTTPS via page rules and whatnot.
Dig into it; I think this will help resolve your situation.
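If it helps, this is roughly how that rule reads in the Cloudflare dashboard (a sketch; yourdomain.com is a placeholder). A single page rule, or the zone-wide "Always Use HTTPS" setting, forwards every HTTP request to its HTTPS equivalent before it even reaches your server:

```
If the URL matches:    http://*yourdomain.com/*
Then the settings are: Always Use HTTPS
```

Because the scheme upgrade happens at Cloudflare's edge, the plugin's 400 path-level redirects can stay exactly as they are.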