De-indexing and SSL question
-
A few days ago, Google indexed hundreds of my directories by mistake (an error with plugins/host), and my traffic dropped as a consequence.
Anyway, I fixed that and submitted a URL removal request. Now I'm just waiting for things to go back to normal.
In the meantime, I was supposed to move my website to HTTPS this week.
Question:
Should I wait until this indexing error has been fixed, or can I go ahead with the SSL move anyway?
-
Let me know if you can share the domain with me.
-
PS: I agree it would be nice to hear from more people. I'm sure you will by tomorrow.
-
I respect your desire to get this right.
“A few days ago, Google indexed hundreds of my directories by mistake (an error with plugins/host), and my traffic dropped as a consequence. Anyway, I fixed that and submitted a URL removal request. Now I'm just waiting for things to go back to normal.”
In my opinion, you can go ahead and watch the change happen on the HTTPS URLs; you just want to be sure that HTTPS will not let your unwanted directories back into Google's index.
Unless you can show me your site, I think Better Search Replace will be the more comfortable and faster option.
Yes, I have used Better Search Replace many times; it's a great tool.
Make sure you back up your site before you do a search & replace.
https://wordpress.org/plugins/better-search-replace/
I think you should be able to see everything you need and more from this great how-to. The person who made it is a friend, and they don't just show you how to do this on Kinsta; they show you how to do it with Apache and Nginx as well.
It also covers managed WordPress hosting, which is worth thinking about if this isn't something you feel comfortable doing yourself.
https://kinsta.com/blog/http-to-https/
I would look at Pagely, Pantheon, Servebolt, and Kinsta.
If it helps, I am ranked in the top 10 or 11 for assisting people on Moz. Unless you blocked only the HTTP version of your directories, you will have the same results with HTTPS.
- Would you allow me to see the domain?
- Who is your hosting company?
Respectfully,
Tom Zickell
-
Hi Tom,
Yes, I have WordPress, and I have read a few guides online about it. For the search-and-replace part, I found a plugin called Better Search Replace: https://www.wpbeginner.com/wp-tutorials/how-to-add-ssl-and-https-in-wordpress/
The instructions in the Search Engine Journal article do the search and replace manually. The plugin should be better; have you tried it?
I am a bit worried. I have asked this question of many people in many forums, and nobody replies, which makes me wonder if it's something really bad. It would be nice to have additional opinions.
-
Good question. In all honesty, I think it's safe to go forward with an HTTPS migration.
A "URL removal request"
A simple move to https two different things that can be done at the same time.
if you have a certain set up please let me know I can give you better instructions on how to completely migrate over to HTTPS. Here are some of the basics below.
- https://www.searchenginejournal.com/https-migration-guide/195103/
- https://www.keycdn.com/blog/http-to-https
- https://gofishdigital.com/steps-in-website-https-migration/
If you're using WordPress:
- https://www.searchenginejournal.com/wordpress-http-to-https/236969/
- Photograph: https://imgur.com/q1RfhhH.jpg
Search & Replace in Files
To begin, search your site's files for instances of your domain pointing to HTTP URLs.
Use a regex that covers both the “www” and non-“www” cases, and search for:
`http:\/\/(www\.)?yourdomain\.com`
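If you have shell access, a quick way to run that search is grep — a minimal sketch, assuming a typical WordPress layout; `yourdomain.com` and the `wp-content/` path are placeholders to adjust for your site:

```bash
# Recursively list files and line numbers that still reference the HTTP version of the domain
# ("yourdomain.com" and "wp-content/" are placeholders for your actual domain and path)
grep -rnE 'http://(www\.)?yourdomain\.com' wp-content/
```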
Always search and replace your database, too, if you have one.
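If you're comfortable on the command line, WP-CLI's `wp search-replace` can handle the database pass, including serialized data — a sketch, assuming WP-CLI is installed and with placeholder domains:

```bash
# Preview the changes first; --dry-run reports what would change without writing anything
wp search-replace 'http://yourdomain.com' 'https://yourdomain.com' --dry-run
# Run it for real once the preview looks right
wp search-replace 'http://yourdomain.com' 'https://yourdomain.com'
```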
Remember to implement redirects.
NGINX
Add the following to your Nginx config:

```nginx
server {
    listen 80;
    server_name domain.com www.domain.com;
    return 301 https://domain.com$request_uri;
}
```
Apache
Add the following to your .htaccess file:

```apacheconf
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```
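Once either redirect is in place, you can verify it from the command line (assuming curl is available; the domain is a placeholder):

```bash
# Should return a 301 with a Location: https://... header
curl -I http://yourdomain.com/
```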
Update your robots.txt file
Update any hard-coded links or blocking rules in your robots.txt that might still be pointing to HTTP directories or files (see the sketch after this list).
- Update your robots.txt file
- Update your disavow file if you have one
- Update Google Search Console
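For the robots.txt step, here's a minimal before/after sketch; the directory and domain are placeholders:

```
# Before: rules and sitemap still referencing HTTP
# Sitemap: http://yourdomain.com/sitemap.xml

# After: everything points at the HTTPS URLs
User-agent: *
Disallow: /unwanted-directory/
Sitemap: https://yourdomain.com/sitemap.xml
```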
If you want a very quick and easy way to implement HTTPS, redirects, and certificates, I recommend Cloudflare. The free version should do it.
-
I hope this helps,
Tom