Correct Indexing problem
-
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I set up the redirects I failed to realize the new site had HTTPS enabled on all pages. I have noticed that Google is now indexing both the HTTP and HTTPS versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
-
Okay, I may have understood your original post differently than what you meant.
So the case is: you have HTTPS enabled, but Google is indexing both the HTTP and HTTPS pages, and you want it to index only the HTTP version. You are also running a cart or checkout that is HTTPS only, which has no value to Google, so I would recommend blocking those pages with robots.txt.
I would recommend coding an IF statement to deal with the duplicate indexing (HTTPS and HTTP) and setting up a robots.txt file to prevent crawling of pages that have no search value and are there for customer use only.
Something like this would work in PHP:
<?php
// The markup inside the original echo statements was stripped when this was posted.
// A robots meta tag that noindexes the HTTPS copy is assumed here as one way to handle the duplicate.
if ( isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on' ) { echo '<meta name="robots" content="noindex,follow">'."\n"; }
else { echo '<meta name="robots" content="index,follow">'."\n"; }
?>
I'm not sure of the equivalent code in ASP since I rarely use Windows servers, but you should be able to find that with a quick search.
Then set up your robots.txt to block all URLs that are specific to personal data, like this (example):
User-agent: *
Disallow: /catalog/account.php
Disallow: /catalog/account_edit.php
Disallow: /catalog/account_history.php
Disallow: /catalog/account_history_info.php
Disallow: /catalog/account_password.php
Disallow: /catalog/add_checkout_success.php
Disallow: /catalog/address_book.php
Disallow: /catalog/address_book_process.php
Disallow: /catalog/checkout_confirmation.php
Disallow: /catalog/checkout_payment.php
Disallow: /catalog/checkout_process.php
Disallow: /catalog/checkout_shipping.php
Disallow: /catalog/checkout_shipping_address.php
Disallow: /catalog/checkout_success.php
Disallow: /catalog/cookie_usage.php
Disallow: /catalog/create_account.php
I hope that helps.
Don
-
My site should be running HTTP on all pages except the checkout. Would this work the opposite way from what you have written, so that I can make a rule allowing HTTPS just for the checkout?
Thanks
jared
-
If your site is running on HTTPS only, then a simple edit to your .htaccess file will correctly redirect (301) any request for an HTTP page to the corresponding HTTPS page.
Sample Code:
# Make sure mod_rewrite is active for this directory
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
There are several ways to handle this, so you may also benefit from searching for ".htaccess 301 redirect http to https".
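Since your setup is the reverse (HTTP on everything except the checkout), a rough sketch of the opposite rules might look like the following. Note that the /checkout/ path is just an assumption here, so adjust it to match your cart's actual checkout URLs:
RewriteEngine On
# Force HTTPS on the checkout pages (the /checkout/ path is an assumption)
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_URI} ^/checkout/ [NC]
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
# Force HTTP everywhere else
RewriteCond %{HTTPS} =on
RewriteCond %{REQUEST_URI} !^/checkout/ [NC]
RewriteRule .* http://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
Whichever direction you go, test the redirects in a browser and with a crawler before relying on them, since rules like these make it easy to create redirect loops.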
Hope that helps.