HTTPS indexed... how?
-
Hello Moz,
For a while now I have been struggling with an SEO case:
At the moment, an HTTPS version of a client's homepage is indexed in Google. That's really strange, because that URL has been redirected to another URL for three weeks now, and we did everything we could to make clear to Google that it should index the other URL.
So we have a few homepage URLs:
A: https://www.website.nl
B: https://www.websites.nl/category
C: http://www.websites.nl/category
What we did:
- Redirected A to B with a 301; a redirect from A or B to C is difficult because of the security warning the SSL certificate would trigger.
- We put the correct canonical URL (version C) on every version of the homepage (A, B) (see the verification sketch after this list).
- We put only the canonical URL, version C, in the sitemap.xml and uploaded it to Google Webmaster Tools.
- We changed all important internal links to version C.
- We also got some valuable external backlinks to version C.
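For reference, here is a minimal sketch of how the first two steps could be verified, assuming Python with the requests library; A, B and C are the placeholder versions listed above:

```python
import re
import requests

A = "https://www.website.nl"
B = "https://www.websites.nl/category"
C = "http://www.websites.nl/category"

# Step 1: A should answer with a 301 Moved Permanently pointing at B.
resp = requests.get(A, allow_redirects=False)
print(A, "->", resp.status_code, resp.headers.get("Location"))

# Step 2: B should declare C as its canonical URL.
# Naive match; assumes rel comes before href inside the link tag.
html = requests.get(B).text
match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html)
print("canonical on B:", match.group(1) if match else "MISSING")
```

If the first request does not print a 301 with B as the Location, or the canonical printed is not C, then Google is seeing something different from what was intended.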
Is there something I missed, or forgot to do, to tell Google: "Hey, you've got the wrong URL indexed; you have to index version C"?
How is it possible that Google still prefers version A three weeks after all those changes?
I'm really looking forward to your answers.
Thanks a lot in advance!
Greetz
Djacko
-
Sorry for my late response.
We decided to move the whole website over to HTTPS; that was the best thing to do. The results are good, and Google is happy too, because we now use just one kind of URL across the whole website.
We prefer the short URL version, and that worked out well.
Greetings
Jacco
-
Hey Djacko: Has Google figured it out? Is the right version of your page indexed now?
-
Hi Rickus,
Thanks for your answer. I agree it's still a little confusing for Google which URL is the homepage URL. We also discovered there are plain links pointing to HTTPS URLs, so the internal link structure is not in good shape either.
We will wait a little longer for Google to react to the 301 redirect. If it still prefers the HTTPS version A, we were thinking of changing all URLs to HTTPS, making a new sitemap with the HTTPS URLs as canonicals, and redirecting all HTTP URLs to their HTTPS versions. I think Google more often prefers the HTTPS versions of websites; a rough sketch of such a sitemap follows below.
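As an illustration of that plan, here is a minimal sketch that builds a sitemap.xml containing only HTTPS canonical URLs, in Python using only the standard library; the URL list is a placeholder:

```python
import xml.etree.ElementTree as ET

# Hypothetical list of HTTPS canonical URLs for the migrated site.
canonical_urls = [
    "https://www.websites.nl/",
    "https://www.websites.nl/category",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in canonical_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```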
What's your opinion on that?
Thanks a lot!
Djacko
-
Hi there Djacko,
You redirected A to B but could not redirect A or B to C, so Google is still not too sure which one is the correct URL. And switching from HTTPS to HTTP is, to Google, much like telling it you have a new domain.
So don't stress too much; just make sure that http://www.homepage.nl and https://www.homepage.nl have the same work and SEO done on them, and give Google time to realise what has happened to your site (a small parity check is sketched below).
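A minimal sketch of such a parity check, assuming Python with the requests library; homepage.nl is the placeholder domain used above:

```python
import re
import requests

def seo_snapshot(url):
    """Grab the bits of a page that should match across HTTP and HTTPS."""
    html = requests.get(url).text
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.S)
    # Naive match; assumes rel comes before href inside the link tag.
    canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html)
    return {
        "title": title.group(1).strip() if title else None,
        "canonical": canonical.group(1) if canonical else None,
    }

print(seo_snapshot("http://www.homepage.nl"))
print(seo_snapshot("https://www.homepage.nl"))
```

If the two snapshots differ, that is one more mixed signal for Google to untangle.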
In my personal experience, a 301 redirect always takes longer to be crawled and updated, and in this case it's your homepage (domain).
Hope this helps, and good luck!
Related Questions
-
Only a fraction of the AMP pages are indexed
Back in June, we saw a sharp drop in traffic on our website. We initially assumed it was due to the Core Update that was rolled out in early June. We had switched from HTTP to HTTPS in May, but thought that should have helped rather than caused a problem. Until early June, the traffic was trending upwards. While investigating the issue, I noticed that only a fraction (25%) of the AMP pages have been indexed. The pages don't seem to be getting indexed even though they are valid. According to Google Analytics, too, the percentage of AMP traffic has dropped from 67-70% to 40-45%. I wonder if that is due to the indexing issue. In terms of implementation it seems fine: we are pointing the canonical to the AMP page from the desktop version and to the desktop version from the AMP page. Any tips on how to fix the AMP indexing issue? Should I be concerned that only a fraction of the AMP pages are indexed? I really hope you can help in resolving this issue.
Technical SEO | | Gautam1 -
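For the AMP question above: the standard pairing is a rel="amphtml" link on the desktop page and a rel="canonical" link on the AMP page pointing back to it. Here is a minimal sketch that checks one such pair, assuming Python with the requests library and a placeholder URL:

```python
import re
import requests

def find_link(html, rel):
    # Naive match; assumes rel comes before href inside the link tag.
    m = re.search(rf'<link[^>]+rel=["\']{rel}["\'][^>]+href=["\']([^"\']+)', html)
    return m.group(1) if m else None

desktop_url = "https://www.example.com/article"  # placeholder
amp_url = find_link(requests.get(desktop_url).text, "amphtml")
print("desktop -> amphtml:", amp_url)
if amp_url:
    back = find_link(requests.get(amp_url).text, "canonical")
    print("amp -> canonical:", back)  # should point back at the desktop URL
```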
How to index backlinks fast
Hi, for some months now I have been facing problems getting backlinks indexed. Please share a method for getting backlinks indexed in Google fast.
Technical SEO | | vijay231 -
Any idea why pages are not being indexed?
Hi everyone, one section of our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so I thought it was strange. Here is an example of one: https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html. If you take a chunk of its text, it is not found in Google. No issues in Bing/Yahoo; only Google. Do you think it takes a submission to Search Console? Jeff
Technical SEO | | vetofunk1 -
How to check if an individual page is indexed by Google?
So my understanding is that you can use site: [page URL without http] to check whether a page is indexed by Google, but is this 100% reliable? Just recently I've worked on a few pages that have not shown up when I've checked them using site:, but they do show up when using info: and also show their cached versions. The rest of the site and the pages above them (the URL I was checking was quite deep) are indexed just fine. What does this mean? Thank you. P.S. I do not have WMT or GA access for these sites.
Technical SEO | | linklander0 -
Removing a staging/dev area that's been indexed (since it wasn't hidden) from the index via GWT
Hi, if you set up a brand-new GWT account for the subdomain where the dev area is located (separate from the main GWT account for the main live site) and remove all pages via the removal tool (by leaving the page field blank), will this definitely not risk hurting/removing the main site (since the new subdomain-specific GWT account doesn't apply to the main site in any way)? I have a new client whose dev area has been indexed. The dev team has now prevented crawling of this subdomain, but the stable door was shut after the horse had already bolted, and the subdomain's pages are in Google's index, so we need to remove the entire subdomain development area ASAP. We are going to do this via the removal tool in a subdomain-specific new GWT account, but I just want to triple-check this won't accidentally get the main site removed too? Cheers, Dan
Technical SEO | | Dan-Lawrence0 -
My beta site (beta.website.com) has been inadvertently indexed. Should I "NO INDEX" the entire beta site?
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
Technical SEO | | BVREID0 -
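For the beta-site question above, one common approach is a site-wide X-Robots-Tag header on the beta host; a minimal sketch, assuming a Python/Flask app (the actual stack behind beta.website.com is unknown). Note that the host must stay crawlable, not blocked in robots.txt, for crawlers to see the header:

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Tell crawlers not to index or follow anything served from this beta host.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "beta content"
```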
How to Stop Google from Indexing Old Pages
We moved from a .php site to a Java site on April 10th. It's almost two months later, and Google continues to crawl old pages that no longer exist (225,430 Not Found errors, to be exact). These pages no longer exist on the site, and there are no internal or external links pointing to them. Google has crawled the site since the go-live but continues to try to crawl these pages. What are my next steps?
Technical SEO | | rhoadesjohn0 -
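For the question above, one option, if those URLs are gone for good, is to answer them with 410 Gone instead of 404, which Google treats as a somewhat stronger removal signal. A minimal sketch in Python/Flask (the actual site is Java, so this is purely illustrative):

```python
from flask import Flask, abort

app = Flask(__name__)

@app.route("/<path:page>")
def catch_all(page):
    if page.endswith(".php"):
        # Legacy .php URLs are permanently gone: send 410, not 404.
        abort(410)
    return "new site content"
```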
De-indexing millions of pages - would this work?
Hi all,
We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages, specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then directory-style remove all old SERP URLs in the GWT URL Removal Tool.
This would be an example of an old URL: www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL: www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions:
1. Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above?
2. What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some 50,000 useless "add to cart" URLs. Google itself says that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site".
And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose far too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s; by then we would be out of business.
Best regards,
TalkInThePark
Technical SEO | | TalkInThePark0
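As a sketch of the redirect mapping in step 1 of that plan, this is how the old CGI-style parameters might translate into the new search URL, in Python with only the standard library; the parameter names come from the examples above, and the mapping itself is guesswork:

```python
from urllib.parse import urlsplit, parse_qs, urlencode

def map_old_serp_url(old_url):
    """Translate an old CGI search URL into its /search equivalent."""
    params = parse_qs(urlsplit(old_url).query)
    new_params = {"q": params.get("word", [""])[0]}
    # 'what' and 'how' presumably encode facets; mapping them is hypothetical.
    return "http://www.site.com/search?" + urlencode(new_params)

old = "http://www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2"
print(map_old_serp_url(old))  # -> http://www.site.com/search?q=bmw
```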