HTTPS indexed... how?
-
Hello Moz,
For a while now I have been struggling with an SEO case:
At the moment, an https version of one of our clients' homepages is indexed in Google. That's really strange, because the URL has been redirected to another URL for three weeks now, and we did everything we could to make clear to Google that it should index the other URL.
So we have a few homepage URLs:
A: https://www.website.nl
B: https://www.websites.nl/category
C: http://www.websites.nl/category
What we did:
- Redirected A to B with a 301; a redirect from A or B to C is difficult because of a security issue with the SSL certificate.
- We put the right canonical URL (Version C) on every version of the homepage (A and B)
- We put only the canonical URL, Version C, in the sitemap.xml and uploaded it to Google Webmaster Tools
- We changed all important internal links to Version C
- We also got some valuable external backlinks pointing to Version C (a quick way to verify these steps is sketched below)
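A minimal sketch for verifying those steps from the outside, using Python's requests library. The URLs are the placeholder versions A, B, and C from above, and the canonical regex is deliberately simplistic; this is an illustration, not a definitive checker:

```python
import re
import requests

# Placeholder URLs standing in for versions A, B, and C above.
URLS = {
    "A": "https://www.website.nl",
    "B": "https://www.websites.nl/category",
    "C": "http://www.websites.nl/category",
}

# Simplistic pattern: assumes rel appears before href in the link tag.
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', re.I)

for label, url in URLS.items():
    # allow_redirects=False exposes the raw status code, so a 301
    # shows up instead of being followed silently.
    first_hop = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{label}: {url} -> {first_hop.status_code}",
          first_hop.headers.get("Location", ""))

    # Follow the redirect chain and read the canonical on the final page.
    final = requests.get(url, timeout=10)
    match = CANONICAL.search(final.text)
    print(f"   canonical: {match.group(1) if match else 'none found'}")
```

If A really returns a 301 to B and every version declares C as canonical, the signals are consistent and it becomes a matter of waiting for a recrawl.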
Is there something I missed, or forgot to do, to tell Google: "Hey, you've got the wrong URL indexed; you have to index Version C"?
How is it possible that Google still prefers Version A after all those changes were made three weeks ago?
I'm really looking forward to your answer.
Thanks a lot in advance!
Greetz
Djacko
-
Sorry for my late response.
We decided to move the whole website over to https; that was the best thing to do. The results are good, and Google is also happy because we now use just one kind of URL across the whole website.
We prefer the short URL version, and that has worked out well.
Greetings
Jacco
-
Hey Djacko: Has Google figured it out? Is the right version of your page indexed now?
-
Hi Rickus,
Thanks for your answer. I agree it's still a little bit confusing for Google which URL is the homepage URL. We also discovered there are normal links pointing to the https URLs, so the internal link structure is not great either.
We will wait a little longer for Google to react to the 301 redirect. If it still prefers the https Version A, we were thinking of changing all URLs to https, making a new sitemap with the https URLs as canonicals, and redirecting all http URLs to their https versions (a rough sketch of the sitemap step is below). I think Google more often prefers the https versions of websites.
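For what it's worth, the mechanical part of that plan is small. A minimal sketch of the sitemap step, assuming a placeholder URL list rather than the real site's URL set:

```python
# Rewrite a list of http URLs to https and emit a minimal sitemap.xml.
# The URLs below are placeholders, not the real site's URL set.
urls = [
    "http://www.website.nl/",
    "http://www.website.nl/category",
]

entries = "\n".join(
    f"  <url><loc>{u.replace('http://', 'https://', 1)}</loc></url>"
    for u in urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

print(sitemap)
```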
What's your opinion on that?
Thanks a lot!
Djacko
-
Hi there Djacko,
You redirected A to B but could not redirect A or B to C, so Google is still not too sure which one is the correct URL. And switching from https to http tells Google you have a new domain.
So don't stress too much; just make sure that http://www.homepage.nl and https://www.homepage.nl have the same work and SEO done on them, and give Google time to realise what has happened to your site.
In my personal experience, a 301 redirect always takes longer to be crawled and updated, and in this case it's your homepage (domain) that is affected.
Hope this helps and good luck
Related Questions
-
Which pages should I index or have in my XML sitemap?
Hi there, my website is ConcertHotels.com - a site which helps users find hotels close to concert venues. I have a hotel listing page for every concert venue on my site - about 12,000 of them, I think (and the same for nearby restaurants), e.g. https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484 Each of these pages lists the hotels near that concert venue. Users clicking on an individual hotel are brought through to a hotel (product) page, e.g. https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818 I made a decision years ago to noindex all of the /hotel/ pages, since they don't have a huge amount of unique content and aren't the pages I'd like my users to land on. The primary pages on my site are the /venue-hotels/ listing pages. I have similar pages for nearby restaurants, so there are approximately 12,000 venue-restaurants pages - again, one listing page for each concert venue. However, while all of these pages are potentially money-earners, in reality the vast majority of subsequent hotel bookings have come from a fraction of the 12,000 venues. I would say 2,000 venues are key money-earning pages, a further 6,000 have generated income at a low level, and 4,000 are yet to generate income. I have a few related questions:

- Although there is potential for any of these pages to generate revenue, should I be brutal and simply delete a venue if it hasn't generated revenue within a time period, and just accept that, while it "could" be useful, it hasn't proven to be and isn't worth the link equity? Or should I noindex these poorly performing pages?
- Should all 12,000 pages be listed in my XML sitemap? Or simply the ones that are generating revenue, or perhaps just the ones that have generated significant revenue in the past and have proved to be most important to my business?

Thanks Mike
-
HTTPS for form pages?
I am creating a small business website for a friend in recruitment. It's very small and mainly just a shop window for the business. There's no login area on the website, but there are two areas where users can enter information:

- A general contact form (giving email and phone number)
- Applying for a job (attaching a resume)

The forms are built with Ninja Forms, which I believe is secure in passing information. But am I missing anything? Do I need to make these pages https at all? I'm quite new to building sites from scratch. Thanks for your help
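As a general aside (not specific to Ninja Forms): any page that collects personal data should be served over https so the form contents are encrypted in transit, and browsers now flag plain-http pages with forms as insecure. A minimal sketch for checking that a form page is served over TLS with a valid certificate, assuming a placeholder URL; requests verifies certificates by default:

```python
import requests

# Placeholder URL standing in for the contact or application page.
url = "https://www.example-recruitment-site.com/contact"

try:
    response = requests.get(url, timeout=10)
    # Reaching this point means the certificate validated.
    print(response.status_code, "TLS certificate OK")
except requests.exceptions.SSLError as error:
    print("TLS problem:", error)
```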
Technical SEO | | joberts0 -
Issues with getting a web page indexed
Hello friends, I am finding it difficult to get the following page indexed in search: http://www.niyati.sg/mobile-app-cost.htm It was uploaded over two weeks ago. For indexing and troubleshooting, we have already done the following:

- The page is hyperlinked from the site's inner pages, a few external websites, and Google+
- Submitted to Google (through the Submit URL option)
- Used the 'Fetch and Render' and 'Submit to Index' options in Search Console (WMT)
- Added the URL to both the HTML and XML sitemaps
- Checked for any crawl errors or Google penalties (page and site level) in Search Console
- Checked the meta tags, robots.txt and .htaccess files for any blocking

Any idea what may have gone wrong? Thanks in advance!
Ramesh Nair
-
Google Webmaster Tools: sitemap submitted vs indexed vs Index Status
I'm having an odd issue I'm trying to diagnose. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps, we have 763 submitted but only 134 indexed. The submitted and indexed counts were virtually the same, around 750, until 15 days ago, when the indexed count dipped dramatically. Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and I ran Screaming Frog on the site and got similar results: few duplicates. Our actual content should be around 950 pages, counting all the category pages. What's going on here?
-
Crawling and indexing problems
Hi, I have noticed since my site was upgraded that Google is taking a long time to publish my articles. Before the upgrade, Google would publish an article straight away, but now it takes an average of around four days. The article I am talking about at the moment is here: http://www.in2town.co.uk/celebrities-in-the-news/stuart-hall-has-his-prison-sentence-for-sex-crimes-doubled-to-30-months Now, I have a blog here on Blogger, and the article was picked up within six minutes: http://showbizgossipandnews.blogspot.co.uk/2013/07/stuart-hall-has-his-prison-sentence-for.html So I am just wondering what the problem is and what I need to do to solve it. My problem is that my site is mostly a news site, so it is no good to me if Google is only publishing new stories every four days. Any help would be great.
-
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an AJAX category filter module that works fine for front-end customers, but under the bonnet Google has been following all of the links. I've put a rule in the robots.txt file to stop Google from following any dynamic pages (with a ?) and also any AJAX pages, but the pages are still indexed in Google. At the moment there are over 5,000 indexed pages which I don't want in there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice? https://www.google.co.uk/search?q=site:outdoormegastore.co.uk+inurl:default&num=100&hl=en&safe=off&prmd=imvnsl&filter=0&biw=1600&bih=809#hl=en&safe=off&sclient=psy-ab&q=site:outdoormegastore.co.uk+inurl%3Aajax&oq=site:outdoormegastore.co.uk+inurl%3Aajax&gs_l=serp.3...194108.194626.0.194891.4.4.0.0.0.0.100.305.3j1.4.0.les%3B..0.0...1c.1.SDhuslImrLY&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=ff301ef4d48490c5&biw=1920&bih=860
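One general note on this approach (not from the thread itself): a robots.txt Disallow stops crawling, but it does not remove URLs that are already indexed, and once a page is blocked Google can no longer see a noindex signal on it. The usual fix is to leave the pages crawlable and serve noindex instead. A minimal sketch for checking which robots signals a page currently serves, using a placeholder URL:

```python
import re
import requests

# Placeholder URL; substitute one of the indexed filter pages.
url = "http://www.outdoormegastore.co.uk/example-ajax-filter-page"

response = requests.get(url, timeout=10)

# A noindex directive can arrive as an HTTP response header...
header = response.headers.get("X-Robots-Tag", "")

# ...or as a meta robots tag in the HTML (simplistic pattern: assumes
# name appears before content in the tag).
meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
    response.text, re.I)

print("X-Robots-Tag:", header or "none")
print("meta robots:", meta.group(1) if meta else "none")
```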
-
Blocking https from being crawled
I have an ecommerce site where https versions of some pages are being crawled. Wondering if the solution below will fix the issue. www.example.com will be my domain. In the nav there is a login page, www.example.com/login, which redirects to https://www.example.com/login. If I just disallowed /login in the robots file, wouldn't it then not follow the redirect, and still index that stuff? The redirect part is what I am questioning.
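A side note on the Disallow idea, based on general robots.txt behavior rather than anything specific to this site: blocking /login stops crawling of that path entirely, so Googlebot never sees the redirect behind it, and a URL it already knows about can stay indexed from links alone. A minimal sketch of how the rule would match, using Python's standard-library robots parser and a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The rule proposed above, applied against a placeholder domain.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /login",
])

for url in [
    "https://www.example.com/login",
    "https://www.example.com/products",
]:
    verdict = "crawlable" if parser.can_fetch("*", url) else "blocked from crawling"
    print(url, "->", verdict)
```

"Blocked" here means the crawler never requests the URL at all, which is exactly why the redirect behind it would go unseen.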
-
Indexing of Flash files
When Google indexes a Flash file, do they use a library for that purpose? What set me thinking was this blog post (although old), which states: "we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1."