HTTP vs HTTPS in og:url
-
Hi,
Recently we migrated our website from http:// to https://. Every URL is now https://, and we have set up 301 permanent redirects from the old URLs to the new ones.
Because of some social-share issues we are facing, we plan to put the http:// link in og:url instead of https://. My concern is: if Google finds this self-referencing http:// URL on every page of my blog, will Google get confused between http:// and https://, since we would effectively be handing the old URL to Google for crawling?
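For reference, the tag I'm describing sits in the page head and looks like this (example.com stands in for my actual domain):

```html
<!-- Page is served over HTTPS, but og:url points at the old HTTP address -->
<meta property="og:url" content="http://example.com/my-post/" />
```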
Please advise.
Thanks
-
Shareaholic and Social Warfare (paid) plugins are the ones I'm familiar with, Sameer. There is also a paid add-on for the Sassy Social Share plugin that will accomplish this too. Others may have added this capability as well.
P.
-
Thanks for sharing the informative answers. @ThompsonPaul, which WordPress plugin can we use to preserve the Facebook share count?
-
Yeah, this is a stupid screwup by Facebook, because they won't fix their system to simply recognise that the HTTP and HTTPS URLs are the same. (Others like Google+ figured this out ages ago.)
Mentioning the HTTP URL in your OG data won't do any harm. It's the same as all the other websites out there with older links still pointing to your old HTTP address. The 301 redirect and the new HTTPS sitemap give crawlers an overwhelming directive as to the correct URLs to index. (And in fact, the OG URL isn't technically a link, so crawlers likely aren't even following it.)
As Martijn says though, you're kind of in limbo. The pages will show the old counts but won't be aggregating the new ones. There are some WordPress plugins that purport to be able to combine both, but I haven't used one yet.
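If you want to double-check what a page is actually declaring, the og:url is easy to pull out of the page source. A quick sketch using only Python's standard library (the sample HTML is a stand-in for a saved page source, and example.com is a placeholder):

```python
# Sketch: read the og:url a page actually declares, using only the
# standard library. The sample HTML stands in for a real page source.
from html.parser import HTMLParser

class OGURLParser(HTMLParser):
    """Collects the content of any <meta property="og:url"> tag."""
    def __init__(self):
        super().__init__()
        self.og_url = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("property") == "og:url":
                self.og_url = attrs.get("content")

sample = '''<html><head>
<meta property="og:url" content="http://example.com/post/" />
<link rel="canonical" href="https://example.com/post/" />
</head><body></body></html>'''

parser = OGURLParser()
parser.feed(sample)
print(parser.og_url)  # prints http://example.com/post/ - it can differ from the canonical
```

As the sample shows, the og:url and the rel=canonical can legitimately disagree; crawlers follow the canonical, while Facebook keys its share counts off the og:url.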
Paul
-
What I'm guessing is that Sameer has a significant number of URLs with a ton of shares, so some social share counts might be off because of that. Anyway, what you could evaluate is whether you can change the og:url for new articles only. In the end, HTTPS is the new reality for you too, so you might be best off just sucking it up and changing it to HTTPS. The impact will likely be low.
-
I think you probably know what I am going to say!
If you use the HTTP version of the URL, you are sending conflicting signals to Google by telling them that both pages exist. By sharing the HTTP links on social you are also creating needless redirects. There is still conjecture as to whether a 301 causes any loss of link juice, but personally I'm with Rand Fishkin that a small amount of link juice is lost by having a redirect in place (others - this is not the place to argue about that!), so I would make it as clear as possible, with no HTTP mentions on the page whatsoever.
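In other words, every self-referencing URL in the page head should be the HTTPS one. For example (example.com is a placeholder):

```html
<!-- Canonical and og:url both pointing at the live HTTPS address -->
<link rel="canonical" href="https://example.com/my-post/" />
<meta property="og:url" content="https://example.com/my-post/" />
```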
I'm not sure what you mean by 'social share issues', but maybe I can help - what are they?
Any SEO audit software will tell you the same: if you move to HTTPS, use the correct URL; otherwise the software will scream with warnings.
Regards
Nigel
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to only access via the HTTP/1.1 protocol.
1. Robots file is correct (simply allowing all and referring to the https://www. sitemap)
2. Sitemap is referencing https://www. pages, including the homepage
3. Hosting provider has confirmed the server is correctly configured to support HTTP/2, and provided evidence of access via HTTP/2 working
4. 301 redirects set up for non-secure and non-www versions of the website, all to the https://www. version
5. Not using a CDN or proxy
6. GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to only go through HTTP/1.1 and not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERP... except the home page. This never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC1 -
HTTP to HTTPS redirection issue
Hi, I have a website on http, but now I have moved to https. When I apply a 301 redirect from http to https and check in SEMrush, it shows it is unable to connect over https, and other similar tools show the same. When I remove the redirection, all the other tools work fine, but then my https version doesn't get indexed in Google. Can anybody help with what could be the issue?
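One common cause of a problem like this is a redirect rule that also fires on HTTPS requests and loops. For comparison, a minimal non-looping sketch (assuming Apache and an .htaccess file - adjust for your server):

```apache
RewriteEngine On
# Only redirect when the request is NOT already on HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```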
Technical SEO | dhananjay.kumar10 -
Switching to HTTPS
Hey Moz Community! I am about to switch my website from HTTP to HTTPS. I am wondering if I need to create a 301 redirect for every single page on my site, from the HTTP address to the HTTPS URL? Or if setting one master rule that redirects all traffic to the HTTPS version would be enough. Obviously I am concerned about losing rankings by switching. Code for the redirect of all traffic to HTTPS:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{SERVER_NAME}/$1 [R,L]
Technical SEO | SeanConroy -
Does Google add parameters to the URL Parameters tool in Webmaster Tools?
I am seeing new parameters added (and sometimes removed) in the URL Parameters tool. Is there anything that would add parameters to the tool automatically, or does it have to be someone internally? FYI: they always have no date in the Configured column, no effect set, and crawl set to 'Let Google decide'.
Technical SEO | merch_zzounds0 -
Some URLs were not accessible to Googlebot due to an HTTP status error.
Hello, I'm an SEO newbie and some help from the community here would be greatly appreciated. I have submitted the sitemap of my website in Google Webmaster Tools and got this warning: "When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted." How do I fix this? What should I do? Many thanks in advance.
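One way to narrow this down is to pull the URLs out of the sitemap and then check the HTTP status of each one with a crawler or HTTP client. A minimal sketch of the extraction step, using only Python's standard library (the sample XML and the example.com URLs are placeholders):

```python
# Sketch: extract the URLs from a sitemap so each one can be checked
# for its HTTP status. The XML string stands in for a downloaded sitemap.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page/</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)  # feed each URL to an HTTP client and look for non-200 responses
```

Any URL in that list returning a 4xx or 5xx status (or redirecting) is a candidate for the warning; fix or remove those entries and resubmit the sitemap.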
Technical SEO | GoldenRanking140 -
A/B testing entire website vs SEO issues
I'm familiar with A/B testing variations of a page, but I'd like to A/B test a new design version of an e-commerce site. I'm wondering about the best way to test with SEO concerns in mind... this is what I have in mind right now, any suggestions?
1. Use parameters to make version B different from version A.
2. Redirect 50% of the users with a 302 (or would JavaScript be a better way?).
3. Use noindex on the B pages.
4. Use rel=canonical on the B pages, pointing to the A version.
5. In the end, 301 redirect all B pages to the A URLs.
PS: We can't use a subdomain, and I don't want to use the robots.txt file to protect the new design from competitors. I'd love any suggestions and tips about it - thanks folks 🙂
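Under a plan like that, the head of a B-version page might contain something like the following (the parameter scheme is hypothetical). Note that Google's published testing guidance has recommended rel=canonical on variant URLs rather than noindex, so the noindex and canonical points may be worth weighing against each other:

```html
<!-- B-version page at https://example.com/product?variant=b (hypothetical URL scheme) -->
<link rel="canonical" href="https://example.com/product" />
```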
Technical SEO | SeoMartin10 -
HTTPS Duplicate Content
My previous host was using shared SSL, and my site was also working over https, which I hadn't noticed previously. Now I have moved to a new server, where I don't have any SSL, and my websites are not working on the https version. The problem is that I have found Google has indexed one of my blogs, http://www.codefear.com, with the https version too. My blog traffic is continuously dropping, I think due to this duplicate content. Now there are two results, one with the http version and another with the https version. I searched over the internet and found 3 possible solutions:
1. No-index the https version
2. Use rel=canonical
3. Redirect the https versions with a 301 redirect
Now I don't know which solution is best for me, as the https version is no longer working. One more thing: I don't know how to implement any of the solutions. My blog is running on WordPress. Please help me overcome this problem. And after solving this duplicate issue, do I need to send a reconsideration request to Google? Thank you
Technical SEO | RaviAhuja -
Updating content on URL or new URL
Hi Mozzers, We are an event organisation. Every year we produce around 350 events, and all the events are on our website. A lot of these events are held every year, so I have a URL like www.domainname.nl/eventname. So what would you do? This URL has some inbound links, some social mentions and so on. If the event is held again in 2013, would it be better to update the content on this URL or create a new one? I would keep this URL and update it, because of the link value and because it is already indexed and ranking for the desired keyword for that event. Cheers, Ruud
Technical SEO | RuudHeijnen0