BigCommerce only allows us to have HTTPS for our store, not the other pages on our site, so we have a mix of HTTPS and HTTP. How is this hurting us, and what's the best way to fix it?
-
We aren't interested in paying a thousand dollars a month just to have HTTPS when we feel it's the only selling point of that package, so we have HTTPS for our store while the rest of the site, blog and all, is HTTP. I'm wondering if this would count as duplicate content or give us some other unforeseen penalty due to the halfway approach to implementing HTTPS.
If this is hurting us, what would you recommend as a solution?
-
Hi,
This won't give you any issues as long as there aren't two different versions of the same page, e.g. https and http. Some sites have different parts of their site on https and other parts on http. You just need to check whether two different versions are showing up; if they are, then you need to look at cleaning this up. I won't lie, it's hard to do in BigCommerce because it's a closed system, unlike WordPress or another open-source system.
That said, I personally like having it one way or the other, but to go back to your question: it's fine as long as there is only one version.
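If duplicate protocol versions do show up, the usual fix on a platform where you can't touch the server is a canonical tag in the page template head. A minimal sketch with a placeholder domain and path, pointing both the http and https copies of a page at whichever version you want indexed:

<!-- Placeholder URL: both the http:// and https:// copies of this page declare
     the same canonical, so only one version gets counted in the index. -->
<link rel="canonical" href="https://store.example.com/products/widget/">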
Related Questions
-
404's being re-indexed
Hi All, We are experiencing issues with pages that have been 404'd still being indexed. Originally, these were /wp-content/ index pages that were included in Google's index. Once I realized this, I added a directive to our htaccess to 404 all of these pages, as there were hundreds. I tried to let Google crawl and remove these pages naturally, but after a few months I used the URL removal tool to remove them manually. However, Google seems to be continually re-indexing these pages, even after removal has been manually requested in Search Console. Do you have suggestions? They all return 404s. Thanks
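For context, the kind of htaccess directive described would look roughly like this on an Apache host with mod_alias; the /wp-content/ pattern is taken from the question and is only a sketch, and returning 410 (Gone) instead of 404 tells Google the removal is permanent, which tends to get stale URLs dropped faster:

# Hypothetical pattern: return 410 Gone (or 404 if preferred) for bare directory
# URLs under /wp-content/ without touching the asset files inside those directories.
RedirectMatch 410 ^/wp-content/(.*/)?$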
-
My 'complete guide' is cannibalising my main product page and hurting rankings
Hi everyone, I have a main page for my blepharoplasty surgical product that I want to rank. It's a pretty in-depth summary for patients to read all about the treatment and look at before and after pictures, and there are calls to action in there. It works great and is getting lots of conversions. But I also have a 'complete guide' PDF for patients who are really interested in discovering all the technicalities of their eye-lift procedure, including medical research, clinical details and risks. Right now my main page is at position 4 and the complete guide is right below it at 5. So I tried to consolidate by adding the complete guide as a download on the main page. I've looked into rel canonical but don't think it's appropriate here, as they are not technically 'duplicates' because they serve different purposes. Then I thought of adding a meta noindex but wasn't sure whether that was the right thing to do either. The guide doesn't get any clicks from the SERPs; people visit it from the main page. I saw in WordPress that there are options for the link: one says 'link to media file', 'custom URL' and 'attachment'. I've got the custom URL selected at the moment. There's also a box for 'link rel', which I figure is where I'd put the noindex. If that's the right thing to do, what should go in that box? Thanks.
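For reference, if the aim is simply to keep the guide out of the index and it is served as a PDF from your own host, a meta noindex can't be placed inside a PDF; the equivalent is an X-Robots-Tag response header. A hedged sketch for an Apache .htaccess, where the filename is a placeholder and mod_headers is assumed:

# Hypothetical filename: sends a noindex header with the PDF so it drops out of
# the results while the main page keeps ranking and keeps linking to it.
<Files "complete-guide.pdf">
  Header set X-Robots-Tag "noindex"
</Files>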
-
My website's pages are not being indexed correctly
Hi, One of our websites, which is actually a price comparison engine, is facing an indexing problem with Google. When we check "site:mywebsite.com", lots of the pages indexed are not from mywebsite.com but from merchants' websites. The index result page also shows the merchant's page title. In some cases the title is from the merchant's site, but when the given link is accessed it points to mywebsite.com/index. Also, the cache displays the merchant's product page as the last indexed version rather than showing ours. mywebsite.com has quite a few merchants that send us their product feeds. Those products are listed on a comparison page with prices. The merchant links on the comparison page are all nofollow links, but some (not all) of the merchants' product pages are indexed against mywebsite.com, as mentioned above, instead of the product comparison page of mywebsite.com. How can we fix the issue? Thanks!
-
What is the best way to handle these duplicate page content errors?
MOZ reports these as duplicate page content errors and I'm not sure of the best way to handle it.
Home:
http://myhjhome.com/
http://myhjhome.com/index.php
Blog:
http://myhjhome.com/blog/
http://myhjhome.com/blog/?author=1
Should I just create 301 redirects for these? 301 http://myhjhome.com/index.php to http://myhjhome.com/ ? 301 http://myhjhome.com/blog/?author=1 to http://myhjhome.com/ ? Or is there a better way to handle this type of duplicate page content error?
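For reference, the two 301s being considered would look something like this in an Apache .htaccess; this is only a sketch assuming mod_rewrite, and it sends the author-archive duplicate back to /blog/ rather than the home page so each duplicate points at its nearest equivalent:

RewriteEngine On
# Guard on the original request line so the internal DirectoryIndex subrequest
# for index.php doesn't trigger a redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+index\.php [NC]
RewriteRule ^index\.php$ / [R=301,L]
# Send /blog/?author=1 back to /blog/; the trailing ? drops the query string.
RewriteCond %{QUERY_STRING} ^author=1$
RewriteRule ^blog/?$ /blog/? [R=301,L]
-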
404s in WMT are old pages, and the referrer links no longer link to them.
Within the last 6 days, Google Webmaster Tools has shown a jump in 404s, around 7,000. The 404 pages are from our old browse pages on an old platform; we no longer use them or link to them. I don't know how Google is finding these pages. When I check the referrer links, they are either 404s themselves, or the page exists but the link to the 404 in question is not on the page or in the source code. The sitemap is also often referenced as a referrer, but these links are definitely not in our sitemap and haven't been for some time. So it looks to me like the referrer data is outdated. Is that possible? But somehow these pages are still being found. Any ideas on how I can diagnose the problem and find out how Google is finding them?
-
Has Google stopped rendering author snippets on SERP pages if the author's G+ page is not actively updated?
Working with a site that has multiple authors and the author microformat enabled. The image is rendering for some authors on the SERP page and not for others. The difference seems to be between authors with an actively updated G+ page and those without. Any thoughts?
-
Why won't the Moz plug-in's "Analyze Page" tool read data on a BigCommerce site?
We love our new BigCommerce site; just curious as to what the hang-up is.
-
How do https pages affect indexing?
Our site involves e-commerce transactions that we want users to be able to complete via JavaScript popup/overlay boxes. In order to make the credit card form secure, we need the referring page to be secure, so we are considering making the entire site secure, meaning all of our site links would be https. (PayPal works this way.) Do you think this will negatively impact whether Google and other search engines are able to index our pages?
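For what it's worth, the usual way to run a PayPal-style full-HTTPS site, once every page can be served securely, is a blanket 301 from http to https so search engines only ever see one protocol per URL. A minimal sketch, assuming an Apache host with mod_rewrite:

# Redirect every http:// request to its https:// equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]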