BigCommerce only allows us to have HTTPS for our store, not the other pages on our site, so we have a mix of HTTPS and HTTP. How is this hurting us, and what's the best way to fix it?
-
We aren't interested in paying a thousand dollars a month just to have HTTPS when we feel it's the only selling point of that package, so our store is on HTTPS while the rest of the site, blogs and all, is on HTTP. I'm wondering if this would count as duplicate content or cause some other unforeseen penalty due to the halfway approach to implementing HTTPS.
If this is hurting us, what would you recommend as a solution?
-
Hi,
This won't give you any issues as long as there are not two different versions of the same page, e.g. HTTPS and HTTP. Some sites have parts of their site in HTTPS and other parts in HTTP. You just need to check whether two different versions are showing up. If they are, then you need to look at cleaning this up. I won't lie, it is hard doing this in BigCommerce because it's a closed system, not like WordPress or another open-source system.
That said, I personally like having it one way or the other, but to go back to your question: it is fine as long as there is only one version.
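A quick way to check for duplicate versions is to request both protocols for the same path and see whether both return 200 (two live versions) or one redirects to the other. Below is a minimal sketch, assuming Python with the `requests` library; the domain and paths are hypothetical placeholders for your own store and blog URLs.

```python
import requests

# Hypothetical domain and paths -- swap in your own store/blog URLs.
domain = "example.com"
paths = ["/", "/blog/", "/store/checkout"]

for path in paths:
    for scheme in ("http", "https"):
        url = f"{scheme}://{domain}{path}"
        try:
            # Don't follow redirects: we want the raw status code for each version.
            resp = requests.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            print(f"{url} -> error: {exc}")
            continue
        location = resp.headers.get("Location", "")
        print(f"{url} -> {resp.status_code} {location}")

# Two 200s for the same path (one HTTP, one HTTPS) means two live versions;
# a 301 from one scheme to the other, or a canonical tag, resolves it.
```

If both versions do respond 200 for the same page, a rel="canonical" pointing at the preferred version (or a 301 where the platform allows it) tells Google which one to index.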
Related Questions
-
Will a robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi - I have a client that had thousands of dynamic php pages indexed by Google that shouldn't have been. He has since blocked these php pages via a robots.txt disallow. Unfortunately, many of those php pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the php 'disallow'. If we create 301 redirects for some of these php URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away and we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301. Will the robots.txt keep Google from seeing the canonical tags on the php pages? Thanks very much, V
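One way to see the conflict concretely: if robots.txt disallows the directory, Googlebot is not allowed to fetch the php URL at all, so it never sees the 301 (or a canonical tag, which it can only read by fetching the page). Below is a minimal sketch using Python's standard library; the robots.txt location and php URL are hypothetical stand-ins for the client's real values.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt and URL -- substitute the client's real values.
robots_url = "https://example.com/robots.txt"
blocked_php_url = "https://example.com/dynamic/page.php?id=123"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()

if parser.can_fetch("Googlebot", blocked_php_url):
    print("Googlebot may crawl this URL, so it can see a 301 or canonical on it.")
else:
    print("Googlebot is blocked from this URL; any 301 or canonical on it "
          "will not be seen until the disallow is lifted.")
```

The usual sequence is to lift (or narrow) the disallow for the URLs you want to redirect, let Google recrawl them and follow the 301s, and only reconsider blocking once the link value has been consolidated.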
Technical SEO | | Voodak0 -
Is a newly created page's PageRank 1?
Hey, I just want to know: if I create a web page, would the PageRank of the page be 1?
Technical SEO | | atakala1 -
Will it make any difference to SEO on an ecommerce site if they use their SSL certificate (https) across every page?
I know that e-commerce sites usually have SSL certificates on their payment pages. A site I have come across uses the https: prefix on every page of their site. I'm just wondering if this will make any difference to the site in the eyes of search engines, and whether it could affect the rankings of the site?
Technical SEO | | Sayers1 -
Should I change my URLs?
I started with a static website and then moved to WordPress. At the time I had a few hundred pages and wanted to keep the same URL structure, so I used a plugin that adds .html to every page. Should I change the structure to a more common URL structure and do 301 redirects from the .html pages to the regular pages?
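If you do restructure, the mapping is usually mechanical: the clean URL is the old path with the .html suffix stripped, and each old URL gets a single 301 to its new counterpart. Here is a minimal sketch of building that map in plain Python; the paths are hypothetical, and the actual redirects would then be implemented in the server config or a WordPress redirect plugin.

```python
# Hypothetical list of old .html paths -- in practice, export these
# from a sitemap or crawl of the existing site.
old_paths = [
    "/about-us.html",
    "/services/consulting.html",
    "/blog/first-post.html",
]

def clean_url(path: str) -> str:
    """Strip a trailing .html and make sure the path ends with a slash."""
    if path.endswith(".html"):
        path = path[: -len(".html")]
    return path if path.endswith("/") else path + "/"

# One permanent (301) redirect per old URL, old -> new.
redirect_map = {old: clean_url(old) for old in old_paths}

for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```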
Technical SEO | | JillB20130 -
404s in WMT are old pages, and the referrer links no longer point to them.
Within the last 6 days, Google Webmaster Tools has shown a jump in 404s - around 7,000. The 404 pages are from the old browse section of a previous platform; we no longer use them or link to them. I don't know how Google is finding these pages: when I check the referrer links, they are either 404s themselves, or the page exists but the link to the 404 in question is not on the page or in the source code. The sitemap is also often referenced as a referrer, but these links are definitely not in our sitemap and haven't been for some time. So it looks to me like the referrer data is outdated. Is that possible? But somehow these pages are still being found - any ideas on how I can diagnose the problem and find out how Google is finding them?
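One way to confirm the referrer data is stale is to fetch each reported referrer and check whether the 404 URL actually appears in its current HTML. Below is a minimal sketch, assuming Python with the `requests` library; the URLs are hypothetical placeholders for the 404 URL and its "Linked from" referrers exported from Webmaster Tools.

```python
import requests

# Hypothetical data -- export the 404 URL and its reported referrers from WMT.
broken_url = "https://example.com/old-browse/widgets"
referrers = [
    "https://example.com/some-old-page",
    "https://example.com/sitemap.xml",
]

for ref in referrers:
    try:
        resp = requests.get(ref, timeout=10)
    except requests.RequestException as exc:
        print(f"{ref} -> error: {exc}")
        continue
    still_linked = broken_url in resp.text
    print(f"{ref} -> {resp.status_code}, links to 404 URL: {still_linked}")

# If the referrers 404 themselves or no longer contain the link, the report
# is based on Google's older crawl data and should clear as it recrawls.
```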
Technical SEO | | rock220 -
Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that offer no real SEO value but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT. 3. Add a site-wide noindex, follow directive. Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling the subdomains and still remove their URLs, is there a way to do so?
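On the last point: a noindex directive (meta robots tag or X-Robots-Tag header) only works if Google can still crawl the page, so leaving the subdomains out of robots.txt and letting the noindex be seen is the usual route. Below is a minimal sketch for spot-checking that the directive is actually being served, assuming Python with `requests` and a hypothetical subdomain URL.

```python
import requests

# Hypothetical subdomain URL -- substitute the pages you want de-indexed.
url = "https://old-subdomain.example.com/some-page"

resp = requests.get(url, timeout=10)

# X-Robots-Tag is one way to serve noindex site-wide (set at the server level).
header_directive = resp.headers.get("X-Robots-Tag", "")

# A meta robots tag in the HTML is the other common way (crude text check only).
body = resp.text.lower()
has_meta_noindex = 'name="robots"' in body and "noindex" in body

print(f"Status: {resp.status_code}")
print(f"X-Robots-Tag: {header_directive or '(none)'}")
print(f"Meta robots noindex present: {has_meta_noindex}")

# If the URL were blocked in robots.txt, Google could not fetch it and would
# never see either directive, so the pages could stay indexed.
```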
Technical SEO | | RocketZando0 -
Does a CMS inhibit a site's crawlability?
I smell baloney but I could use a little backup from the community! My client was recently told by an SEO that search engines have a hard time getting to their site because using a CMS (like WordPress) doesn't allow "direct access to the html". Here is what they emailed my client: "Word Press (like your site is built with) and other similar “do it yourself” web builder programs and websites are not good for search engine optimization since they do not allow direct access to the HTML. Direct HTML access is needed to input important items to enhance your websites search engine visibility, performance and creditability in order to gain higher search engine rankings." Bots are blind to CMSs and html is html, correct? What do you think about the information given by the other SEO?
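The pushback is easy to demonstrate: a crawler never sees the CMS, only the HTML the server returns, which is the same response any HTTP client gets. Below is a minimal illustration, assuming Python with `requests` and a hypothetical WordPress-powered URL.

```python
import requests

# Hypothetical WordPress-powered page -- any CMS-driven URL behaves the same way.
url = "https://example.com/sample-post/"

resp = requests.get(url, timeout=10)

# The crawler receives an ordinary HTML document, regardless of what generated it.
print(f"Status: {resp.status_code}")
print(f"Content-Type: {resp.headers.get('Content-Type')}")
print(resp.text[:200])  # first few hundred characters of plain HTML
```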
Technical SEO | | Adpearance0 -
Duplicate content and URLs
Hi guys, hope you are all well. Just a quick question which you will find nice and easy 🙂 I am about to work through duplicate content pages and URL changes. Firstly, with the duplicate content issue, I am finding that the SEO-friendly URL I would normally redirect to in some cases has fewer links, less authority, and fewer linking root domains than some of the non-SEO-friendly URLs. Will this harm me if I still 301 redirect them to the SEO-friendly URL? Also, with the URL changes, it is going to be a huge job to change all the URLs so they are friendly, and the CMS is poor. Is there a better way of doing this? It has been suggested that we create a new web page with a friendly URL and redirect all the pages to that. Will this lose all the weight, as it will be a brand-new page? Thank you for your help guys, you're legends!! Cheers, Wayne
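On the mechanics: a 301 from the weaker-metric URL to the SEO-friendly one is still the standard consolidation move, and the main thing to verify is that each old URL returns a single 301 straight to its intended target rather than a chain or a 302. Below is a minimal bulk check, assuming Python with the `requests` library and a hypothetical redirect map.

```python
import requests

# Hypothetical old -> new mapping; in practice, build this from your CMS export.
redirect_map = {
    "https://example.com/prod.php?id=42": "https://example.com/blue-widget/",
    "https://example.com/cat.php?c=7": "https://example.com/widgets/",
}

for old, expected in redirect_map.items():
    try:
        # Don't follow redirects: we want the first hop's status and target.
        resp = requests.get(old, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{old} -> error: {exc}")
        continue
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{old} -> {resp.status_code} {location} {'OK' if ok else 'CHECK'}")
```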
Technical SEO | | wazza19850