Question #1: Does Google index https:// pages? I thought they didn't because....
-
Generally, the difference between https:// and http:// is that the "s" (which stands for "secure," I think) is usually reserved for payment pages and other similar pages where private data is handled — the kinds of pages search engines aren't supposed to index.
The site that all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one.
The site was hardcoded to have all MENU internal links (which were 90% of our internal links) lead to **https://**www.example.com/example-page/
instead of
**http://**www.example.com/example-page/
To double-check that this was causing a loss in link juice, I jumped over to OSE.
Sure enough, the internal links were not being indexed; only the links that were manually created and set to NOT include the httpS:// were being indexed.
So if OSE wasn't counting the links, then based on the general ideology behind secure HTTP access, that would imply that no link juice is being passed...
Right??
Thanks for your time. Screenshots are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed.
The problem is... is this a Volusion problem?
Should I switch to WordPress?
Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress).
-
Hi Tyler,
Looks like the duplicate title tags are largely from empty pages like these:
http://www.uncommonthread.com/008-Pink-Ice-p/14410008.htm
http://www.uncommonthread.com/001-Gold-p/14410001.htm
http://www.uncommonthread.com/019-Copper-p/14410019.htm
http://www.uncommonthread.com/027-Electric-Blue-p/14410027.htm
Even though these pages are somewhat unique, the content is definitely "thin," and having a lot of pages like this typically isn't good for rankings.
Ideally, you would list small product variations on the same page, or even canonicalize several similar product pages to a master page. Generally, if you don't have a minimum of 200 words of good editorial content, Google might consider the page duplicate.
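To make that concrete, a canonical tag from one of those thin color-variation pages to a master product page could look something like the sketch below. The master-page URL here is hypothetical, invented just for illustration, and Volusion may expose this through its own settings rather than raw HTML.

```html
<!-- A sketch, placed in the <head> of each thin variation page
     (e.g. 008-Pink-Ice). The href is a hypothetical master page
     that would list all the colorways together; it's written as
     an absolute URL so there's no http/https ambiguity. -->
<link rel="canonical" href="http://www.uncommonthread.com/silk-yarn-colors-p/14410.htm" />
```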
I don't see any reason why switching to http should cause too many problems if you pass everything through a 301 redirect. To be honest, it's typical for rankings to fluctuate frequently, so it could be a million things.
If I look at the text-only cache of the page you sent: http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com&strip=1
... it looks pretty similar. If it were my page, I'd probably try to include more descriptive text on the page, richer product descriptions, etc.
Hope this helps!
-
Wow. What an awesome answer.
I honestly don't know if I can thank you enough for taking the time to answer so thoroughly.
I decided to go ahead and fix the https:// and change it to http://
Weird results here... traffic went down by 5.5% compared to the month before I posted this thread.
I noticed an increase in duplicate title tags (about 700-1,000 of them) in my SEOmoz account.
Could that be the reason for the decrease? Or is it just because I shouldn't have made such a drastic site-wide change like that?
I am attempting to give unique title tags and HTML titles to all of the product pages that are causing the increase in duplicate titles.
I'm also in a slight predicament because the site's owner hired another company to do some "optimization" around October 23rd.
Since then, they have made some changes I consider spammy, but some results have shown up (a 20%+ increase starting around January 1st and capping on the day I made the https:// change), and I can't get her to agree that we should invest in building a social following, making better content, blogging more often, etc. I also think we should move the blog into a subfolder on the domain.
I compared the webcache you showed me to a WordPress site that I built, and the difference really was pretty shocking:
http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com
What's the difference as far as rankings and search engines are concerned?
-
Hi Tyler,
Great question! In fact, it's a common misconception that Google doesn't index https. In truth, these days they appear to index most https pages just fine.
If we do a site: operator search on Google for https URLs on your site, we get something like this:
site:uncommonthread.com/ inurl:https
This returns 165 URLs on your site with the https protocol.
But... these URLs don't show up in OSE because, at this time, the Linkscape crawler can't crawl https. When OSE was originally built, Google still didn't index https, so the capability wasn't needed. This should be fixed in just a few months, and you should start seeing those https results in there. The good news is that OSE is completely separate from Google and doesn't influence your rankings in any way.
Now for the bad news....
Whenever you have https, you want to make sure you have only ONE version of each URL, so that the https version either redirects (via 301) to the http version, or vice versa. Otherwise, Google might index both versions. For example, both of these URLs resolve on your site:
https://www.uncommonthread.com/kb_results.asp?ID=5
http://www.uncommonthread.com/kb_results.asp?ID=5
The solution is either to 301 redirect one to the other, or to place an absolute canonical tag on both pages that points to one or the other (an absolute canonical means it contains the full URL, including http or https).
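As a sketch of the redirect option: on an Apache server with mod_rewrite, a rule like the one below would 301 every https request to its http counterpart. This is illustrative only — your .asp URLs suggest the site actually runs on IIS, where you'd use an equivalent URL Rewrite rule, and hosted platforms like Volusion typically expose this as a setting rather than a config file.

```apache
# Hypothetical .htaccess sketch: send any https request to the http
# version of the same URL with a 301, so only one version gets indexed.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.uncommonthread.com/$1 [R=301,L]
```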
That said, I don't see any evidence that Google has indexed both URL versions of your site (at least not like Dunkin' Donuts).
Should You Switch to WordPress?
Based simply on the https issue, switching to WordPress isn't necessary. But WordPress does offer other advantages and is generally a very SEO-friendly platform.
There might, however, be other considerations that would lead you to switch away from your current CMS.
For example, consider Google's Text-only cache of your homepage: http://webcache.googleusercontent.com/search?q=cache:http://www.uncommonthread.com/default.asp&strip=1
See how barren it is? Without taking a deep dive, it's possible that the structure and technology employed by your CMS are causing indexing/crawling issues, and considerable technical effort may be required to make the site SEO-friendly. I can't give you a definitive answer either way, but it's something to think about.
Hope this helps!