Question #1: Does Google index https:// pages? I thought they didn't because....
-
Generally, the difference between https:// and http:// is that the "s" (it stands for secure, I think) is usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is stored).
The site that all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one.
The site was hardcoded to have all MENU internal links (which make up about 90% of our internal links) lead to **https://**www.example.com/example-page/
instead of
**http://**www.example.com/example-page/
To double-check that this was causing a loss in link juice, I jumped over to OSE (Open Site Explorer).
Sure enough, the internal links were not showing up; only the links that were manually created and set to NOT include the httpS:// were being counted.
So if OSE wasn't counting the links, and based on the general idea behind secure HTTP access, that would imply that no link juice is being passed...
Right??
Thanks for your time. Screenshots are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT showing up.
The problem is... is this a Volusion problem?
Should I switch to WordPress?
Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress).
-
Hi Tyler
Looks like the duplicate title tags are largely from empty pages like these:
http://www.uncommonthread.com/008-Pink-Ice-p/14410008.htm
http://www.uncommonthread.com/001-Gold-p/14410001.htm
http://www.uncommonthread.com/019-Copper-p/14410019.htm
http://www.uncommonthread.com/027-Electric-Blue-p/14410027.htm
Even though these pages are somewhat unique, the content is definitely "thin" and having a lot of pages like this typically isn't good for rankings.
Ideally, you would list small product variations on the same page, or even canonicalize several similar product pages to a master page. Generally, if you don't have a minimum of 200 words of good editorial content on a page, Google might consider it duplicate.
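For illustration, here's a rough sketch of what that canonical markup could look like on one of those color-variation pages. The "master" product URL below is hypothetical, so you'd point it at whichever page you actually want to consolidate on:

```html
<!-- In the <head> of a thin variation page such as /008-Pink-Ice-p/14410008.htm -->
<!-- NOTE: the href below is a made-up master product page, not a real URL on your site -->
<link rel="canonical" href="http://www.uncommonthread.com/example-master-product-p/14410000.htm" />
```

That way the variation pages still exist for shoppers, but the ranking signals consolidate onto one stronger page.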
I don't see any reason why switching to http should cause too many problems if you passed everything through a 301 redirect. To be honest, it's typical for rankings to fluctuate frequently, so it could be a million things.
If I look at the text-only cache of the page you sent: http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com&strip=1
... it looks pretty similar. If it were my page, I'd probably try to include more descriptive text on the page, richer descriptions, etc.
Hope this helps!
-
Wow. What an awesome answer.
I can't thank you enough for taking the time to answer so thoroughly.
I decided to go ahead and fix the https:// and change it to http://
Weird results here... Traffic went down by 5.5% compared to the month before I posted this thread.
I noticed an increase in duplicate title tags (about 700 - 1,000 of them) in my SEOmoz account.
Could that be the reason for the decrease? Or is it just because I shouldn't have made such a drastic site-wide change like that?
I am attempting to give unique title tags and HTML titles to all of the product pages that are causing the increase in duplicate titles.
I'm also in a slight predicament because she hired another company to do some "optimization" around October 23rd.
Since then, they have made some changes that I consider spammy, but some results have shown up (a 20%+ increase starting around Jan 1st and capping on the day I made the https:// change). I can't get her to agree with me that we should invest in building a social following, making better content, blogging more often, and so on. I also think we should move the blog into a subfolder on the domain.
I compared the web cache you showed me to a WordPress site that I built, and the difference really was pretty shocking:
http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com
What's the difference as far as rankings and search engines are concerned?
-
Hi Tyler,
Great question! In fact, it's a common misconception that Google doesn't index https. In truth, these days Google appears to index most https pages just fine.
If we do a site: operator Google search for https URLs on your site, we get something like this:
site:uncommonthread.com/ inurl:https (click to see results)
This returns 165 URLs on your site with the https protocol.
But... these URLs don't show up in OSE because, at this time, the Linkscape crawler can't crawl https. When it was originally built, Google still didn't index https, so it wasn't needed. This should be fixed in just a few months, and you should start seeing those https results in there. The good news is that OSE is completely separate from Google and doesn't influence your rankings in any way.
Now for the bad news....
Whenever you have https, you want to make sure you only have ONE version of the URL, so that https either redirects (via 301) to the http version, or vice versa. Otherwise, Google might index both versions. For example, both of these URLs resolve on your site:
https://www.uncommonthread.com/kb_results.asp?ID=5
http://www.uncommonthread.com/kb_results.asp?ID=5
The solution is to either 301 redirect one to the other, or to put an absolute canonical tag on both pages that points to one or the other (an absolute canonical tag means it contains the full URL, including the http or https protocol).
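For example, if you decided the http version should be the one Google indexes, both versions of that knowledge-base page could carry the same absolute tag. This is just a sketch; exactly where you add it depends on how Volusion exposes your page templates:

```html
<!-- In the <head> of BOTH the http and https versions of the page -->
<!-- e.g. http://www.uncommonthread.com/kb_results.asp?ID=5 and its https twin -->
<link rel="canonical" href="http://www.uncommonthread.com/kb_results.asp?ID=5" />
```

A sitewide 301 from one protocol to the other accomplishes the same consolidation without needing the tag on every page.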
That said, I don't see any evidence that Google has indexed both URL versions of your site (at least not like Dunkin' Donuts).
Should You Switch to WordPress?
Based simply on the https issue, switching to WordPress isn't necessary. But WordPress does offer other advantages and is generally a very SEO-friendly platform.
There might, however, be other reasons to consider switching away from your current CMS.
For example, consider Google's Text-only cache of your homepage: http://webcache.googleusercontent.com/search?q=cache:http://www.uncommonthread.com/default.asp&strip=1
See how barren it is? Without taking a deep dive, it's possible the structure and technology employed by your CMS are causing indexing/crawling issues, and considerable technical effort may be required to make it SEO-friendly. I can't give you a definite answer either way, but it's something to think about.
Hope this helps!