Question #1: Does Google index https:// pages? I thought they didn't because....
-
generally the difference between https:// and http:// is that the s (it stands for secure) is usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is handled).
The site that all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one.
The site was hardcoded to have all MENU internal links (which was 90% of our internal links) lead to **https://**www.example.com/example-page/
instead of
**http://**www.example.com/example-page/
To double-check that this was causing a loss in link juice, I jumped over to OSE.
Sure enough, the internal links were not being counted; only the links that were manually created and set NOT to include the httpS:// were showing up.
So if OSE wasn't counting the links, then based on the general ideology behind secure HTTP access, that would imply that no link juice is being passed...
Right??
Thanks for your time. Screenshots are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being counted.
The problem is... is this a Volusion problem?
Should I switch to Wordpress?
Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress)
-
Hi Tyler
Looks like the duplicate title tags are largely from empty pages like these:
http://www.uncommonthread.com/008-Pink-Ice-p/14410008.htm
http://www.uncommonthread.com/001-Gold-p/14410001.htm
http://www.uncommonthread.com/019-Copper-p/14410019.htm
http://www.uncommonthread.com/027-Electric-Blue-p/14410027.htm
Even though these pages are somewhat unique, the content is definitely "thin" and having a lot of pages like this typically isn't good for rankings.
Ideally, you would list small product variations on the same page, or have several similar product pages canonicalized to a master page. Generally, if a page doesn't have a minimum of 200 words of good editorial content, Google might consider it duplicate.
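For the canonical approach, a minimal sketch of what one of those thin color-variation pages could carry in its head. Note this is illustrative only: the master-page URL below is hypothetical, not an actual page on the site, and an "absolute" canonical means the full URL including the protocol:

```html
<!-- Hypothetical variation page, e.g. /008-Pink-Ice-p/14410008.htm -->
<head>
  <title>008 Pink Ice - Uncommon Thread</title>
  <!-- Absolute canonical pointing all color variations at one master
       product page (this master URL is made up for illustration) -->
  <link rel="canonical" href="http://www.uncommonthread.com/silk-thread-colors-p/master.htm" />
</head>
```

With every variation pointing at the same master, Google consolidates the thin pages into one stronger URL instead of indexing dozens of near-duplicates.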
I don't see any reason why switching to http should cause too many problems if you pass everything through a 301 redirect. To be honest, it's typical for rankings to fluctuate frequently, so it could be a million things.
If I look at the text-only cache of the page you sent: http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com&strip=1
... it looks pretty similar. If it were my page, I'd probably try to include more descriptive text on the page, richer descriptions, etc.
Hope this helps!
-
Wow. What an awesome answer.
I honestly don't know how to thank you enough for taking the time to answer so thoroughly.
I decided to go ahead and fix the https:// and change it to http://
Weird results here.. Traffic went down by 5.5% compared to the month before I posted this thread.
I noticed an increase in duplicate title tags (about 700 - 1000 of them) in my seomoz account.
Could that be the reason for the decrease? Or is it just because I shouldn't have made such a drastic site-wide change like that?
I am attempting to give unique title tags and on-page headings to all of the product pages that are causing the increase in duplicate titles.
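If you can export your crawl (from SEOmoz or any crawler) as a list of URL/title pairs, finding the duplicate clusters to fix first is a few lines of scripting. This is a minimal sketch; the URLs and titles below are made up to show the shape of the data, not real pages from the site:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by their <title> text; return titles used on 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Made-up rows shaped like a crawl export: (URL, title)
crawl = [
    ("http://www.example.com/008-Pink-Ice-p/14410008.htm", "Silk Thread"),
    ("http://www.example.com/001-Gold-p/14410001.htm", "Silk Thread"),
    ("http://www.example.com/about.htm", "About Us"),
]

dupes = find_duplicate_titles(crawl)
print(dupes)  # the two product URLs grouped under their shared title
```

Working from the biggest clusters down gives you the largest reduction in duplicate-title warnings for the least rewriting effort.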
I'm also in a slight predicament because she hired another company to do some "optimization" around October 23rd.
Since then, they have made some changes that look spammy to me, but some results have shown up (a 20+% increase starting around Jan 1st, peaking on the day I made the https:// change). I can't get her to agree that we should invest in building a social following, making better content, blogging more often, etc. I also think we should move the blog into a subfolder on the domain.
I compared the webcache you showed me to a WordPress site that I built, and the difference really was pretty shocking:
http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com
What's the difference as far as rankings and search engines are concerned?
-
Hi Tyler,
Great question! In fact, it's a common misconception that Google doesn't index https. In truth, these days they appear to index most https pages just fine.
If we do a site operator Google search for https on your site, we get something like this:
site:uncommonthread.com/ inurl:https
This returns 165 URLs on your site with the https protocol.
But.... these URLs don't show up in OSE because at this time, the Linkscape crawler can't crawl https. When it was originally built, Google still didn't index https, so https crawling wasn't needed. This should be fixed in just a few months, and you should start seeing those https results in there. The good news is that OSE is completely separate from Google and doesn't influence your rankings in any way.
Now for the bad news....
Whenever you have https, you want to make sure you only have ONE version of the url, so that https either redirects (via 301) to the http version, or vice versa. Otherwise Google might index both versions. For example, both of these URLs resolve on your site:
https://www.uncommonthread.com/kb_results.asp?ID=5
http://www.uncommonthread.com/kb_results.asp?ID=5
The solution is to either 301 redirect one to the other, or add an absolute canonical tag on both pages that points to one version (an absolute canonical means it contains the full URL, including http or https).
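For the redirect option, here's what the idea looks like as server config. A caveat: Volusion is a hosted platform (Windows/IIS), so you'd do this through its own settings rather than a config file; the Apache mod_rewrite rules below are only a sketch of the concept, and the domain is a stand-in:

```apache
# Illustrative Apache rules (not Volusion syntax): 301 every https
# request to its http twin so only ONE version of each URL resolves.
# For the opposite policy, swap the condition and target protocol.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Either direction works for consolidation; what matters is that every page resolves at exactly one protocol and the other one permanently redirects to it.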
That said, I don't see any evidence that Google has indexed both URL versions of your site (at least not like Dunkin Donuts).
Should You Switch to WordPress?
Based on the https issue alone, switching to WordPress isn't necessary. But WordPress does offer other advantages and is generally a very SEO-friendly platform.
There may be other reasons to consider switching away from your current CMS, though.
For example, consider Google's Text-only cache of your homepage: http://webcache.googleusercontent.com/search?q=cache:http://www.uncommonthread.com/default.asp&strip=1
See how barren it is? Without taking a deep dive, it's possible the structure and technology employed by your CMS are causing indexing/crawling issues, and considerable technical effort may be required to make it SEO friendly. I can't give you a definite answer either way, but it's something to think about.
Hope this helps!