Question #1: Does Google index https:// pages? I thought they didn't because....
-
Generally, the difference between https:// and http:// is the "s" (which stands for secure, I think). It's usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is stored).
The site all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one.
The site was hardcoded to have all MENU internal links (which made up 90% of our internal links) lead to **https://**www.example.com/example-page/
instead of
**http://**www.example.com/example-page/
To double-check that this was causing a loss in link juice, I jumped over to OSE (Open Site Explorer).
Sure enough, the internal links were not being indexed; only the links that were manually created and set to NOT include the https:// were.
So if OSE wasn't counting the links, then based on the general idea behind secure HTTP access, that would imply that no link juice is being passed...
Right??
Thanks for your time. Screenshots are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed.
The problem is... is this a Volusion problem?
Should I switch to WordPress?
Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress).
-
Hi Tyler
Looks like the duplicate title tags are largely from empty pages like these:
http://www.uncommonthread.com/008-Pink-Ice-p/14410008.htm
http://www.uncommonthread.com/001-Gold-p/14410001.htm
http://www.uncommonthread.com/019-Copper-p/14410019.htm
http://www.uncommonthread.com/027-Electric-Blue-p/14410027.htm
Even though these pages are somewhat unique, the content is definitely "thin" and having a lot of pages like this typically isn't good for rankings.
Ideally, you would list small product variations on the same page, or even have several similar product pages canonicalized to a master page (a rough sketch follows below). Generally, if you don't have a minimum of 200 words of good editorial content, Google might consider a page duplicate.
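For illustration only, here is roughly what the canonical approach could look like in the head of each variation page. The master URL below is made up and would need to match your real product structure:

```html
<!-- On each near-duplicate variation page, point at one master page. -->
<!-- Hypothetical URL; use an absolute URL so the protocol (http vs. -->
<!-- https) is unambiguous. -->
<link rel="canonical" href="http://www.example.com/master-product-page/" />
```

The key is that every variation points at the same master, and the master page itself doesn't canonical anywhere else.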
I don't see any reason why switching to http should cause much of a problem if you pass everything through a 301 redirect. To be honest, it's typical for rankings to fluctuate frequently, so it could be a million things.
If I look at the text-only cache of the page you sent: http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com&strip=1
... it looks pretty similar. If it were my page, I'd probably try to include more descriptive text on the page, richer descriptions, etc.
Hope this helps!
-
Wow. What an awesome answer.
I can't thank you enough for taking the time to answer so thoroughly.
I decided to go ahead and fix the https:// links and change them to http://.
Weird results here... traffic went down by 5.5% compared to the month before I posted this thread.
I noticed an increase in duplicate title tags (about 700-1,000 of them) in my SEOmoz account.
Could that be the reason for the decrease? Or is it just that I shouldn't have made such a drastic site-wide change?
I am attempting to give unique title tags and HTML titles to all of those product pages that are causing the increase in duplicate titles.
I'm also in a slight predicament because the site's owner hired another company to do some "optimization" around October 23rd.
Since then, they have made some changes that look spammy to me, but some results have shown up (a 20%+ increase starting around January 1st and peaking on the day I made the https:// change). I can't get her to agree that we should invest in building a social following, making better content, blogging more often, etc. I also think we should move the blog into a subfolder on the domain.
I compared the web cache you showed me to a WordPress site that I built, and the difference really was pretty shocking:
http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com
What's the difference as far as rankings and search engines are concerned?
-
Hi Tyler,
Great question! In fact, it's a common misconception that Google doesn't index https. In truth, these days they appear to index most https pages just fine.
If we do a site: operator search on Google for https URLs on your site, we get something like this:
site:uncommonthread.com/ inurl:https
This returns 165 URLs on your site with the https protocol.
But... these URLs don't show up in OSE because, at this time, the Linkscape crawler can't crawl https. When it was originally built, Google still didn't index https, so the capability wasn't needed. This should be fixed in just a few months, and you should start seeing those https results in there. The good news is that OSE is completely separate from Google and doesn't influence your rankings in any way.
Now for the bad news....
Whenever you have https, you want to make sure you have only ONE version of each URL, so that https either redirects (via 301) to the http version, or vice versa. Otherwise, Google might index both versions. For example, both of these URLs resolve on your site:
https://www.uncommonthread.com/kb_results.asp?ID=5
http://www.uncommonthread.com/kb_results.asp?ID=5
The solution is to either 301 redirect one to the other, or to put an absolute canonical tag on both pages that points to one or the other (an absolute canonical means it contains the full URL, including http or https); there's a sketch of the redirect option just below.
That said, I don't see any evidence that Google has indexed both URL versions of your site (at least not like Dunkin' Donuts).
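To make the redirect option concrete, here's a minimal sketch in classic ASP. This is assumption-heavy: your URLs end in .asp, so classic ASP on IIS is the guess, and Volusion is a hosted platform, so whether you can actually add code like this to its templates is another assumption entirely.

```asp
<%
' Minimal sketch: 301-redirect any https request to its http equivalent.
' Assumes classic ASP on IIS with access to a shared page template.
If LCase(Request.ServerVariables("HTTPS")) = "on" Then
    Dim qs
    qs = Request.ServerVariables("QUERY_STRING")
    If qs <> "" Then qs = "?" & qs
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://" & _
        Request.ServerVariables("HTTP_HOST") & _
        Request.ServerVariables("URL") & qs
    Response.End
End If
%>
```

If template edits aren't possible, the same rule can often be configured at the server level (e.g., an IIS rewrite rule) or through the platform's support channel, which is generally cleaner than hand-rolled template code.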
Should You Switch to WordPress?
Based simply on the https issue, switching to WordPress isn't necessary. But WordPress does offer other advantages, and it's generally a very SEO-friendly platform.
There might also be other reasons to consider switching away from your current CMS.
For example, consider Google's Text-only cache of your homepage: http://webcache.googleusercontent.com/search?q=cache:http://www.uncommonthread.com/default.asp&strip=1
See how barren it is? Without taking a deep dive, it's possible the structure and technology employed by your CMS are causing indexing/crawling issues, and considerable technical effort may be required to make it SEO-friendly. I can't give you a definite answer either way, but it's something to think about.
Hope this helps!