Does Google have a problem crawling SSL sites?
-
We have a site that was ranking well and recently dropped in traffic and rankings. The whole site is HTTPS, not just the shopping pages. That's the way the server is set up; they made the whole site HTTPS.
My manager thinks the drop in rankings is due to Google not crawling HTTPS. I think otherwise, but would like some feedback on this. Site is here
-
Thanks for the replies. I think we have the HTTP redirect fixed and will work on the footer area next. Thanks again for the heads up.
-
Google crawls and indexes secure pages - for instance, your HTTPS website has 128 pages indexed in Google. I agree with Mr. Weiss that your site needs to be fixed, however - especially the temporary redirect from HTTP to HTTPS. Was your entire website only recently moved to HTTPS (after which you lost rankings), or has it always been HTTPS? You probably have some other issues going on besides SSL itself. For instance, almost every link in your footer nav has 'dog tags' in it, which looks spammy to me.
-
I would tell your webmaster to fix the site, and make only the pages that need to be secure HTTPS.
also:
Checked link: http://www.dogtagsinc.com
Type of redirect: 302 Found
Redirected to: https://www.dogtagsinc.com/
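A 302 is a temporary redirect, and Google treats it differently from a permanent 301; if the site is meant to live permanently on HTTPS, the redirect should be a 301. A minimal sketch of a rewrite rule for this, assuming an Apache server with mod_rewrite enabled (adjust for your actual host):

```apache
# Force a permanent (301) redirect from HTTP to HTTPS site-wide
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```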
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile-friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs - one for desktop, one for tablet, one for phone.
Web Design | CurtisB
The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times, and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs.

But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. In my case the rendered page - after it loads and the jQuery code executes - contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs.

Here is the layout of one landing page, as served by the server:

    <div id="desktop">1000 words of text goes here.</div>
    <div id="tablet"><!-- No text. jQuery will copy the text from div id="desktop" into here. --></div>
    <div id="phone"><!-- No text. jQuery will copy the text from div id="desktop" into here. --></div>

My question is: will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!
-
No meta description pulling through in SERPs with React website - requesting indexing and submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap and there was no change in the SERPs. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this URL and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable.

My main questions are:
1. If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap?
2. Is submitting each URL manually bad, and if so, why?
3. Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs?

Any other suggestions?
Web Design | DigitalMarketingSEO
-
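One quick check for a JavaScript-rendered site like this is whether the meta description is present in the raw HTML the server sends, which is what Google fetches before any rendering happens. A minimal Python sketch using only the standard library (the function and class names are my own, for illustration):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content attribute of <meta name="description"> from raw HTML."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us
        if tag == "meta":
            attr_map = dict(attrs)
            if (attr_map.get("name") or "").lower() == "description":
                self.description = attr_map.get("content")

def extract_meta_description(html):
    """Return the meta description found in the served HTML, or None if absent."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description
```

If this returns None for the HTML you get from a plain fetch of the page (no JavaScript executed), the description only exists after client-side rendering, which would explain why it is unreliable in the SERPs.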
Looking to remove SSL because it is causing very slow website download speeds. Does WP have a plugin that redirects SSL URLs to non-SSL URLs?
After some extended debate with our web development team, we are considering dropping SSL from our website because it is adding almost 2 seconds to our load times. We know there is an SEO boost from having SSL, but we believe the extended load times may be outweighing the benefit. However, we are concerned about the SEO implications of having no method of redirecting SSL to non-SSL webpages. Does anybody know of a WordPress plugin that can force-redirect SSL URLs to non-SSL URLs?
Web Design | RosemaryB
-
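For what it's worth, a plugin isn't strictly required for the redirect itself: the same kind of rewrite rule commonly used to force HTTPS can be reversed in .htaccess. A sketch, assuming an Apache host with mod_rewrite (you would still need to update WordPress's Site URL settings and check canonical tags, and note that slow HTTPS is usually a server-configuration issue rather than inherent to SSL, so tuning TLS may be the better fix):

```apache
# Permanently (301) redirect all HTTPS requests to plain HTTP
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```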
What causes rankings to drop when moving a site?
Hi, we recently moved a PHP-based site from one web developer to another (and switched hosting providers as well). Amid the move our rankings drastically dropped, and our Citation Flow and Trust Flow were literally cut in half according to Majestic SEO. What could have caused this sudden drop?
Web Design | | Syed_Raza0 -
Question #1: Does Google index https:// pages? I thought they didn't because....
generally the difference between https:// and http:// is that the s (which stands for secure, I think) is usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is stored). My site that all of my questions revolve around is built with Volusion (I'm used to WordPress) and I keep finding problems like this one. The site was hardcoded so that all MENU internal links (which were 90% of our internal links) lead to https://www.example.com/example-page/ instead of http://www.example.com/example-page/.

To double-check that this was causing a loss in link juice, I jumped over to OSE. Sure enough, the internal links were not being indexed; only the links that were manually created and set NOT to include the https:// were being indexed. So if OSE wasn't counting the links, and based on the general ideology behind secure HTTP access, that would infer that no link juice is being passed... right?

Thanks for your time. Screens are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed. The problem is... is this a Volusion problem? Should I switch to WordPress? Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress): http://www.uncommonthread.com/
Web Design | TylerAbernethy
-
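To quantify a problem like this, one option is to audit how many internal links in the page source point at https:// versus http:// URLs. A minimal standard-library Python sketch (the names here are my own, for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkSchemeCounter(HTMLParser):
    """Tallies <a href="..."> links by URL scheme (http vs https)."""
    def __init__(self):
        super().__init__()
        self.counts = {"http": 0, "https": 0}

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            scheme = urlparse(href).scheme
            # Relative links have an empty scheme and are ignored here
            if scheme in self.counts:
                self.counts[scheme] += 1

def count_link_schemes(html):
    """Return a count of absolute links per scheme in the given HTML."""
    parser = LinkSchemeCounter(
)
    parser.feed(html)
    return parser.counts
```

Running this over the menu markup would show how much of the internal linking is locked to the HTTPS versions of the pages.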
Does anyone know how much data a WordPress site can store? I want to put all my movies on it and use it as a personal global external hard drive! Thanks!!
So basically, I have about 500 GB of movies on my computer and I don't want to buy an external hard drive - I don't want to spend the money. A website I could access anytime and anywhere, without having to carry my external with me everywhere I go, would be ideal. Thanks in advance for any help/references.
Web Design | | TylerAbernethy1 -
How to verify http://bizdetox.com for Google Webmaster Tools
Hey guys, I tried to make a Preferred Domain choice in Webmaster Tools, but it is not allowing me to save my choice because it's asking me to verify that I own http://bizdetox.com. How do I go about doing that, and what are the steps? I have already verified www.bizdetox.com.
Web Design | | BizDetox0 -
Best way to set up a site with multiple brick-and-mortar locations across Canada
I have a client who is expanding his business locations from 2 cities to 3, and working towards having 10+ locations across Canada. Right now we're building location-based landing pages for each city, as well as keyword-targeted landing pages for each city. For example, landing pages for "Vancouver whatever clinic" and "Calgary whatever clinic", as well as for "Vancouver specific service" and "Calgary specific service". This means a lot of landing pages will need to be created to target each of 10 or so desirable "service" keywords for each city's location.

I've no issue with this; however, I was wondering how other companies go about this? What's the best way to be relevant for certain "service"-based keyword searches in each city? Many of the "service" keywords are 'localized', meaning they will show Google Places results for local brick-and-mortar businesses in each location. I'm quite good at optimizing locally for this type of thing. However, many of the "service" keywords are not yet 'localized' by Google, and I'd want my client's webpages to show well in the SERPs for these 'non-localized' "service" keywords as well. The new site will be built in WordPress.
Web Design | AndyKuiper
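For location pages like these, one common complement (not mentioned in the thread above) is LocalBusiness structured data on each city page, so Google can tie the page to the physical location. A hypothetical JSON-LD sketch, where every value is a placeholder:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Clinic - Vancouver",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Vancouver",
    "addressRegion": "BC",
    "addressCountry": "CA"
  },
  "telephone": "+1-604-555-0100",
  "url": "https://www.example.com/vancouver/"
}
```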