Does Google have problems crawling SSL sites?
-
We have a site that was ranking well but recently dropped in traffic and rankings. The whole site is HTTPS, not just the shopping pages. That's the way the server is set up; they made the whole site HTTPS.
My manager thinks the drop in rankings is due to Google not crawling HTTPS. I think otherwise, but would like some feedback on this. Site is here.
-
Thanks for the replies. I think we have the HTTP redirect fixed and will work on the footer area next. Thanks again for the heads-up.
-
Google crawls and indexes secure pages - for instance, your HTTPS website has 128 pages indexed in Google. I agree with Mr. Weiss that your site needs to be fixed, however - especially the temporary redirect from HTTP to HTTPS. Was your entire website only recently moved to HTTPS (and then you lost rankings), or has it always been HTTPS? I suspect you have some other issues going on besides SSL. For instance, almost every link in your footer nav has 'dog tags' in it, which looks spammy to me.
-
I would tell your webmaster to fix the site and make only the pages that need to be secure HTTPS.
Also:
Checked link: http://www.dogtagsinc.com
Type of redirect: 302 Found
Redirected to: https://www.dogtagsinc.com/
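That 302 is a temporary redirect, which is generally not what you want for a permanent move to HTTPS; a 301 is what passes signals to the new URL. If it helps, here's a minimal sketch (Python with the requests library, using the domain from the thread) to re-run the same check once the server is fixed:

import requests

# Request the HTTP version without following redirects, so we can see
# exactly which status code and Location header the server returns.
resp = requests.get("http://www.dogtagsinc.com", allow_redirects=False)
print(resp.status_code)               # 302 today; should be 301 after the fix
print(resp.headers.get("Location"))   # https://www.dogtagsinc.com/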
Related Questions
-
How Can I Make My Site iPhone Friendly?
I have been looking into making my website iPhone-friendly, as my analytics are not great for the iPhone and I know it can be tough to navigate on one. I was told that if I make changes to the layout, it would affect my layout across everything, which I did not want. So I have two questions: Is this correct regarding the layout? If so, and I did something like m.waikoloavacationrentals.com as the mobile version, how would that affect my rankings with regard to traffic distribution? Any feedback would be appreciated. Also, if anyone has experience doing this, I would be interested in discussing it further.
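One common pattern for a separate m-dot site is a device redirect plus alternate/canonical annotations, so the two URLs share ranking signals instead of competing. A rough sketch of the redirect side (Flask is used purely for illustration; the mobile host is the one floated in the question):

from flask import Flask, redirect, request

app = Flask(__name__)
MOBILE_HOST = "m.waikoloavacationrentals.com"  # mobile host floated in the question

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def desktop_page(path):
    # Crude user-agent sniff for illustration; a real setup should also add
    # rel="alternate" on desktop pages and rel="canonical" on mobile pages
    # so search engines treat the two URLs as one page, not duplicates.
    ua = request.headers.get("User-Agent", "").lower()
    if "iphone" in ua:
        return redirect(f"http://{MOBILE_HOST}/{path}", code=302)
    return f"desktop version of /{path}"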
Web Design | RobDalton0
-
Are URL suffixes ignored by Google? Or is this duplicate content?
Example URLs:
www.example.com/great-article-on-dog-hygiene.html
www.example.com/great-article-on-dog-hygiene.rt-article.html
My IT dept. tells me the second instance of this article would be ignored by Google, but I've found a couple of instances in which Google did index the 'rt-article.html' version of the page. To be fair, I've only found a couple out of MANY. Is it an issue? Thanks, Trisha
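Google doesn't simply ignore URL suffixes: two URLs that return the same article are two pages, and duplicates are usually handled with a rel="canonical" tag or a 301. A quick sketch (Python requests, using the example URLs above) for checking how each variant responds and whether it declares a canonical:

import requests

# Both variants from the question; if the second returns 200 with the same
# article and no canonical tag, it can be indexed as a duplicate.
urls = [
    "http://www.example.com/great-article-on-dog-hygiene.html",
    "http://www.example.com/great-article-on-dog-hygiene.rt-article.html",
]
for url in urls:
    resp = requests.get(url, allow_redirects=False)
    print(url, resp.status_code, 'has canonical tag:', 'rel="canonical"' in resp.text)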
Web Design | lzhao0
-
Is anyone here managing or doing SEO for a site using GoECart?
We are preparing to update/migrate to a new ecommerce platform and are in the process of choosing one right now. One of the things we know we want is faceted navigation, but I am well aware of the problems this presents for SEO. Are any of you amazing people here using or managing GoECart, or do you have experience with it? I am interested in your feedback, particularly from an SEO viewpoint. Thanks in advance! Dana
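On the faceted-navigation point: the usual defence against the URL explosion is canonicalising every filtered view to its base category page. A minimal sketch of the idea (plain Python, hypothetical URL, regardless of which platform is chosen):

from urllib.parse import urlparse, urlunparse

# Strip facet query parameters so every filtered view points search
# engines at one base category URL instead of thousands of variants.
def canonical_for(url):
    return urlunparse(urlparse(url)._replace(query=""))

print(canonical_for("http://shop.example.com/dog-tags?color=red&size=large"))
# -> http://shop.example.com/dog-tags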
Web Design | danatanseo0
-
Will updating our site from ASP.NET 3.5 to ASP.NET 4.0 negatively affect SEO?
I've checked out some of the other posts related to .NET upgrades, but none specifically address ASP.NET 4.0. I understand that there are many advantages to upgrading, but as with any change made to site code, I want to be 110% positive that this upgrade will not affect how Google ranks my client's pages. Since the URL extension isn't changing (it will remain .aspx), I'm thinking that there won't be much of an effect on SEO at all. In fact, I'm making the argument that the upgrade will only improve rankings. Has anyone gone through this upgrade and experienced any immediate benefits or disadvantages? Thanks for your help!
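Since the worry is an accidental change in how pages respond, one low-effort safeguard is to snapshot a sample of URLs before the upgrade and compare them afterwards. A rough sketch (Python requests; the URLs are hypothetical placeholders):

import requests

# Sample of URLs to compare before and after the upgrade.
SAMPLE = ["http://www.example.com/", "http://www.example.com/products.aspx"]

def snapshot(urls):
    # Record each URL's status code without following redirects.
    return {u: requests.get(u, allow_redirects=False).status_code for u in urls}

before = snapshot(SAMPLE)   # run against the ASP.NET 3.5 site
# ... deploy the 4.0 upgrade, then run again ...
after = snapshot(SAMPLE)
print([u for u in SAMPLE if before[u] != after[u]])  # anything listed here changed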
Web Design | FreightTEK0
-
Tips on redesigning a site with messy URLs?
So I've inherited quite a messy website. It was in Drupal and the owner wants it in WordPress. One of the problems is the link paths. Should I try to recreate them exactly, i.e. something/somethingelse/page/, or use redirects (which I'm not confident in doing)? Also, some of the pages end in .html, others in a trailing slash, and others without slashes; there's no consistency. Do you have any tips in general? I remember an older SEOmoz blog post about successful website relaunches (with press releases and mass emails and stuff being sent out on launch to boot). Thanks!
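If the new WordPress URLs can't match the old Drupal paths exactly, per-URL 301 redirects are the standard answer, and they're worth verifying after launch. A small sketch (Python requests, with a hypothetical redirect_map.csv of "old URL,new URL" pairs):

import csv
import requests

# redirect_map.csv (hypothetical): one "old URL,new URL" pair per line.
with open("redirect_map.csv") as f:
    for old_url, new_url in csv.reader(f):
        resp = requests.get(old_url, allow_redirects=False)
        ok = resp.status_code == 301 and resp.headers.get("Location") == new_url
        print("OK  " if ok else "FAIL", old_url, "->", resp.headers.get("Location"))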
Web Design | seonubblet0
-
WordPress pages not indexing in Google
Hi, I've created a WordPress site for my client. I've produced 4 content pages and 1 home page, but my sitemap says I have only 1 page indexed. SEOmoz also finds only 1 page. I'm lost as to what the problem could be. The domain name is www.dobermandeen.co.uk. Many thanks for any help. Alex
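A first diagnostic is to check what the sitemap actually lists, since a sitemap containing only the home page would explain this. A short sketch (Python's standard library plus requests; the sitemap path is an assumption):

import requests
import xml.etree.ElementTree as ET

# Fetch the sitemap (assumed location) and count the URLs it declares.
resp = requests.get("http://www.dobermandeen.co.uk/sitemap.xml")
root = ET.fromstring(resp.content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(len(urls), "URLs listed:", urls)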
Web Design | SeoSheikh0
-
Managing international sites
Hi all, I am trying to figure out the best way to manage our international sites. We have two locations, one in the UK and one in the USA. I currently use GeoIP to identify the location of the browser and redirect visitors, using a cookie, to index.php?country=uk or index.php?country=usa. Once the cookie is set I use a 301 redirect to send them to index.php, so that Google doesn't see each URL as duplicate content, which Webmaster Tools was complaining about. This has been working wonderfully for about a year. It means I have a single PHP language include file, and depending on the browser location I will display $ or £ and change the odd 'ise' to 'ize', etc.

The problem I am starting to notice is that we are ranking better and better in the USA search results. I am guessing this is because the crawlers are based in the USA. This is great, but my concern is that I am losing rank in the UK, which is currently where most of our business comes from. So I have done my research, and because I have a .net domain I will go for /uk/ and /us/ subfolders and create two separate Webmaster Tools sites, set up to target each geographic location. Is this okay? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#2

HERE IS THE PROBLEM: I don't want to have to run two separate websites with two separate sets of copy. Also, I don't want to lose all the rank data on URLs like http://www.mysite.net/great-rank-result.html, which now becomes http://www.mysite.net/uk/great-rank-result.html. On top of this I will have two pages, the one just mentioned plus http://www.mysite.net/us/great-rank-result.html, which I presume would be seen as duplicate copy? (Y/N) Can I use rel canonical to overcome this? How can I do this without actually running two separate sets of pages? Could I keep one site in the root folder and use the same GeoIP technology to do a smart mod_rewrite, adding either /uk/ or /us/ to the URL, and thereby create two Webmaster Tools accounts targeting each geographic location? Any advice is most welcome.
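On the duplicate-copy worry: rel="canonical" across /uk/ and /us/ would tell Google to drop one version from the index, which isn't what's wanted here. hreflang alternate annotations are the usual way to mark the two folders as regional variants of the same page. A sketch of the idea (Python generating the tags; URLs follow the pattern in the question, and every variant of a page should carry the full set):

# Emit the hreflang link tags each variant of a page should carry so
# Google treats /uk/ and /us/ as regional alternates, not duplicates.
def hreflang_tags(path):
    variants = {
        "en-gb": f"http://www.mysite.net/uk{path}",
        "en-us": f"http://www.mysite.net/us{path}",
    }
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    )

print(hreflang_tags("/great-rank-result.html"))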
Web Design | Mediatomcat0
-
404 page not found after site migration
Hi, a question from our developer. We have an issue in Google Webmaster Tools. A few months ago we killed off one of our e-commerce sites and set up another to replace it. The new site uses different software on a different domain. I set up a mass 301 redirect that would redirect any URLs to the new domain, so domain-one.com/product would redirect to domain-two.com/product. As it turns out, the new site doesn't use the same URLs for products as the old one did, so I deleted the mass 301 redirect.

We're getting a lot of URLs showing up as 404 not found in Webmaster Tools. These URLs used to exist on the old site and were linked to from the old sitemap. Even URLs that have shown up as 404 recently say that they are linked to in the old sitemap. The old sitemap no longer exists and has been returning a 404 error for some time now. Normally I would set up 301 redirects for each one and mark them as fixed, but there are almost a quarter of a million URLs returning 404 errors, and the number is rising. I'm sure there are some genuine problems that need sorting out in that list, but I just can't see them under the mass of errors for pages that were redirected from the old site. Because of this, I'm reluctant to set up a robots file that disallows all of the 404 URLs.

The old site is no longer in the index; searching Google for site:domain-one.com returns no results. Ideally, I'd like anything that was linked from the old sitemap to be removed from Webmaster Tools, and for Google to stop attempting to crawl those pages. Thanks in advance.
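Since the two platforms use different product URLs, one way to get per-URL 301s without mapping a quarter of a million paths by hand is to auto-match old slugs against the new site's paths and review the output before deploying it. A rough sketch (Python standard library; the paths are hypothetical):

import difflib

# Old paths (e.g. pulled from the old sitemap or server logs) and the new
# site's paths; get_close_matches pairs each old slug with its likeliest
# new home so a redirect map can be generated, then reviewed by hand.
old_paths = ["/red-dog-tag-engraved", "/blue-dog-tag"]
new_paths = ["/products/engraved-red-dog-tag", "/products/blue-dog-tag"]

for old in old_paths:
    match = difflib.get_close_matches(old, new_paths, n=1, cutoff=0.4)
    print(old, "->", match[0] if match else "(no match: consider serving a 410)")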
Web Design | PASSLtd0