On-Page Report Card with https
-
Hi,
Our site has a 301 redirect to https and I'm getting two different grades for my pages depending upon whether I type:
https://www.domain.com (gets an A grade)
or www.domain.com (gets a C grade)
Is there a setting I need to use to make sure my campaign knows our site is at https?
Thank you
-
Hi there,
I'd have to know the keywords you were using to say for sure, but I just tested a keyword from your campaign and saw similar results. It's most likely that when you run the report without the https, you aren't getting credit for the canonical tag you have.
When you run it for https, you are getting credit for appropriate use of the rel canonical tag.
That's why you're seeing two different scores. If you want to dig deeper, you're welcome to provide keywords here or email help@seomoz.org.
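If you want to verify which canonical tag each version of the page actually serves, a quick check can be scripted. A minimal sketch, assuming Python and a hypothetical page (the regex expects rel before href, which is enough for a spot check):

```python
import re

def find_canonical(html):
    """Return the href of the first <link rel="canonical"> tag, or None.

    Assumes rel appears before href in the tag -- fine for a quick
    spot check, not a full HTML parser.
    """
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return match.group(1) if match else None

# Hypothetical page source
page = '<head><link rel="canonical" href="https://www.example.com/" /></head>'
print(find_canonical(page))  # https://www.example.com/
```

Running this against the http and https versions of a URL shows whether both actually serve the same canonical tag.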
I hope that helps.
Cheers,
Joel.
-
Hi Prasad,
Thanks for the reply!
It's actually not help with the grade that I need. The problem is that www.ownmyip.com gets a C grade while https://www.ownmyip.com gets an A grade. I'm wondering why the same page grades differently depending on the use of https or http.
Maybe it doesn't matter to our SEO but clarification would be cool.
Thanks again.
-
Hi,
I haven't found any place to set whether the domain is http or https, but you can get an idea of why this happens by going through the suggestions given by the On-Page Optimization tool.
It would also be better if you could post here the suggestions you got with that C grade; hopefully that will help others give you a good response.
In any case, I always suggest using the actual URL of the page when checking an on-page report with the tool.
Regards
Prasad
Related Questions
-
Duplicate Page Content
Hi, I am new to the Moz Pro community. I got the below message for many of my pages. We have a video site, so all content on each page except the video link would be different. How can I handle such pages? Can we place AdSense ads on these pages? Duplicate Page Content: Code and content on this page looks similar or identical to code and content on other pages on your site. Search engines may not know which pages are best to include in their index and rankings. Common fixes for this issue include 301 redirects, using the rel=canonical tag, and using the parameter handling tool in Google Webmaster Central. For more information on duplicate content, visit http://moz.com/learn/seo/duplicate-content. Please help me understand how to handle this. Regards
On-Page Optimization | | Nettv0 -
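As an aside on the 301-redirect fix that message suggests: the decision logic can be sketched as a pure function (Python; the hostname is hypothetical, and in practice this would live in your web-server config rather than application code):

```python
def canonical_redirect(host, path, canonical_host="www.example.com"):
    """Return (status, location): a 301 to the canonical https host
    for any duplicate host, or (200, None) if already canonical."""
    if host != canonical_host:
        return 301, f"https://{canonical_host}{path}"
    return 200, None

# A request for a non-canonical host gets bounced to one canonical URL
print(canonical_redirect("example.com", "/video/123"))
# (301, 'https://www.example.com/video/123')
```

The point of the 301 is that every duplicate host or scheme collapses to exactly one URL, so search engines only ever index the canonical version.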
Sub-pages have no PA
I took over a website a few months ago which is not performing well at all for its chosen keywords. When I first inspected it, I found a rel canonical tag pointing to the homepage on every page. This was quickly deleted and all the pages were fetched in Webmaster Tools. Three months later, the website is still performing badly. When I use the MozBar, it shows that all of the sub-pages have a PA of 1. It is only a small site and all of the pages are linked to from the navbar in a simple way. The links are not made using JavaScript, and all the pages are on the sitemap, which is submitted to WMT. I have checked that all of the changes that have been made have been indexed as well. Could it be possible that Google still sees the canonical tag even though it's not there? I can't think of any other reason why the pages have no PA, or why the site is so far behind its competitors despite having better content and links. Also, the site is appropriate for adults, but I found (among the mess left for me) a meta rating tag set to "general". This has now been deleted; could it negatively affect rankings?
On-Page Optimization | | maxweb0 -
Help with the indexation of my page
Hi all, I have a problem with my website. When writing site:www.pinesapiensa.com there are no pages indexed, although Webmaster Tools tells me that the sitemap file was processed on 13 May and the number of indexed pages is 21. What could be happening? I should also mention that there are two domains, "www.piensapiensa.es" and "www.piensapiensa.com", addressing the same website, and there's a redirection from piensapiensa.com to piensapiensa.com but it doesn't work properly. Thanks
On-Page Optimization | | juanmiguelcr0 -
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages, however according to our Google Webmaster Tools account Google has around 26,000 pages for us in its index. I have run through half a dozen sitemap generators and they all discover only the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages, without success. It has been over six months since we made any structural changes (at which point we did 301s to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week. I'm certain it's nothing to worry about, however for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, however this only returns the first 1,000 results, which all check out fine. I was wondering if anybody knew of any methods or tools that we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and that they're not going to cause us a problem? Thanks guys!
On-Page Optimization | | ChrisHolgate0 -
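If you can get a list of indexed URLs from somewhere (a crawl export, scraped search results), one way to isolate the extras is to diff that list against your sitemap. A minimal sketch, assuming Python and standard sitemap XML; the URLs and the indexed list are hypothetical:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

# Hypothetical sitemap and list of URLs found in the index
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""
indexed = {"https://www.example.com/", "https://www.example.com/old-page"}

# Indexed URLs that are not in the sitemap -- the "extra" pages
print(sorted(indexed - sitemap_urls(sample)))
# ['https://www.example.com/old-page']
```

The hard part is still sourcing the indexed list, but once you have any partial list, set subtraction like this quickly shows whether the extras are old 301'd URLs or genuinely new duplicates.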
On page report card instances of key terms in the body - I can't find them all
Hi guys, I'm using the on-page report card for a particular page which is returning 22 instances of my term, however when I check the source of the page, I can only find 6 instances across the whole page, let alone the body. The site is www.sportsbet.com.au, and the term is "horse racing". I'm sure I'm missing something; I'd appreciate any explanation for this apparent discrepancy. Cheers, Jez
On-Page Optimization | | jez0000 -
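One common reason for this kind of discrepancy is that a tool may count occurrences anywhere in the raw markup (alt text, title attributes, meta tags), not just in the visible body text. I don't know exactly what Moz's tool counts, but a rough sketch of a raw-source count (Python, with hypothetical markup) shows why the numbers can differ:

```python
import re

def count_term(html, term):
    """Case-insensitive count of a phrase anywhere in the raw markup,
    including alt text, title attributes, and other non-visible text."""
    return len(re.findall(re.escape(term), html, re.IGNORECASE))

# Only one of these three occurrences is visible body text
snippet = '<img alt="Horse Racing odds"><a title="horse racing">Horse racing news</a>'
print(count_term(snippet, "horse racing"))  # 3
```

Browser "find on page" only searches rendered text, so comparing it against a raw-source count like this often explains the gap.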
Multiple silos/products/landing pages. How to design the root page for conversion?
Hi everyone, First post. I tried a few awkward searches on the topic, but I must be using bad keywords. I'm redesigning a site that has multiple products matching multiple audiences. This means we have multiple silos for multiple groups of keywords, with supporting pages for each silo landing page. Currently I'm working on updating the look and text of those landing pages for each silo to increase conversion. This leaves me with the root web page. We get quite a lot of search traffic from people searching for our brand name, so this results in clicks straight through to our root domain. There are no product-specific landing pages for that traffic, because it could be any one of the 3-5 different personas we have hitting the site from that source. Does anyone have any good examples of where a site has had multiple products and needed to segment its audience on the root page? I'd like to see some examples and hear people's thoughts. At the moment I'm thinking I need to fill that page with trust factors explaining why people should use us as a company, along with navigational elements for each and every product so visitors can click through to the proper landing page. The main way I can see of executing that is a rotating banner with the same tag line, "this is what we do", alternating between banners relating to each product, each with its own click-through button to the respective landing page. Thoughts, anyone? Examples of sites doing this well?
On-Page Optimization | | specific0 -
On page links?
Hi all, I've been going through the pages on my site getting rid of errors so I can work from a clean slate and get the best for my site. However, I have a large number of pages which are flagged up by the SEOmoz Pro tool as having too many on-page links. How bad is this in terms of SEO rankings? Thanks
On-Page Optimization | | wazza19850 -
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we'll presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | | smaavie5
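For the canonical-tag option mentioned in that question, one approach is to compute a single canonical URL per filter combination: keep only the recognized filter parameters, sort them so parameter order doesn't create variants, and drop pagination. A sketch under those assumptions (Python; the parameter names are taken from the example URLs, including the site's own spelling of "preperation-time"):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

# Filter parameters that define a distinct result set; pagination
# ("start") is deliberately excluded so every page of one search
# shares a single canonical URL.
FILTER_PARAMS = {"course", "cooking-method", "preperation-time"}

def canonical_url(url):
    """Keep only recognized filter params, sorted; drop everything else."""
    parts = urlparse(url)
    query = [(k, v) for k, v in sorted(parse_qsl(parts.query))
             if k in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

print(canonical_url("https://example.com/find-a-recipe.php?start=30&course=salad"))
# https://example.com/find-a-recipe.php?course=salad
```

Emitting this value in each page's rel=canonical tag collapses the paginated and reordered variants of one search into a single indexable URL, while distinct filter combinations keep distinct canonicals.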