Do I need to implement the canonical tag on "https" or secured pages?
-
Thanks in advance!
-
Hi,
If you want those pages to be indexed and to rank well, and there is a possibility of duplicate content between the secured and non-secured versions (or any other duplicate content), you should implement the canonical tag. Google does crawl HTTPS pages (a simple search for inurl:https will show the extent of this). However, if the pages are behind checkouts or logins, blocked by robots.txt, or otherwise unavailable for crawling, there is no need to use the tag.
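To make that concrete, here is a minimal sketch of the tag, assuming the HTTPS page duplicates a non-secured page that you want treated as the preferred version (the URL is a hypothetical example, not taken from this thread):

```html
<!-- Placed in the <head> of the duplicate page (e.g. the https:// version),
     pointing at whichever version you want search engines to index.
     The href below is a hypothetical example URL. -->
<link rel="canonical" href="http://www.example.com/some-page/" />
```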
Related Questions
-
Page drops from index completely
We have a page that is ranking organically at #1, but over the past couple of months the page has twice dropped out of the results for a search term entirely. There don't appear to be any issues with the page in Search Console, and adding the page on https://www.google.com/webmasters/tools/submit-url seems to fix the issue. The search term we're tracking that drops is in the URL for the page and is the h1 of the page. Here is a screenshot of the ranking over the past few months: https://jmp.sh/akvaKGF What could cause this to happen? There is nothing in Search Console that shows any problems with the page. The last time this happened, the page completely dropped on all search terms and showed up again after submitting the URL to Google manually. This time it dropped on just one search term and reappeared the next day after manually submitting the page again.
White Hat / Black Hat SEO | | russell_ms0 -
How does Google handle product detail page links hidden in a <noscript> tag?
Hello, during my research of our website I uncovered that our visible links to our product detail pages (PDPs) from grid/list view category-nav/search pages are nofollowed and being sent through a click-tracking redirect, with the PDP appended as a URL query string. But included with each PDP link is a noscript tag containing the actual PDP link. When I confronted our 3rd party e-commerce category-nav/search provider about this approach, here is the response I received:

"The purpose of these links is to firstly allow us to reliably log the click and then secondly redirect the visitor to the target PDP. In addition to the visible links there is also an 'invisible link' inside the noscript tag. The noscript tag prevents showing of the a tag by normal browsers but is found and executed by bots during crawling of the page. Here is a link to a blog post where an SEO proved this year that the noscript tag is not ignored by bots: http://www.theseotailor.com.au/blog/hiding-keywords-noscript-seo-experiment/ So the visible links are not obfuscating the PDP URL; they have it encoded, as it otherwise cannot be passed along as a URL query string. The plain PDP URL is part of the noscript tag, ensuring discoverability of PDPs by bots."

Does anyone have anything, in addition to this one blog post, to substantiate the claim that hiding our links in a noscript tag is in fact within the SEO best practice standards set by Google, Bing, etc.? Do you think that this method skirts the fine line of grey hat tactics? Will Google/Bing eventually penalize us for this? Does anyone have a better suggestion on how our 3rd party provider could track those clicks without using a URL redirect and hiding the actual PDP link? All insights are welcome... Thanks! Jordan K.
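For anyone trying to picture the setup being described, here is a rough sketch of the markup pattern; the URLs and the tracking parameter name are hypothetical illustrations, not the actual site's implementation:

```html
<!-- Visible link: nofollowed and pointed at a click-tracking redirect, with the
     PDP URL encoded as a query-string parameter (parameter name and URLs are
     invented examples). -->
<a href="/track/click?target=%2Fproducts%2Fblue-widget" rel="nofollow">Blue Widget</a>

<!-- Hidden fallback: the plain PDP link inside a noscript tag. Browsers only
     render noscript content when JavaScript is disabled, but crawlers can still
     parse the anchor inside it. -->
<noscript>
  <a href="/products/blue-widget">Blue Widget</a>
</noscript>
```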
White Hat / Black Hat SEO | | eImprovement-SEO0 -
How can I, 100% safely, get some of my keywords ranking on the second & third page up to page one?
Hi, I want to know how I can get some of my keywords, which are on the second and third page of Google, onto page one in a way that is 100% safe, so it will pass Penguin, Panda, etc., as quickly as possible? Kind Regards
White Hat / Black Hat SEO | | rodica70 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small scale test on this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | | CSawatzky1 -
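To make the engineer's proposal concrete, here is a rough, hypothetical sketch of how one generated page section might render for a single topic/venue combination; the headings, sentences, and names below are illustrative assumptions, not the poster's actual markup:

```html
<!-- Hypothetical rendering for the venue "SharePoint / New York, NY".
     One of the standardized template sentences is chosen (at random or keyed
     to the venue code) and its [Topic Area] and [City, State] placeholders are
     filled in; the rest of the page remains location-specific. -->
<h1>SharePoint Training in New York, NY</h1>
<p>Our SharePoint training is easy to find in the New York, NY area.</p>
<!-- Followed by content specific to the location: directions, course dates, etc. -->
```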
What to do about "Penguin" penalties?
What are your suggestions about moving forward with sites hit by the "Penguin" penalties?
- Wait it out and see if the penalty goes away
- Try to remove spammy backlinks and resubmit (is this worth the time and effort?)
- Build quality backlinks to offset (will this even work if they have thousands of spammy links?)
- Blog more (I think this is probably a no-brainer)
- Scrap the site and start from scratch (this is a last resort and I don't want to do this if at all possible)
Or any other ideas are greatly appreciated.
White Hat / Black Hat SEO | | RonMedlin0 -
Is the link in a profile page a good backlink or not?
Well, I see that we need 200 MozPoints to be able to put our website link (dofollow) into our profile on SEOmoz; the way I understand it, that would be a good backlink for my site. Here are the questions; please answer them from top to bottom, because if you answered "NOT GOOD" to the first question, then the rest of the questions will definitely be "NOT GOOD" too. Every single backlink source I used below (for questions #2 and #3) comes from a good domain (it is an extremely well-known website in Indonesia).

1. Is the dofollow link from my SEOmoz profile page a good backlink?
2. Is the dofollow from http://www.indonesiaindonesia.com/m4g1c14n a good backlink?
3. Is the dofollow from http://www.kaskus.us/member.php?u=10407 (click the Contact Info) a good backlink?

OK, only if you answered the first 3 questions with "It is a good backlink, and it will definitely help the SEO standing of your site", then I ask you my real question. I was planning to use the service from http://www.monsterbacklinks.com, and I asked them to show me what kind of "high quality backlink" they will be giving me. Here is their reply, 10 examples of profiles they use to backlink to one of their clients:

Domain PR 4--http://www.sanramon.org/user/12548
Domain PR 5--http://extratasty.com/profile/42069/paulc4312
Domain PR 5--http://www.bug.co.uk/forums/members/paulc4312.html
Domain PR 5--http://www.offspring.com/forums/member.php?u=84973
Domain PR 5--http://www.massify.com/profiles/paulcpaul
Domain PR 6--http://www.gamezone.com/member/159751/
Domain PR 5--http://www.indyarocks.com/profile/profile_vview_main.php?uid=6155724
Domain PR 6--http://classic.mapmywalk.com/user_profile?u=866130762956343886
Domain PR 5--http://www.netbookreviews.com/forum/members/paulc4312.html
Domain PR 5--http://www.thepoint.com/users/paul-c-2/profile
Domain PR 5--http://forums.cagepotato.com/members/paulc4312.html

In my eyes, all of those links are as good as the one link coming from the SEOmoz profile; in fact, I have already purchased their 750 "high quality backlink" package (cost $197), but my PayPal is locked down just now, because I logged in to my account from both my cellphone and PC (they think my account was hacked). So, will I increase my SEO standing if I use their service? If so, I will finalize my purchase tomorrow (after I settle the problem with PayPal). Their FAQ page is also very convincing, with these 2 questions:

Will I get penalized for paying you to do my backlinks? "There is no way you will get penalized for paying us to do your backlinks. It is possible to get penalized for paying people to put links on their sites but that's not what you're buying from us. When purchasing from us you are paying us to place thousands of free backlinks. There is absolutely no way Google can penalize you for this."

Will Google ban/sandbox me for getting so many backlinks? "We have never had any problems with getting sandboxed or banned by Google. None of our customers have had any problems either. If our methods of placing backlinks were to get a site penalized or banned then we would be sending thousands of links towards our competitors' sites. But since our methods work great for increasing search engine rankings, we would never use our backlinking on our competitors because that would damage our rankings and boost theirs."

Please enlighten me 🙂
White Hat / Black Hat SEO | | IKT0 -
A domain is ranking for a plural keyword in SERPs on page 1, but for the singular not at all?
What could be the reasons that a domain is ranking on page 1 of the SERPs for the plural version of a keyword, but not at all for the singular version? Google knows that both keywords belong together, as in the SERPs for one version the other version of the keyword is also highlighted. If I search for the domain with the plural keyword, it shows up on the first page of the SERPs, but if I search for the same keyword as singular (in German it is just a matter of removing an "s"), I see the plural version highlighted many times but I cannot find my domain. What could be the reason for this behavior? Penalties?
White Hat / Black Hat SEO | | SimCaffe0