Do I need to implement canonical tags on "https" or secured pages?
-
Thanks in advance!
-
Hi,
If you want those pages to be indexed and rank well, and there is a possibility of duplicate content between the secured and non-secured versions (or other content), you should implement the canonical tag. Google does crawl HTTPS pages (a simple search for inurl:https will show the extent of this). However, if the pages are behind checkouts or logins, blocked by robots.txt, or otherwise unavailable for crawling, there is no need to use the tag.
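For reference, the canonical declaration can go in the page <head> as a <link> element, or be sent as an HTTP Link header, which Google also recognizes. A minimal sketch of the header approach in TypeScript with Express (the domain is a placeholder, not from this thread):

```typescript
// Minimal sketch (TypeScript + Express); the domain is a placeholder.
// The same declaration can live in the page <head> as:
//   <link rel="canonical" href="https://www.example.com/page" />
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Point every HTTPS page at its preferred URL so crawlers consolidate
// duplicate signals onto one version.
app.use((req: Request, res: Response, next: NextFunction) => {
  const canonicalUrl = `https://www.example.com${req.path}`;
  res.set("Link", `<${canonicalUrl}>; rel="canonical"`);
  next();
});

app.listen(3000);
```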
Related Questions
-
"Fake" market research reports killing SEO
Our robotics company is in a fast-growing, competitive market. There is an assortment of "market research" companies distributing press releases about their research reports (which are of dubious quality). These announcements end up being distributed through channels with high domain authority, and they mention many companies in the space that the purported report covers, including ours. As a result, our company name and product brand are suffering, since the volume of press announcements is swamping our ratings. What would you do? Start writing blog posts on topics and distribute them through inexpensive news feeds? Somehow contact the firms posting the content and let them know they are in violation of our trademarks by mentioning our name? Other ideas?
White Hat / Black Hat SEO | amelanson
-
How do I optimize pages for content that changes every day?
Hi guys, I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend. However, I'm stuck on how the pages should be structured, and I don't know how to create title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page. As you can see here http://bit.ly/1FV6x0y you can view today's horoscope. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily for Friday, so duplicate pages show across Friday, Saturday, and Sunday. If you click on today, tomorrow, and weekend, all pages showing are duplicates, and this will happen for each star sign from Friday to Sunday. My question is, will I be penalized for doing this, even if the content changes? How can I optimize the title tags and meta tags for pages that are constantly changing? I'm really stuck on this one and would appreciate some feedback on this tricky beast. Thanks in advance
White Hat / Black Hat SEO | edward-may
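As a general illustration (not specific to this site), one common pattern for daily-changing content is to keep a single stable URL per star sign and regenerate the title tag from the date, rather than creating a new page each day. A minimal sketch in TypeScript:

```typescript
// Hypothetical sketch: one stable URL per sign, with the title tag
// regenerated from the current date instead of a new page per day.
function dailyTitle(sign: string, date: Date): string {
  const formatted = date.toLocaleDateString("en-US", {
    year: "numeric",
    month: "long",
    day: "numeric",
  });
  return `${sign} Daily Horoscope for ${formatted}`;
}

// e.g. "Aries Daily Horoscope for March 14, 2015"
console.log(dailyTitle("Aries", new Date()));
```
-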
How do I 301 redirect from an old domain and its pages to a new domain and its pages?
Hi, I am a real newbie to this and I'm hoping for a guide on how to do it. I've seen a few Moz posts on the topic and found them quite confusing; hopefully somebody can explain it to me in layman's terms. I would like to 301 redirect this way (both websites cover the same niche): oldwebsite.com > newwebsite.com, and also its pages: oldwebsite.com/test > newwebsite.com/test. So my question is: I would like to host my old domain and its pages on my new website's hosting in order to redirect to my new domain and its pages. How do I do that? Would my previous pages' links overwrite my new pages' links, or would their link juice be added on? Do I need to host the whole old domain's website on my new hosting in order to redirect the old pages? Really confused here, thanks!
White Hat / Black Hat SEO | andzon
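For what it's worth, on typical Apache shared hosting this kind of path-preserving redirect is usually configured with a rewrite rule in the old domain's .htaccess rather than in application code, but the logic is the same either way. A hedged sketch in TypeScript with Express, using the placeholder domains from the question:

```typescript
// Sketch: every path on the old host 301-redirects to the same path on
// the new host, so oldwebsite.com/test lands on newwebsite.com/test.
// The 301 (permanent) status is what passes link equity.
import express from "express";

const app = express();

app.use((req, res) => {
  // req.originalUrl preserves the full path and query string.
  res.redirect(301, `https://newwebsite.com${req.originalUrl}`);
});

// The old domain's DNS would point at this server; the new site is
// hosted separately and is unaffected.
app.listen(80);
```
-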
Website that just got hit... need some tips or ideas
Hey guys, the website of the company I work for was hit in the PR update two days ago. A little history: the site received a notice from Google about spam links around 5-6 months ago. Since then, a company has been cleaning up all the spam links and managing the disavow process. In the last Penguin update (about two months ago) the site jumped like crazy in the rankings and stayed there ever since. In the last three months we created fewer than ten links to the site, and we have focused all our work on improving the site's optimization. It should be noted that the company is investing a lot in social networks, and all the work in the past three months has been white hat and clean. Now, two days ago, in the PR update (more or less), the site just dropped, and when I say dropped, I mean 200 keywords that were on pages 1-2 just went out to pages 5-7. It's like the website is gone; I've never seen anything like this. The things that pass through my head: a lot of the high-PR links pointing to the site lost their PR and are now worthless, but still, this drop? It's too extreme. Or Google received the disavow file and just disavowed a lot of links. Does anyone have any ideas or tips on the subject? Thank you
White Hat / Black Hat SEO | WayneRooney
-
Do I need to use meta noindex for my new website before migration?
I just want to know your thoughts on whether it is necessary to add a meta noindex, nofollow tag to each page of my new website before migrating the old pages to new pages under a new domain. Would it be better to just add a block in my robots.txt and then remove it once we launch the new website? Thanks!
White Hat / Black Hat SEO | esiow2013
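For context, the two approaches behave differently: robots.txt blocks crawling, so Google may never see a noindex tag sitting behind it, whereas a noindex directive (meta tag or X-Robots-Tag header) lets pages be crawled but keeps them out of the index until it is removed at launch. A minimal sketch in TypeScript with Express, using a hypothetical PRE_LAUNCH flag:

```typescript
// Sketch: send noindex, nofollow on every response while the new site
// is pre-launch, then flip the (hypothetical) flag off at launch.
// Equivalent to <meta name="robots" content="noindex, nofollow">.
import express from "express";

const app = express();
const PRE_LAUNCH = process.env.PRE_LAUNCH === "1"; // hypothetical flag

app.use((_req, res, next) => {
  if (PRE_LAUNCH) {
    res.set("X-Robots-Tag", "noindex, nofollow");
  }
  next();
});

app.listen(3000);
```
-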
New sub-domain launches thousands of local pages - is it hurting the main domain?
Would greatly appreciate some opinions on this scenario. A domain cruises along for years with top 1-3 rankings for nearly all top non-branded terms and a stronghold on branded searches; sitelinks are prominently shown with branded searches, and it always ranked #1 for most variations of the brand name. Then a sub-domain launches with over 80,000 local pages. These pages are 90-95% similar, with only the city and/or state changing to make them appear like unique local pages; not an uncommon technique, but worrisome in a post-Panda/Penguin world. Surprisingly, these pages are NOT flagged as duplicate content by the SEOmoz crawler in my campaigns. Around that same time, a very aggressive, almost entirely branded paid search campaign was launched that took 20% of the clicks previously going to the main domain organically and shifted them to PPC. My concern is this: shortly after the launch of 80k+ "local" pages on the sub-domain and the cannibalization of organic clicks through PPC, we saw sitelinks consistently drop from six-packs to three sitelinks, if they showed at all, and some sub-domains (including the newly launched one) appeared in sitelinks for the first time. There's not a clear answer here, I'm sure, but what are the experts' thoughts? Did a massive launch of highly duplicate pages, coupled with a significant decrease in organic CTR for branded terms, harm the authority of the main domain (which is only a few dozen pages), causing fewer sitelinks and a weaker domain? Or is all this a coincidence, or caused by something else we aren't seeing? Thanks for thoughts!
White Hat / Black Hat SEO | VMLYRDiscoverability
-
Shadow Page for Flash Experience
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. For example, go here: http://instoresnow.walmart.com/Kraft.aspx#/home and view the page as Googlebot, and you'll see an HTML page that is completely different from the Flash page.
1. Is this OK?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places the Flash experience does?
3. Can I put "Pinterest" pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning?
Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush
-
For traffic sent by the search engines, how much personalization/customization is allowed on a page, if any?
I want to better target my audience, so I would like to be able to address the exact query string coming from the search engine. I'd also like to add relevant sections to the site based on the geo area visitors live in. Can I customize a small portion of the page to fit my visitor's search query and geo area based on IP address? How much can I change a web page to better fit a user and still be within the search engines' guidelines?
White Hat / Black Hat SEO | Thos003
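As a general pattern (not a statement of any formal guideline), personalization is usually considered low-risk when every visitor, including Googlebot, receives the same core content and only a small supplemental block varies. A sketch in TypeScript with Express; the country header is an assumption standing in for whatever geo lookup your CDN or stack provides (CloudFront, for example, exposes CloudFront-Viewer-Country):

```typescript
// Sketch: identical main content for all visitors; only a small aside
// varies by region. The header name assumes a CDN that sets it.
import express from "express";

const app = express();

app.get("/guide", (req, res) => {
  const country =
    (req.headers["cloudfront-viewer-country"] as string | undefined) ?? "US";
  const regionalNote =
    country === "GB"
      ? "<aside>Prices shown in GBP.</aside>"
      : "<aside>Prices shown in USD.</aside>";
  // Core article content is the same for everyone, crawler or human.
  res.send(`<main>...article content...</main>${regionalNote}`);
});

app.listen(3000);
```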