Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be possible to view), we have locked both new posts and new replies.
Should sitemap include https pages?
-
Hi guys,
Trying to figure out some onsite issues I've been having. Would appreciate any feedback on the following 2 questions:
My homepage (http://mysite.com) is a 301 redirect to https://mysite.com, which is under SSL. Only 2 pages of my site are https, the rest are http.
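For reference, a homepage-only 301 to HTTPS like that is typically set up with a rewrite rule along these lines. This is only a sketch that assumes an Apache server with mod_rewrite; the actual configuration may differ:
# Sketch only: permanently (301) redirect just the homepage from HTTP to HTTPS
# Assumes Apache with mod_rewrite enabled, placed in the site root's .htaccess
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^$ https://mysite.com/ [R=301,L]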
-
Should the location of my sitemap be https://mysite.com/sitemap.xml, or should it be kept as http (even though the homepage redirects to https)?
-
Should my sitemap include the https pages (only 2 pages) as well as the http pages?
Thanks,
G
-
-
Hi Federico,
On the Google Sitemaps Errors help page, they include the following information:
"You should also check that the URLs all begin with the same domain as your Sitemap location. For instance, if your Sitemap is listed under http://www.example.com/sitemap.xml, the following URLs are not valid for that Sitemap:
http://www.google.com
— it's in the google.com domain rather than the example.com domainhttp://example.com/
— it's missing the initialwww
www.example.com/
— it's missing the protocol (http), and will generate an Invalid URL warninghttps://www.example.com/
— it's using a different protocol (https
rather thanhttp
)
Any URLs in the Sitemap that are not denied are processed normally."
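To make that concrete: on a strict reading of the above, a sitemap served from http://www.example.com/sitemap.xml could only list URLs that share that exact protocol and host. A minimal sketch (the paths are made up purely for illustration):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- every <loc> matches the sitemap's own protocol (http) and host (www.example.com) -->
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>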
This leads me to understand that Google doesn't want you to put HTTP URLs in an HTTPS sitemap, and vice versa. What makes you believe otherwise?
Hoping to get to the bottom of this - thanks for the ongoing feedback
-
Those suggesting not to add the SSL pages to the HTTP sitemap are relying on information from back in 2007, when Google did show an error on sitemaps listing both HTTP and HTTPS pages, because they were treated as different domains. Those days are long gone. Google has evolved and can now handle sitemaps with both HTTP and HTTPS pages just fine.
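In other words, a single sitemap for a setup like yours could simply mix the two protocols. A minimal sketch (the inner-page paths here are invented just to illustrate):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- HTTPS homepage listed alongside plain-HTTP inner pages -->
  <url><loc>https://mysite.com/</loc></url>
  <url><loc>http://mysite.com/services/</loc></url>
  <url><loc>http://mysite.com/contact/</loc></url>
</urlset>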
-
Thanks for the input, Federico. I've been getting various different answers to this question.
Most responses have said that we should submit 2 sitemaps: one sitemap listed under http that only includes the http pages of the site (which means we wouldn't include our homepage, since it's under https!).
And one sitemap listed on the https version that only includes the https pages (which is only 2 pages!).
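As I understand it, that advice would boil down to something like this in robots.txt (the file names are made up for illustration), with each file listing only the URLs of its own protocol:
User-agent: *
Sitemap: http://mysite.com/sitemap-http.xml
Sitemap: https://mysite.com/sitemap-https.xml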
To be honest, I still don't know what to do here. It's really frustrating that there is no clear-cut answer to our situation, which I can't believe is even that unique.
-
G,
It wouldn't make any difference whether you serve the sitemap over HTTP or HTTPS. As for having http and https pages within the same sitemap, that isn't a problem either.
The only reason I can see for creating multiple sitemaps is for content types that require their own sitemap, such as images or videos, which need to be kept separate from your HTML pages.
Does your site use PHP? If so, I suggest you try xml-sitemaps.com; it will create the full sitemap for you. If you have a dynamic site, then I suggest getting their commercial version. I've been using it for over 7 years, I think, and I always get a copy for each site I create. They also offer lots of extras in case you need them (news sitemaps, etc.).
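And if you ever do need those separate image or video sitemaps, you can tie them together with a single sitemap index file. A rough sketch (the file names are invented for illustration):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- the index simply points at the individual sitemap files -->
  <sitemap><loc>https://mysite.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://mysite.com/sitemap-videos.xml</loc></sitemap>
</sitemapindex>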
-
Hey Federico,
Thanks again for the insight - much appreciated.
So there's no problem for us to create a sitemap that has the https homepage and then the rest of the pages in http? From reading previous Q&As on this topic it seems as though people felt you shouldn't have https and http pages under the same sitemap - I am a novice here so that's why I'm just looking for advice.
Is there any reason why we would need to have the two sitemaps available - as in, why wouldn't we just remove the old http sitemap (that didn't include the https homepage) and just go with the https homepage sitemap?
I just wanted to make sure I understood your response before we take action.
Cheers,
-G
-
Hey G!
You can serve your sitemap under either version; that won't be a problem and won't trigger any duplicate content issue. So you are safe both ways.
As for the second question: yes, you should, unless there are pages (HTTP or HTTPS) that you don't want indexed. I think I saw your site before, and if I remember correctly you had your homepage and login script under SSL, right? Then you should definitely include your homepage in the sitemap, but you can leave the login script file out, as you don't need it indexed and Google won't index it anyway.
Once you have your sitemap ready, consider including a path in the robots file, like this:
User-agent: *
Sitemap: http://[your website address here]/sitemap.xml
Hope that helps!