Sitemaps: Best Practice
-
What should and what shouldn't go in the sitemap?
In particular, what about pages like 'subscribe to our newsletter' and 'unsubscribe from our newsletter'? Is there really any benefit in highlighting those pages to the search engines?
Thanks for any advice/anecdotes.

-
Sometimes people think adding a sitemap to their company website is something that's very difficult to do. For example, they may think they need a web designer to do it for them, yet often you can do it yourself; it's very simple.
If your business has a WordPress website, adding a sitemap can be a piece of cake. Yoast is a free plugin that generates a sitemap for you automatically, which you can then submit to Google Search Console for indexing.
We did this for a large garden room company in Bristol, and it helps make sure every single page and blog post can be discovered and indexed.
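As a pointer: Yoast publishes the generated sitemap index at /sitemap_index.xml by default (the exact path can vary with your setup), and as well as submitting it in Search Console you can advertise it in your robots.txt, something like this (example.com is just a placeholder domain):

```
User-agent: *
Allow: /

# Yoast's default sitemap index location; adjust for your install
Sitemap: https://example.com/sitemap_index.xml
```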
-
Pages that I like to call 'core' site URLs should go in your sitemap: basically, unique (canonical) pages which aren't heavily duplicated and which you would want Google to rank.
I would include those core addresses.
I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs or parameter-based children of canonical pages (e.g. example.com/some-page is OK to rank, but not example.com/some-page?tab=tab3). Parameters are the additional funky stuff added to URLs following "?" or "&".
There are exceptions to these rules: some sites use parameters to render their on-page content, even for canonical addresses. Those old architecture types are fast dying out, though. If you're on WordPress I would index categories, but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
Try crawling your site using Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file. Filter the file, see which types of addresses exist on your site and which technologies are being used. Feed Google the unique, high-value pages that you know it should be ranking
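If you want to script that filtering step, here's a minimal Python sketch. It assumes a Screaming Frog 'Internal' export saved as internal_all.csv, and the column names ('Address', 'Status Code', 'Content Type') match a typical export but may differ in your version:

```python
import csv
from xml.sax.saxutils import escape

# Assumed input: a Screaming Frog "Internal" crawl export.
INPUT_CSV = "internal_all.csv"
OUTPUT_XML = "sitemap.xml"

def is_core_url(row):
    """Keep unique, canonical-looking HTML pages only."""
    url = row["Address"]
    if row["Status Code"] != "200":
        return False                      # skip redirects and errors
    if "text/html" not in row.get("Content Type", ""):
        return False                      # skip images, JS, CSS, PDFs etc.
    if "?" in url or "&" in url:
        return False                      # skip parameter-based children
    return True

with open(INPUT_CSV, newline="", encoding="utf-8") as f:
    urls = [row["Address"] for row in csv.DictReader(f) if is_core_url(row)]

with open(OUTPUT_XML, "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in sorted(set(urls)):         # de-duplicate, keep it lean
        f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
    f.write("</urlset>\n")
```

Obviously adjust the is_core_url() rules to whatever 'core' means for your own site's architecture.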
I have said not to feed pagination URLs to Google, but that doesn't mean they should be completely de-indexed; I just think that XML sitemaps should be pretty lean and streamlined. You can allow things which aren't in your XML sitemap to have a chance of indexation, but if you have used something like a Meta no-index tag or a robots.txt edit to block access to a page, **do not** then feed it to Google in your XML. Try to keep **all** of your indexation signals in line with each other!
No page which points to another, separate address via a canonical tag (thus declaring itself 'non-canonical') should be in your XML sitemap. No page that is blocked via Meta no-index or robots.txt should be in your XML sitemap either.
If you end up with too many pages, think about creating a sitemap XML index instead, which links through to other, separate sitemap files.
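For reference, a sitemap index per the sitemaps.org protocol looks like the snippet below; each child sitemap file is capped at 50,000 URLs and 50MB uncompressed, which is usually what forces the split (example.com and the file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```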
Hope that helps!
-
To follow on from this: we have some parameter URLs in our sitemap which make me uneasy. Should url.com/blah.html?option=1 be in the sitemap? If so, what benefit is that giving us?