Sitemaps: Best Practice
-
What should and what shouldn't go in the sitemap?
In particular, what about pages like "subscribe to our newsletter" / "unsubscribe from our newsletter"? Is there really any benefit in highlighting those pages to the search engines?
Thanks for any advice/anecdotes.
-
Sometimes people think adding a sitemap to their company website is something that's very difficult to do.
For example, they may think they need a web designer to do it for them, yet often you can do it yourself; it's very simple.
If your business has a WordPress website, adding a sitemap can be a piece of cake.
Yoast is a free plugin that generates an XML sitemap for you automatically, which you can then submit to Google Search Console for indexing.
We did this for a large garden room company in Bristol, and it helps ensure every single page and blog post gets crawled and considered for indexing.
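For reference, the file a plugin like Yoast generates follows the standard sitemaps.org protocol. A minimal hand-rolled version (with a placeholder domain and date) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/some-page</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

A plugin will keep `<lastmod>` up to date automatically, which is the main reason to let it generate the file rather than maintaining one by hand.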
-
Pages that I like to call 'core' site URLs should go in your sitemap: basically, unique (canonical) pages which aren't highly duplicated and which Google would want to rank.
I would include those core addresses.
I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs, or parameter-based children of canonical pages (e.g. example.com/some-page is fine to rank, but not example.com/some-page?tab=tab3). Parameters are the extra funky stuff appended to URLs after "?" or "&".
There are exceptions to these rules: some sites use parameters to render their on-page content, even for canonical addresses. Those older architecture types are fast dying out, though. If you're on WordPress I would index categories, but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
Try crawling your site using Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file. Filter the file to see which types of addresses exist on your site and which technologies are in use. Feed Google the unique, high-value pages that you know it should be ranking.
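That filtering step can be sketched in a few lines of Python. This is a rough sketch, not a definitive recipe: the column names (`Address`, `Indexability`) match what Screaming Frog's internal-URL export typically uses, but they and the skip-list of file extensions are assumptions you should adjust to your own export.

```python
from urllib.parse import urlsplit

def filter_sitemap_candidates(rows):
    """Keep unique, canonical-looking, high-value pages from a crawl export.

    `rows` are dicts keyed by column name, e.g. from csv.DictReader over a
    Screaming Frog export. Assumed columns: 'Address', 'Indexability'.
    """
    # File types that are resources/documents, not rankable pages (assumed list).
    skip_extensions = ('.pdf', '.jpg', '.png', '.gif', '.js', '.css', '.swf', '.zip', '.exe')
    keep, seen = [], set()
    for row in rows:
        url = row['Address']
        parts = urlsplit(url)
        if parts.query:
            continue  # parameter-based child of a canonical page, e.g. ?tab=tab3
        if parts.path.lower().endswith(skip_extensions):
            continue  # uploaded document or static resource
        if row.get('Indexability', '').lower() != 'indexable':
            continue  # blocked, noindexed, or canonicalised elsewhere
        if url not in seen:  # de-duplicate
            seen.add(url)
            keep.append(url)
    return keep
```

Read the export with `csv.DictReader`, pass the rows through, and the survivors are your sitemap candidates.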
I have said not to feed pagination URLs to Google, but that doesn't mean they should be completely de-indexed; I just think that XML sitemaps should be pretty lean and streamlined. You can allow things which aren't in your XML sitemap to have a chance of indexation, but if you have used something like a meta noindex tag or a robots.txt rule to block access to a page, **do not** then feed it to Google in your XML. Try to keep **all** of your indexation signals in line with each other!
No page which points to another, separate address via a canonical tag (thus declaring itself non-canonical) should be in your XML sitemap. No page that is blocked via meta noindex or robots.txt should be in your sitemap XML either.
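For clarity, the meta noindex tag mentioned above is the standard robots meta tag; it goes in the page's `<head>`:

```html
<meta name="robots" content="noindex, follow">
```

The `follow` part keeps crawlers following the page's links even though the page itself is excluded from the index. Any URL carrying this tag should never appear in your sitemap XML.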
If you end up with too many pages, think about creating a sitemap XML index instead, which links through to other, separate sitemap files.
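The sitemaps.org protocol caps each sitemap file at 50,000 URLs (and 50MB uncompressed), which is when an index file becomes necessary. A minimal index, with placeholder filenames, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index file to Google Search Console, and the individual sitemaps are discovered through it.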
Hope that helps!
-
To follow on from this, we have some parameter URLs in our sitemap which make me uneasy. Should url.com/blah.html?option=1 be in the sitemap? If so, what benefit is that giving us?