Sitemap use for very large forum-based community site
-
I work on a very large site with two main types of content: static landing pages for products, and user-created forums & blogs under each product. The site has maybe 500k-1 million pages. We do not have a sitemap at this time.
Currently our SEO discoverability is generally good; Google indexes new forum threads within roughly 1-5 days. Some of the "static" landing pages for our smaller, less-visited products, however, do not perform as well.
The question is: could our SEO be improved by creating a sitemap, and if so, how could it be implemented? I see a few ways to go about it:
- The sitemap includes only the "static" product category landing pages - i.e., the product home pages, the forum landing pages, and the blog list pages. This would probably end up being 100-200 URLs.
- The sitemap contains the above but is also dynamically updated with new threads & blog posts.
Option 2 seems like it would make the sitemap unmanageably long (hundreds of thousands of forum URLs). Would a crawler even parse something that size? Or, with Option 1, could it cause our organically ranked pages to change ranking because Google re-prioritizes the pages listed in the sitemap?
There's not a lot of information out there on this topic, so I'd appreciate any input. Thanks in advance.
-
Agreed - you'll likely want to go with option #2. Dynamic sitemaps are a must when you're dealing with large sites like this, and we recommend them to all of our clients with larger sites. If your forum content is important for search, then it's definitely important to include: that content likely changes often and may sit naturally deeper in the site architecture.
In general, I'd think of sitemaps from a discoverability perspective rather than a ranking one. The primary goal is to give Googlebot an avenue to crawl your site's content regardless of internal linking structure.
-
Hi
Go with option 2; there is no scaling issue here. I have worked with and for sites submitting many times that number of sitemaps and pages, in some cases up to 100M pages. In all cases, Google had no trouble crawling and processing the data. As long as you follow the guidelines (max 50K URLs per sitemap file), you're fine - you're just providing additional files, each of which usually stays under the 50MB limit (depending on whether you also add images to the sitemap). If you have an engineering team build the right infrastructure, you can easily manage thousands of these files and regenerate them automatically every day or week.
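To make that concrete, here is a minimal sketch of the kind of generation job such a team could run on a daily or weekly schedule. The function name, the (URL, lastmod) tuple format, and the example domain are all illustrative assumptions, not something from this thread:

```python
# Sketch: split a large URL set into <=50,000-URL sitemap files plus one sitemap index.
from datetime import date
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol

def write_sitemaps(urls, base_url="https://www.example.com"):
    """urls: iterable of (loc, lastmod) tuples, e.g. pulled from the forum/blog database."""
    urls = list(urls)
    sitemap_files = []
    for i in range(0, len(urls), MAX_URLS):
        chunk = urls[i:i + MAX_URLS]
        filename = f"sitemap-{i // MAX_URLS + 1}.xml"
        with open(filename, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for loc, lastmod in chunk:
                f.write(f"  <url><loc>{escape(loc)}</loc><lastmod>{lastmod}</lastmod></url>\n")
            f.write("</urlset>\n")
        sitemap_files.append(filename)

    # One index file referencing every chunk file.
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_files:
            f.write(f"  <sitemap><loc>{base_url}/{name}</loc><lastmod>{date.today().isoformat()}</lastmod></sitemap>\n")
        f.write("</sitemapindex>\n")
```

You then submit only the sitemap index in Search Console; Google discovers the individual chunk files from there.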
On big sites, my main focus is also to streamline the sitemaps: keep one sitemap with just the last 50,000 pages that were added, and another with the last 50,000 pages that were updated. This way you can also monitor the indexation level of these pages. If you can combine that with, for example, log file analysis, you can say: we added 50K pages, and in the last few days Google was able to crawl X percent of them.
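As a rough illustration of that monitoring step - assuming you already have the URL list from the newest-pages sitemap and a list of URLs Googlebot requested according to your access logs (both hypothetical inputs here):

```python
# Sketch: what share of the "latest pages" sitemap has Googlebot actually crawled?
def crawl_coverage(sitemap_urls, googlebot_log_urls):
    """sitemap_urls: URLs in the newest-pages sitemap; googlebot_log_urls: URLs
    Googlebot requested in the access logs over the window you care about."""
    sitemap_set = set(sitemap_urls)
    crawled = sitemap_set & set(googlebot_log_urls)
    return len(crawled) / len(sitemap_set) if sitemap_set else 0.0

# e.g. "we added 50K pages and Google crawled 72% of them in the last few days"
```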
Hope this gives you some extra insights.
Martijn.
Related Questions
-
Content from Another Site
Hi there - I have a client that says they'll be "serving content by retrieving it from another URL using loadHTMLFile, performing some manipulations on it, and then pushing the result to the page using saveHTML()." Just wondering what the SEO implications of this will be. Will search engines be able to crawl the retrieved content? Is there a downside (I'm assuming we'll have some duplicate content issues)? Thanks for the help!!
Technical SEO | NetStrategies
-
Upgrade old sitemap to a new sitemap index - how to do it without danger?
Hi Moz users and friends. I have a website with a PHP template we developed ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file on the root domain containing all the subsections and blog posts. We update the sitemap manually once a month, adding the new posts created on the blog. I want to automate this process, so I created a sitemap index with two sitemaps inside it: the old sitemap without the blog posts, and a new one created with the "Google XML Sitemap" WordPress plugin inside the /blog/ subdirectory. That is, in the sitemap_index.xml file I have: Domain.com/sitemap.xml (the old sitemap after removing the blog post URLs) and Domain.com/blog/sitemap.xml (the auto-updating sitemap created with the Google XML plugin). Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do this. I think the only thing I have to do is delete the old sitemap in Search Console and submit the new sitemap index - is that OK?
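For reference, a sitemap index along those lines could look roughly like this (Domain.com stands in for the real domain, as in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- old hand-maintained sitemap, with the blog post URLs removed -->
  <sitemap>
    <loc>https://Domain.com/sitemap.xml</loc>
  </sitemap>
  <!-- auto-updating sitemap generated by the WordPress plugin -->
  <sitemap>
    <loc>https://Domain.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```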
Technical SEO | ClaudioHeilborn
-
Can an AJAX framework (using HTML5 + pushstate) on your site impact your ranking?
Hello everybody, I am currently investigating a website which is rendered by an AJAX framework (AngularJS) using the HTML5 History API pushState method. Recently Google announced that they are able to execute JavaScript and can therefore see the content and links needed to discover all pages in the structure. However, it seems that Google doesn't run the JavaScript at ALL times (after some internal testing). So technically it is possible that it arrives on a page without seeing any content or links, while another time it arrives, runs the JavaScript, and reads/discovers the content and links generated by AJAX. Could the fact that Google can't always interpret or read the website correctly have a negative SEO impact (not on the indexation process, but on ranking)? We are aware that it is better to create a snapshot of the page, but in Google's announcement they state that the method currently used should be sufficient. Does anybody have any experience with this, AND what is the impact on the ranking process? Thanks!
Technical SEO | Netsociety
-
Should I use the Google disavow tool?
Hi, I'm a bit new to SEO and am looking for some guidance. Although there is no indication in Webmaster Tools that my site is being penalised for bad links, I have noticed that I have over 200 spam links for "Pay Day Loans" pointing to my site (this was due to a hack on my site several years ago). So my question is twofold: firstly, is it normal to have spammy links pointing to your site, and secondly, should I bother to do anything about it? I did some research into the Disavow tool in Webmaster Tools and wonder whether I should use it to block all these links. Thanks
Technical SEO | hotchilidamo
-
Sitemap issue
How can I create XML as well as HTML sitemaps for my website (both eCommerce and non-eCommerce)? Is there any script or tool that helps with making a perfect sitemap? Please suggest.
Technical SEO | Obbserv
-
Will Links to one Sub-Domain on a Site hurt a different Sub-Domain on the same site by affecting the Quality of the Root Domain?
Hi, I work for a SaaS company whose site uses two different subdomains: a public subdomain for our main site (which we want to rank in SERPs), and a secure subdomain that is the portal through which our customers access our services (which we don't want to rank). Recently I realized that by using our product, our customers are creating large amounts of low-quality links to our secure subdomain, and I'm concerned that this might affect our public subdomain by bringing down the overall authority of our root domain. Is this a legitimate concern? Has anyone ever worked through a similar situation? Any help is appreciated!
Technical SEO | ifbyphone
-
Is it best to create multiple xml sitemaps for different sections of a site?
I have a client with a very big site that includes a blog, videos, a photo gallery, etc. Is it best to create a separate XML file for each of these sections? It seems to me like that would be the best way to keep everything organized, or at least to separate the blog out from the main site. Does anybody have any tips or recommendations? I'm not finding any good information about this.
Technical SEO | MichaelWeisbaum
-
Sitemaps for Google
In Google Webmaster Central, if a URL is reported in your sitemap as 404 (Not Found), I'm assuming Google will automatically clean it up and that the next time we generate a sitemap, it won't include the 404 URL. Is this true? Do we need to comb through our sitemap files and remove the 404 pages Google finds, or will it "automagically" be cleaned up by Google's next crawl of our site?
Technical SEO | Prospector-Plastics