Sitemap use for very large forum-based community site
-
I work on a very large site with two main types of content: static landing pages for products, and user-created forums and blogs under each product. The site has maybe 500k-1 million pages. We do not have a sitemap at this time.
Currently our SEO discoverability is generally good; Google is indexing new forum threads within roughly 1-5 days. However, some of the "static" landing pages for our smaller, less visited products do not perform well in search.
The question is: could our SEO be improved by creating a sitemap, and if so, how should it be implemented? I see two ways to go about it:
- Option 1: The sitemap includes the "static" product category landing pages only - i.e., the product home pages, the forum landing pages, and the blog list pages. This would probably end up being 100-200 URLs.
- Option 2: The sitemap contains the above but is also dynamically updated with new threads and blog posts.
Option 2 seems like it would make the sitemap unmanageably long (hundreds of thousands of forum URLs). Would a crawler even parse something that size? And with Option 1, could adding a sitemap cause our organically ranked pages to shift because Google re-prioritizes the pages listed in it?
There isn't a lot of information out there on this topic, so I'd appreciate any input. Thanks in advance. -
Agreed, you'll likely want to go with option #2. Dynamic sitemaps are a must when you're dealing with large sites like this; we advise them for all of our clients with larger sites. If your forum content is important for search, it's definitely worth including in the sitemap, since that content likely changes often and tends to sit deeper in the site architecture.
In general, I'd think about sitemaps from a discoverability perspective rather than a ranking one. The primary goal is to give Googlebot an avenue to crawl your site's content regardless of the internal linking structure.
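To make that concrete: a sitemap is just an XML file listing URLs, optionally with a last-modified date so Googlebot can prioritize fresh threads. A minimal sketch of generating one looks like this (the URLs and the `build_sitemap` helper are illustrative, not from the site in question):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """Build a minimal sitemap XML string from (url, lastmod) pairs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in entries:
        lines.append('  <url>')
        lines.append(f'    <loc>{escape(url)}</loc>')        # URL, XML-escaped
        lines.append(f'    <lastmod>{lastmod.isoformat()}</lastmod>')
        lines.append('  </url>')
    lines.append('</urlset>')
    return "\n".join(lines)

# Example: two hypothetical forum-thread URLs
xml = build_sitemap([
    ("https://example.com/forum/thread-123", date(2024, 1, 15)),
    ("https://example.com/forum/thread-124", date(2024, 1, 16)),
])
print(xml)
```

A dynamic sitemap is just this, regenerated on a schedule from the forum database's newest threads.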
-
Hi
Go with option 2; there is no scaling issue here. I have worked with and for sites submitting many times that number of sitemaps and pages, in some cases up to 100M pages. In every case, Google was fine crawling and processing the data. As long as you follow the guidelines (max 50K URLs per sitemap file), you're fine: you're just providing another file, which usually doesn't exceed the 50MB limit (depending on whether you also add images to the sitemap). If you have an engineering team build the right infrastructure, you can easily deal with thousands of these files and regenerate them automatically every day or week.
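Under those guidelines, the usual pattern is a sitemap index: split the full URL list into files of at most 50K URLs each, then list those files in one index file that you submit to Google. A rough sketch, with made-up file names for illustration:

```python
def chunk_urls(urls, max_per_file=50000):
    """Split a URL list into sitemap-sized chunks (50K is the per-file limit)."""
    return [urls[i:i + max_per_file] for i in range(0, len(urls), max_per_file)]

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index XML string pointing at the chunked sitemap files."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for sm in sitemap_urls:
        lines.append(f'  <sitemap><loc>{sm}</loc></sitemap>')
    lines.append('</sitemapindex>')
    return "\n".join(lines)

# 100K thread URLs would yield two 50K-URL sitemap files plus one index:
all_urls = [f"https://example.com/forum/thread-{i}" for i in range(100_000)]
chunks = chunk_urls(all_urls)
index = build_sitemap_index(
    f"https://example.com/sitemaps/threads-{n}.xml" for n in range(len(chunks))
)
print(len(chunks))  # 2
```

At 500K-1M pages that's only 10-20 sitemap files, well within what Google handles routinely.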
My main focus on big sites is also to streamline their sitemaps: one sitemap with just the last 50,000 pages added, and another with the last 50,000 pages updated. This way you can also monitor the indexation level of these pages. If you combine this with log file analysis, for example, you can say: we added 50K pages, and in the last few days Google crawled X percent of them.
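That monitoring step boils down to a set intersection between the URLs you just added to a sitemap and the URLs Googlebot requested according to your access logs. A hypothetical sketch, assuming the log-parsing side already yields a list of crawled URL paths:

```python
def crawl_coverage(submitted_urls, crawled_urls):
    """Percentage of newly submitted sitemap URLs that Googlebot has crawled,
    based on URLs extracted from server access logs."""
    submitted = set(submitted_urls)
    if not submitted:
        return 0.0
    crawled = submitted & set(crawled_urls)   # only count submitted pages
    return 100.0 * len(crawled) / len(submitted)

# Example: 4 new pages submitted; logs show Googlebot fetched 3 of them
submitted = ["/t/1", "/t/2", "/t/3", "/t/4"]
crawled = ["/t/1", "/t/3", "/t/4", "/old-page"]
print(crawl_coverage(submitted, crawled))  # 75.0
```

Run daily against the "last 50K added" sitemap and the gap between submitted and crawled tells you how well Google is keeping up.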
Hope this gives you some extra insights.
Martijn.