Sitemap For Static Content And Blog
-
We'll be uploading a sitemap to Google Search Console for a new site. We have ~70-80 static pages that don't really change much (a few may change as we modify a couple of pages over the course of the year). But we have a separate blog on the site to which we will be adding content frequently.
How can I set up the sitemap to make sure that future blog posts will get picked up and indexed?
I used a sitemap generator and it picked up the first blog post that's on the site, but I'm wondering what happens with future ones. I don't want to resubmit a new sitemap each time we publish a new blog post.
-
Hi,
I'd recommend using a sitemap index. It allows you to register multiple sitemaps in GSC, so you could have one for your static pages and another that is generated automatically for your blog content.
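For reference, a sitemap index is itself just a small XML file that points at the child sitemaps. Here is a minimal sketch in Python; the domain and sitemap filenames are placeholders, not anything GSC requires:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap locations -- substitute your own URLs.
SITEMAPS = [
    "https://www.example.com/sitemap-static.xml",  # ~70-80 static pages, rarely changes
    "https://www.example.com/sitemap-blog.xml",    # regenerated whenever a post is published
]

def build_sitemap_index(sitemap_urls):
    """Return a sitemap index document referencing each child sitemap."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("sitemapindex", xmlns=ns)
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

print(build_sitemap_index(SITEMAPS))
```

You submit the index once; as long as the blog sitemap regenerates itself when a post is published, new URLs get picked up without touching GSC again.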
-
If you are using something like WordPress for your blog, you can have multiple sitemaps: one for your blog and one for your static site. The blog sitemap would update automatically, whereas you can update the static one manually.
Related Questions
-
Panda Cleanup - Removing Old Blog Posts, Let Them 404 or 301 to Main Blog Page?
tl;dr: Removing old blog posts that may be affected by Panda, should we let them 404 or 301 to the blog?

We have been managing a corporate blog since 2011. The content is OK, but we've recently hired a new blogger who is doing an outstanding job, creating content that is very useful to site visitors and is just on a higher level than what we've had previously. The old posts mostly have no comments and don't get much user engagement. I know Google recommends creating great new content rather than removing old content due to Panda concerns, but I'm confident we're doing the former and I still want to purge the old stuff that's not doing anyone any good.

So let's just pretend we're being dinged by Panda for having a large amount of content that doesn't get much user engagement (not sure if that's actually the case; rankings remain good, though we have been passed on a couple of key rankings recently). I've gone through Analytics and noted any blog posts that have generated at least 1 lead or had at least 20 unique visits all time. I think that's a pretty low barrier, and everything else really can be safely removed.

So for the remaining posts (I'm guessing there are hundreds of them, but I haven't compiled the specific list yet), should we just let them 404 or do we 301 redirect them to the main blog page? The underlying question is: if our primary purpose is cleaning things up for Panda specifically, does placing a 301 make sense, or would Google see those "low quality" pages being redirected to a new place and pass on some of that "low quality" signal to the new page? Is it better for that content just to go away completely (404)?
Technical SEO | eBoost-Consulting
-
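The keep-or-purge rule described in the question (at least 1 lead or 20 unique visits, all time) is easy to script once the Analytics export is in hand. A minimal sketch; the post records and field names here are hypothetical:

```python
# Partition old posts into keep vs purge using the thresholds from the
# question: keep anything with at least 1 lead OR 20 unique visits all time.
# The post dicts and field names are hypothetical placeholders.
KEEP_MIN_LEADS = 1
KEEP_MIN_VISITS = 20

def partition_posts(posts):
    """Split posts into (keep, purge) URL lists by engagement threshold."""
    keep, purge = [], []
    for post in posts:
        if post["leads"] >= KEEP_MIN_LEADS or post["unique_visits"] >= KEEP_MIN_VISITS:
            keep.append(post["url"])
        else:
            purge.append(post["url"])
    return keep, purge

posts = [
    {"url": "/blog/good-post", "leads": 2, "unique_visits": 5},
    {"url": "/blog/ok-post", "leads": 0, "unique_visits": 40},
    {"url": "/blog/thin-post", "leads": 0, "unique_visits": 3},
]
keep, purge = partition_posts(posts)
print(keep)   # ['/blog/good-post', '/blog/ok-post']
print(purge)  # ['/blog/thin-post']
```

The purge list is then the input to whichever response you settle on (404/410 or 301).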
I Lost Index Status of My Sitemap
We have a simple WordPress website for our law firm, with an English version and a Spanish version. I have created a sitemap (with appropriate language markup in the XML file) and submitted it to Webmaster Tools. Google crawled the site and accepted the sitemap last week: 24/24 pages indexed, 12 English and 12 Spanish. This week, Google decided to remove one of the pages from the index, showing 23/24 pages indexed. So, my questions are as follows:

1) How can I find out which page was dropped from the index?
2) If the pages are the same content, but a different language, why did only one version of the page get dropped while the other version remains?
3) Why did the Big G drop one of my pages from the index?
4) How can I reindex the dropped page?

I know this is a fairly basic issue, and I'm embarrassed for asking, but I sure do appreciate the help.
Technical SEO | RLG
-
Duplicate content problem
Hi there, I have a couple of related questions about the crawl report finding duplicate content:

We have a number of pages that feature mostly media (just a picture or just a slideshow) with very little text. These pages are rarely viewed, and they are identified as duplicate content even though the pages are indeed unique to the user. Does anyone have an opinion about whether or not we'd be better off just removing them, since we do not have the time to add enough text at this point to make them unique to the bots?

The other question: we have a redirect for any 404 on our site that follows the pattern immigroup.com/news/*; the redirect merely sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading this as duplicate content as well. I'm not sure why that is, but is there anything we can do about this? These pages do not exist; they just come from someone typing in the wrong URL or from someone clicking on a bad link. But we want the traffic; after all, the users are landing on a page that has a lot of content.

Any help would be great! Thanks very much! George
Technical SEO | canadageorge
-
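One thing worth checking on a /news/* catch-all like the one described above: if it serves the /news content with a 200 status at every bad URL instead of issuing a real 301, crawlers see the same content at many addresses and flag it as duplicate. A toy model of the two behaviours; the handler, paths, and flag are hypothetical, not the site's actual code:

```python
# Two ways to implement the catch-all from the question. Serving /news
# content with a 200 at every bad URL creates "soft" duplicates; an
# explicit 301 consolidates signals onto /news instead.
NEWS_PAGES = {"/news", "/news/real-article"}  # hypothetical existing pages

def handle_request(path, use_redirect=True):
    """Return (status_code, payload) for a requested path."""
    if path in NEWS_PAGES:
        return 200, path                 # page exists: serve it normally
    if path.startswith("/news/"):
        if use_redirect:
            return 301, "/news"          # real redirect: no duplicate URL
        return 200, "/news content"      # same content at a bad URL: duplicate
    return 404, None

print(handle_request("/news/typo-url"))                      # (301, '/news')
print(handle_request("/news/typo-url", use_redirect=False))  # (200, '/news content')
```

If the crawl report shows the bad URLs returning 200, that would explain the duplicate-content flags.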
Why am I seeing errors for duplicate content for categories and tags on my WordPress blog?
When I look under "Crawl Diagnostics" I see I have 12 errors for duplicate content, and they are all from tags and categories. I am assuming that search engines are reading the content in the tags and categories as duplicate. Should I set my categories to "noindex"?
Technical SEO | brytewire
-
Sitemaps
Hi, I have a question about using sitemaps.

Our site is a news site, and we have thousands of articles in every section. For example, we have a section called Technology, with articles going back to 1999! So the question is: how can I make the Google robot index them?

Months ago, when you entered the Technology section there was a paginator without limits, but we noticed that this query consumed a lot of CPU every time it was clicked. So we decided to limit it to 10 pages of 15 records each. Now it works great, BUT I can see in Google Webmaster Tools that our index count decreased dramatically. The reason is simple: the bot has no way to reach older technology articles because we limit the query to 150 records total.

So, the question is: how can I fix this? Options:

1) Leave the query without limits
2) Create a new button, "All tech news", with a different query without a limit but paginated with (for example) 200 records per page
3) Create a sitemap that contains all the tech articles

Any ideas? Really, thanks.
Technical SEO | informatica810
-
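If option 3 is chosen, note that the sitemap protocol caps each file at 50,000 URLs, so an archive of thousands of articles is usually split into several sitemap files tied together by an index. A rough sketch, assuming the article URLs and file paths shown (they are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # sitemap protocol limit per file

def build_sitemaps(urls, base="https://www.example.com/sitemaps"):
    """Split a large URL list into sitemap files plus an index.

    `base` is a hypothetical path the generated files would be served from.
    Returns (list_of_sitemap_documents, index_document).
    """
    files = []
    for i in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=NS)
        for u in urls[i:i + MAX_URLS]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        files.append(ET.tostring(urlset, encoding="unicode"))
    index = ET.Element("sitemapindex", xmlns=NS)
    for n in range(len(files)):
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = (
            f"{base}/sitemap-tech-{n + 1}.xml"
        )
    return files, ET.tostring(index, encoding="unicode")

# e.g. 120,000 article URLs -> 3 sitemap files + 1 index
articles = [f"https://www.example.com/tech/article-{i}" for i in range(120_000)]
files, index = build_sitemaps(articles)
print(len(files))  # 3
```

This lets the bot reach every old article without touching the CPU-heavy paginator at all.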
Where to put content on the page? - technical
The new algorithm update says any images at the top of the page negatively affect user experience if they are adverts. How does Google know whether something is an advert or a relevant banner?

When trying to put text as far up the page as possible, is it OK to make it appear higher in the code but render further down using CSS? Or does Google not read the code from top to bottom when working this out, but rather go by how the page renders?

Any advice much appreciated.
Technical SEO | pauledwards
-
WP Blog Errors
My WP blog is appending my email address to URLs during the crawl, and I am getting 200+ errors similar to the following: http://www.cisaz.com/blog/2010/10-reasons-why-microsofts-internet-explorer-dominance-is-ending/tony@cisaz.net ("tony@cisaz.net" is added to every post). Any ideas how I can fix it? I am using the Yoast plugin. Thanks guys!
Technical SEO | smstv
-
Sitemaps - Format Issue
Hi, I have a little issue with a client site whose programmer seems unwilling to change things he has been doing for a long time.

He has had this dynamic site set up for a few years, active in Google Webmaster Tools and others, but he is not happy with the traffic it is getting. When I looked at Webmaster Tools I saw that he has a sitemap registered, but it is /sitemap.php. When I said that we should be offering the SEs /sitemap.xml, his response was that sitemap.php checks the site every day and generates /sitemap.xml, but there is no /sitemap.xml registered in Webmaster Tools.

My gut is telling me that he should just register /sitemap.xml in Webmaster Tools, but it is a hard sell 🙂 Does anyone have any definitive experience of people doing this before, and whether it is an issue? My feeling is that it doesn't need to be rocket science...

Any input appreciated, Sha
Technical SEO | ShaMenz
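Whichever file ends up registered, it is worth sanity-checking the generated /sitemap.xml before submitting it, since a malformed file can sit unnoticed for a while. A small, hypothetical validation sketch:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Sanity-check a generated sitemap before registering it in GSC.

    Returns the number of <url> entries; raises ValueError if the XML
    has the wrong root element or an out-of-range URL count.
    """
    root = ET.fromstring(xml_text)  # raises ParseError on malformed XML
    if root.tag != NS + "urlset":
        raise ValueError(f"unexpected root element: {root.tag}")
    urls = root.findall(NS + "url")
    if not 0 < len(urls) <= 50_000:
        raise ValueError(f"{len(urls)} URLs (protocol limit is 50,000 per file)")
    return len(urls)

sample = (
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    "<url><loc>https://www.example.com/</loc></url>"
    "</urlset>"
)
print(check_sitemap(sample))  # 1
```

Running something like this against the output of sitemap.php each day would also confirm whether the regeneration is actually working.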