Sitemap For Static Content And Blog
-
We'll be uploading a sitemap to Google Search Console for a new site. We have ~70-80 static pages that don't really change much (a few may change as we modify a couple of pages over the course of the year). But we have a separate blog on the site to which we'll be adding content frequently.
How can I set up the sitemap to make sure that "future" blog posts will get picked up and indexed?
I used a sitemap generator and it picked up the first blog post that's on the site, but I'm wondering what happens with future ones. I don't want to resubmit a new sitemap each time we publish a new blog post.
-
Hi,
I'd recommend using a sitemap index. It lets you reference multiple sitemaps in GSC, so you could have one for your static pages and another that regenerates automatically for blog content.
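For reference, a sitemap index is just a small XML file that points at your other sitemap files. A minimal sketch (the filenames and domain are placeholders) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index file in Search Console; when the blog sitemap regenerates with new posts, crawlers pick up the new URLs on their next fetch without any resubmission.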
-
If your blog runs on something like WordPress, you can have multiple sitemaps: one for your blog and one for your static site. The blog sitemap updates automatically, whereas you can update the static-site sitemap manually.
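If the blog isn't on a CMS that generates sitemaps for you, the blog sitemap can be rebuilt by a small script on each publish. A minimal sketch in Python (the URLs and dates below are made-up placeholders, not from the site in question):

```python
# Rebuild the blog sitemap from the current list of posts, so new
# posts are included automatically without resubmitting anything.
from xml.sax.saxutils import escape

def build_sitemap(posts):
    """posts: list of (url, lastmod 'YYYY-MM-DD') tuples."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n  </url>"
        for url, lastmod in posts
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Placeholder post list; in practice this would come from the blog's database.
posts = [
    ("https://example.com/blog/first-post/", "2024-01-15"),
    ("https://example.com/blog/second-post/", "2024-02-03"),
]
print(build_sitemap(posts))
```

Writing this output to the file referenced by your sitemap index (or submitted separately in GSC) means each new post shows up the next time the file is fetched.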
Related Questions
-
Some URLs in the sitemap not indexed
Our company site has hundreds of thousands of pages. Yet no matter how big or small the total page count, I have found that the "URLs Indexed" in GWMT has never matched "URLS in Sitemap". When we were small and now that we have a LOT more pages, there is always a discrepancy of ~10% or so missing from the index. It's difficult to know which pages are not indexed, but I have found some that I can verify are in the Sitemap.xml file but not at all in the index. When I go to GWMT I can "Fetch and Render" missing pages fine - it's not as though it's blocked or inaccessible. Any ideas on why this is? Is this type of discrepancy typical?
Technical SEO | Mase0
-
Multilingual Blog Structure
Hi, I have a domain in 20 languages. I want to integrate a WordPress blog (in subfolders) in the 3 most important languages, like EN-ES-FR (actually they will be 3 independent blogs), and I want to know which structure is the best one.
OPTION 1: domain/en/blog/post1, domain/es/blog/post1, domain/fr/blog/post1
OPTION 2: domain/blog_en/post1, domain/blog_es/post1, domain/blog_fr/post1
Last question: for the rest of the 17 languages of my domain, can I put a link to the English blog, or is that not recommended because too many pages would be linking to the blog? Thank you
Technical SEO | andromedical0
-
Duplicate Content Issues
We have a "?src=" parameter in some URLs which are treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca0
-
Are Tags in Blogs good?
Hi, Is adding tags to blogs a good idea? Does it help with SEO at all? For example, blog/?tag=/Invoicing-Software. Will that help us get ranked for "Invoicing Software"? Regards, Andrew
Technical SEO | Studio330
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do/don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
Technical SEO | ICM0
-
Forget Duplicate Content, What to do With Very Similar Content?
All, I operate a WordPress blog that focuses on one specific area of the law. Our contributors are attorneys from across the country who write about our niche topic. I've done away with syndicated posts, but we still have numerous articles addressing many of the same issues/topics. In some cases 15 posts might address the same issue. The content isn't duplicate, but it is very similar, outlining the same rules of law etc. An SEO I trust has told me I should 301 some of the similar posts to one authoritative post on the subject. Is this a good idea? Would I be better served implementing canonical tags pointing to the "best of breed" post on each subject? Or would I be better off being grateful that I receive original content on my niche topic and not doing anything? Would really appreciate some feedback. John
Technical SEO | JSOC0
-
Dismal content rankings
Hi, I realize this is a very broad question, but I am going to ask it anyways in the hopes that someone might have some insight. I have created a great deal of unique content for the site http://www.healthchoices.ca. You can select a video category from the top dropdown, then click on a video beside the provider box to see. The articles I've written are accessible by the View Article tab under each video. I have worked hard to make the articles informative and they are all unique with quotes from expert physicians. Even for strange health conditions that don't have a lot of competition - I don't see us appearing. Our search results are quite dismal for the amount of content we have. I guess I'm checking to see if anyone is able to point me in the right direction at all? If anything jumps out... Thanks, Erin
Technical SEO | erinhealthchoices0
-
Duplicate content
Greetings! I have inherited a problem that I am not sure how to fix. The website I am working on had a 302 redirect from its original home URL (with all the link juice) to a newly designed page (with no real link juice). When the 302 redirect was removed, a duplicate content problem remained, since the new page had already been indexed by Google. What is the best way to handle duplicate content? Thanks!
Technical SEO | shedontdiet0