Generating a sitemap for websites linked to a WordPress blog
-
Greetings,
I'm looking for a way to generate a sitemap that will include the static pages in my home directory as well as my WordPress blog. The site I'm trying to build this for is in a temporary folder and can be accessed at http://www.lifewaves.com/Website 3.0
I plan on moving the contents of this folder to the root directory of lifewaves.com whenever we are go for launch.
What I'm wondering is: is there a way to build a sitemap or sitemap index that points to the static pages of my site as well as the WordPress blog, while taking advantage of the built-in WordPress hierarchy? If so, what's an easy way to do this? I have generated a sitemap using Yoast, but I can't seem to find any XML files within the WordPress folder. The plugin has a button that should take me to the sitemap index, but it just brings me to the homepage of my blog.
Can I build a sitemap index that points to a sitemap for the static pages as well as the sitemap generated by Yoast? Thank you in advance for your help!!
P.S. I'm kind of a noob.
-
It doesn't matter; sitemaps are designed to make sure all your pages are crawled and to improve the chances of indexing.
Happy link building!
-
Thanks, Chris. I was able to manually add the static pages in my root folder to the sitemap built by the Google XML Sitemaps generator. This works out great, since I can control their priority and other settings, including the location of the sitemap file itself.
One additional question: within the sitemap, does the order of pages matter? In other words, if I want to drive more traffic to my static pages (not WordPress), should they come first in the sitemap, or does the priority setting take care of this?
Thanks a million for all your help!!
-
Having more than one sitemap is not an issue for SEO. If anything, it makes sure the search spiders can definitely find your content.
The Google XML Sitemaps plugin normally creates a file in the WordPress folder, e.g.
yourdomain.com/wp/sitemap.xml, but you can define your preferred location.
When you publish a new blog post or page, the sitemap will update automatically.
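For reference, a sitemap index that ties the two sitemaps together is just a small XML file listing each sitemap's URL. A minimal sketch (the domain and file names below are placeholders; substitute your own paths):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hand-maintained sitemap for the static pages -->
  <sitemap>
    <loc>http://www.example.com/sitemap-static.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
  <!-- Sitemap generated by the WordPress plugin -->
  <sitemap>
    <loc>http://www.example.com/wp/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

You can then point the search engines at the index file alone, either by submitting it in Webmaster Tools or with a `Sitemap:` line in robots.txt.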
-
Thanks, Chris. Does it hurt my site in terms of SEO to have two separate sitemaps? I know there's no definitive answer for this, but I'd like to get a grip on the pros and cons.
Also, my brain was heading down this path with the idea that I could tie the two sitemaps together using a sitemap index that lists both of them. Do you know the default location of the sitemap generated by Google XML Sitemaps, and whether the file name changes when it is updated? I believe that's all the information I would need to put both in a sitemap index.
Thanks again!!
-
I use the Google XML Sitemaps plugin to automatically generate the sitemap for my blog. You can have more than one sitemap for a site, so you could create a second sitemap for the static pages and manage it manually, since static pages are less likely to change. If the static pages are part of your WordPress install, the plugin will pick them up too.
-
I just found this article by Yoast himself. It sounds like he ran into a few of the same hurdles you have and created a PHP script to help build a sitemap: http://yoast.com/xml-sitemap-php-script/
Hope that helps.
The only other option I would recommend is to build the sitemap yourself. I recently purchased http://www.xml-sitemaps.com/ based on some of the other Q&A threads on SEOmoz and used it for a large ecommerce site. It was a good fix for my problems. Maybe a help, maybe not.
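If you'd rather script it yourself, the idea behind Yoast's PHP script can be sketched in a few lines of Python. This is purely illustrative, not his actual script: it assumes your static pages are `.html` files under one folder, and the folder path, base URL, and priority value are all placeholders you'd adjust for your own site.

```python
import os
from xml.sax.saxutils import escape


def build_sitemap(root_dir, base_url, priority="0.8"):
    """Walk root_dir and emit a sitemap <url> entry for every .html file.

    root_dir, base_url, and priority are assumptions for this sketch;
    adjust them (and the file-extension filter) to match your site.
    """
    entries = []
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in sorted(filenames):
            if not name.endswith(".html"):
                continue
            # Turn the file's path on disk into a public URL.
            rel = os.path.relpath(os.path.join(dirpath, name), root_dir)
            loc = base_url.rstrip("/") + "/" + rel.replace(os.sep, "/")
            entries.append(
                "  <url>\n"
                "    <loc>%s</loc>\n"
                "    <priority>%s</priority>\n"
                "  </url>" % (escape(loc), priority)
            )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )


if __name__ == "__main__":
    # Example: generate a sitemap for the current folder.
    print(build_sitemap(".", "http://www.example.com"))
```

You could run this from cron (or by hand after editing pages), save the output as sitemap-static.xml, and list that file alongside the plugin's sitemap in a sitemap index.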