How do I fix my sitemap?
-
I have no idea how this happened, but our sitemap was http://www.kempruge.com/sitemap.xml; now it's http://www.kempruge.com/category/news/feed/ and Google won't index it. It 404s. Obviously I must have done something wrong, but I don't know what, and more importantly, I don't know where to find it in the WordPress backend to change it. I tried a 301 redirect, but GWT still 404'd it. Any ideas? It's been like this for a few weeks and I've just neglected it, so I can't simply reset the site without losing a lot of work.
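For reference, a 301 from the stray feed URL back to the real sitemap can be sketched in Apache's .htaccess (this assumes an Apache host with mod_alias enabled; it only masks the symptom until the plugin setting is fixed, and note it would also redirect the real category feed):

```apache
# Send requests for the bad sitemap location back to the real one
# (assumption: Apache with mod_alias; adjust paths to your install)
Redirect 301 /category/news/feed/ http://www.kempruge.com/sitemap.xml
```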
Thanks,
Ruben
-
Hey Paul,
I'm attaching a screenshot of the "Location of your sitemap file." I think these settings - which I didn't change - are correct. But, if I need to change something here, please let me know.
Thanks for the help! I really have no idea what I did to cause this problem, but I guess, as long as it gets fixed, that's all that matters.
Best,
Ruben
-
Before you dump the plugin, Ruben, check in the plugin's settings for whether the location has somehow inadvertently been changed.
Under the Settings link in your WP Dashboard sidebar, there should be an entry for XML Sitemap (the plugin's settings).
On the settings page, there should be a spot to define where the sitemap should be located. If there is, and that location has somehow defaulted to the category feed, you should be able to change the location designation back to the correct /sitemap.xml location. (In the XML Sitemap Generator plugin I suspect you're using, the settings are in the 5th section down titled Location of your sitemap file.)
Does that work?
Paul
-
It's a plugin, but it's not Yoast. We use All in One SEO, and then a separate plugin for the sitemap. However, that does help me. I could always just uninstall the sitemap plugin and redo it. That might work. Thank you all for helping me flesh that out. I do appreciate it.
Best,
Ruben
-
The best way to deal with sitemaps on WordPress is simply to use a plugin. I can recommend the Yoast SEO plugin, which has a built-in sitemap feature, or the "Google XML Sitemaps" plugin, which I've been using more recently. Both work fine, and both update automatically, so there's no need to update the sitemap by hand. Yoast creates the sitemap at a link like yourwebsite.com/sitemap_index.xml, and the other plugin at yourwebsite.com/sitemap.xml. Once your sitemap has been created, both plugins show a link to it; copy that link and submit it to GWT.
I don't really know how your sitemap got messed up, but I hope the info above makes your life a little easier down the road. If you decide to use one of these plugins, you won't have to worry about this problem again.
Good luck.
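Before submitting either plugin's link to GWT, you can sanity-check that the file actually parses and lists the URLs you expect; a minimal Python sketch (the sample XML is illustrative, not your real sitemap):

```python
import xml.etree.ElementTree as ET

# Sitemap files live in this XML namespace
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Parse sitemap XML and return the <loc> values it lists."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>"""

print(sitemap_urls(sample))
# → ['http://www.example.com/', 'http://www.example.com/about/']
```

If parsing raises an error or the list is empty, the file isn't a valid sitemap and GWT will reject it too.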
-
It looks like your sitemap.xml has been moved from that location or deleted. Did you create it manually, or are you using a plugin to create the sitemap.xml? If so, which one?
-
Hi Guys,
I personally believe in updating a sitemap often, so that any changes made to your website are reflected in the sitemap Google sees. Keep it up to date and you will see Google's spiders hitting your site for the new data.
Use a sitemap generator like Screaming Frog, which can be downloaded for free here: http://www.screamingfrog.co.uk/seo-spider/
Generate your new sitemap.xml file (something you could do on a regular basis), then upload it to a location of your choosing. Now list the new sitemap.xml file in Google Webmaster Tools and you're off and running again. A very simple procedure.
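If you'd rather script the generation step than run a desktop crawler, the file format itself is simple enough to emit directly; a minimal sketch (the URL list is illustrative, you would feed in your real pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Emit a minimal sitemap.xml document for the given URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u  # each page gets one <loc>
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/",
                     "http://www.example.com/contact/"])
print(xml)
```

The spec also allows optional `<lastmod>`, `<changefreq>` and `<priority>` children per `<url>`, which a fuller version could populate.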
Related Questions
-
Can't generate a sitemap with all my pages
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried pick up all of my 70,000 HTML pages. I have found that the one at check-domains.com crawls all my pages, but when it writes the XML file most of them are gone, seemingly at random. I have used this same site before and it worked without a problem. Can anyone help me understand why this is, or point me to a utility that will map all of the pages? Kindly, Greg
Intermediate & Advanced SEO | Banknotes
Is it worth creating an Image Sitemap?
We've just installed the server-side script 'XML Sitemaps' on our eCommerce site. The script gives us the option of (easily) creating an image sitemap, but I'm debating whether there is any reason for us to do so. We sell printer cartridges, so all the images will be pretty dry (a brand-name printer cartridge in front of a box being a favourite). I can't see potential customers searching for an image as a route into the site, and Google appears to be picking up our images of its own accord, so I wonder if we'd just be crawling the site and submitting this information for no real reason. From a quality perspective, would Google give us any kind of kudos for providing an image sitemap? Would it potentially increase their crawl frequency or, indeed, reduce the load on our servers, since they wouldn't have to crawl for all the images themselves? I can't stress how little of a hardship it would be to create one of these automatically each day, but am wondering if, like meta keywords, there is any benefit to doing so.
Intermediate & Advanced SEO | ChrisHolgate
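For reference, an image sitemap just nests image tags inside each `<url>` entry under Google's image extension namespace; a minimal sketch with made-up URLs:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/cartridge-123/</loc>
    <image:image>
      <image:loc>http://www.example.com/images/cartridge-123.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```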
Best Way To Go About Fixing "HTML Improvements"
So I have a site where I was creating dynamic pages for a while, and some of them accidentally ended up with lots of similar meta tags and titles. I then changed up my site but left those duplicate tags in place for a while, not knowing what had happened. Recently I began my SEO campaign once again and noticed that these errors were there, so I did the following: removed the pages; removed the directories that contained these dynamic pages with the remove tool in Google Webmaster Tools; blocked Google from scanning those pages with robots.txt. I have verified that the robots.txt works, and the pages are no longer in Google search... however, the errors still show up in the HTML Improvements section after a week (it has updated a few times). So I decided to remove the robots.txt rules and instead add 301 redirects. Does anyone have any experience with this, and am I going about it the right way? Any additional info is greatly appreciated. Thanks.
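The duplicate-title report GWT produces boils down to grouping pages by title and flagging any title shared by more than one URL; a minimal sketch of that check, with made-up page data:

```python
from collections import defaultdict

def duplicate_titles(pages: dict[str, str]) -> dict[str, list[str]]:
    """Map each title used by more than one URL to the URLs sharing it.

    `pages` maps URL -> <title> text, e.g. as scraped from your own site.
    """
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    # Keep only titles that appear on two or more URLs
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/widgets?id=1": "Widgets | Example",
    "/widgets?id=2": "Widgets | Example",
    "/about": "About | Example",
}
print(duplicate_titles(pages))
# → {'Widgets | Example': ['/widgets?id=1', '/widgets?id=2']}
```

Running something like this over your own crawl lets you confirm the duplicates are really gone without waiting for GWT's report to refresh.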
Intermediate & Advanced SEO | tarafaraz
Should pages with rel="canonical" be put in a sitemap?
I am working on an ecommerce site and I am going to add different views to the category pages. The views will all have different URLs, so I would like to add the rel="canonical" tag to them. Should I still add these pages to the sitemap?
Intermediate & Advanced SEO | EcommerceSite
Penalised for duplicate content, time to fix?
Ok, I accept this one is my fault, but I'm wondering about the time scale for a fix... I have a website, and I put an affiliate store on it using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate-content pages, and over a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some three months later my site still will not rank for the keywords it ranked for previously. It will not even rank if I type in the site's name (Bright Tights). I have searched using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a drop-x-places penalty by Google for the duplicate content. What is the easiest way around this? I have had no warning about bad links or the like. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content? The goal of having the duplicate-content store on the site was to be able to rank the store's category pages, which had unique content on them, so I could foresee no problems with that. Like Amazon et al., the categories would have lists of products (amongst other content) and you would click through to the individual product description, the duplicate page. Thanks for reading.
Intermediate & Advanced SEO | Grumpy_Carl
How to fix issues regarding URL parameters?
Today I was reading Google's help article on URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687 I have come to learn that Google gives value to URLs with parameters that change or determine the content of a page. There are too many pages on my website with similar values for name, price and number of products, but I have restricted all such pages in robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
Now I am confused. Which is the best solution to get the maximum SEO benefit?
Intermediate & Advanced SEO | CommercePundit
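Note that the `*` wildcard in those Disallow rules is a Google extension rather than part of the original robots.txt spec, so not every checker honors it. A minimal sketch of Google-style pattern matching you could use to verify which URLs the rules actually catch (this is my own helper for illustration, not a library API):

```python
import re
from urllib.parse import urlsplit

def robots_rule_matches(pattern: str, url: str) -> bool:
    """Match a URL's path+query against a Google-style robots.txt pattern.

    '*' matches any run of characters; a trailing '$' anchors the end.
    """
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Translate the robots pattern into a regex: '*' -> '.*', escape the rest
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex + ("$" if anchored else ""), target) is not None

print(robots_rule_matches(
    "/*?dir=", "http://www.vistastores.com/table-lamps?dir=asc&order=name"))
# → True
```

This makes it easy to confirm that the filtered views are blocked while the plain category URL (no parameters) stays crawlable.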
What is the practical influence of priority in a sitemap?
I have a directory site with thousands of entries. Will there be any benefit to be gained from adjusting the various entries' priorities in the sitemap? I was thinking I might give more priority to entries that have upgraded their directory listing. Thanks.
Intermediate & Advanced SEO | flow_seo
Where to get a video sitemap creator for Wordpress?
I have a website that is nearly all videos and is based on WordPress. Does anyone know of a way to create a video sitemap that updates automatically as I write a new post? The video files and other data are all stored in separate meta-post locations, so it needs to be able to grab them. Any help is appreciated.
Intermediate & Advanced SEO | DojoGuy