Status errors generated from XML sitemap
-
I just ran a crawl test on our site and I'm seeing a lot of 404 errors that are referred from the XML sitemap. Does anyone know how to fix it?
-
If it is automatically generated, then you should bring this up with your dev team and have them investigate. One of the biggest issues I see is sitemaps that are auto-generated daily.
Good luck!
-
Thanks for the tips. The idea of different sitemaps is a good one... I know our file is automatically generated, so that's why I'm a bit confused.
-
If you have FTP access to the site, you should be able to download the XML sitemap and edit it in a text editor.
Since you have several bad links, I would probably delete them all and create a new sitemap based on your current website. (And if it is a large site, you can use multiple XML sitemaps to keep everything organized and easier for you to monitor and track.)
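For the multiple-sitemap approach, the sitemaps.org protocol defines a sitemap index file that points at the individual sitemaps. A minimal sketch (the file names and dates below are placeholders, not anyone's actual files):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2011-04-23</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file once, and each section of the site gets its own sitemap you can track separately.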
There are some great tools out there that will automatically create your sitemaps for you.
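To find which sitemap entries are dead before regenerating the file, one option is a small script that parses the sitemap and collects its URLs so each can be requested and its status checked. A minimal sketch (the sample sitemap below is illustrative, not the poster's actual file):

```python
# Hypothetical sketch: parse an XML sitemap and list its URLs so each can be
# checked for 404s before regenerating the file. The sample sitemap is an
# illustration, not the poster's actual file.
import xml.etree.ElementTree as ET

# The sitemaps.org namespace used by standard urlset files.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/old-page</loc></url>
</urlset>"""

def extract_urls(sitemap_xml):
    """Return the <loc> values from a urlset sitemap."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

urls = extract_urls(SAMPLE_SITEMAP)
print(urls)  # each of these would then be requested and its HTTP status checked
```

From there you would fetch each URL (e.g. with a HEAD request) and drop any entry that returns a 404.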
-
I guess I was asking how to fix the sitemap. I'm a newbie at this, so it's trial by fire... Thanks for the Webmaster Tools reco... I'll check it out.
-
I must not be understanding your question, since my answer would be: fix your XML sitemap. Google Webmaster Tools has a tool that allows you to validate your sitemap.
Related Questions
-
Big changes in site titles
So as I pore through the diagnostics data for over 100,000 pages of my site, I see thousands of page titles that "could" be changed. Could this cause some lost traffic for a while due to the big changes?
Moz Pro | dvduval
-
Open Site Explorer WAY Off in Terms of Link Profiles?
Hey, one of our websites is www.inspireeducation.net.au. I have noticed that although tools like Raventools capture our links well, Open Site Explorer is doing a terrible job... For example, the following page >>> http://www.inspireeducation.net.au/courses/training-and-assessment-courses/certificate-iv-in-training-and-assessment/ has many more than 8 root domains linking to it, yet Open Site Explorer only reports 8. We are finding the same problem for almost any page we review through Open Site Explorer. Does anyone have any idea why the numbers would be so far out? The new links are NOT fresh links: many are well established (they have been there for years), and even the newer ones have been there for more than 60 days. I find the same thing when reviewing competitor sites. Is Open Site Explorer working properly at all at the moment?
Moz Pro | love-seo-goodness
-
How to track sub-directories as separate sites
I am tracking a blog for a client that's hosted on a subdomain, and we decided to give it its own SEOmoz campaign. My issue is that when entering competitors, if the competing blog uses a sub-directory, SEOmoz won't let me enter it, only the domain. I would assume that there has to be a way to do this, but I don't see it. I need to be able to track the competing blogs separately from their root domain. My site: blog.mysite.com. Their site: theirsite.com/blog (very different from just theirsite.com). Hopefully someone can tell me what to do here, because the metrics are basically useless if I am forced to use only the domain. Thanks so much!
Moz Pro | Ascedia
-
Is Keyword Difficulty an absolute measure, or relative to my site?
We were able to rank very well for a specific keyword. After signing up for SEOmoz, I figured out that this keyword has a difficulty of 1%. All the other similar keywords I've researched have difficulties greater than 20%. Is the difficulty relative to my site, or is it absolute?
Moz Pro | BrunoReis
-
Lots of site errors after last crawl....
Something interesting happened on the last update for my site on SEOmoz Pro tools. For the last month or so the errors on my site were very low; then on the last update I had a huge spike in errors, warnings, and notices. I'm not sure if I somehow made a change to my site (without knowing it) that caused all of these errors, or if it just took a few months to find all the errors on my site. My duplicate page content went from 0 to 45, my duplicate page titles went from 0 to 105, my 4xx (client error) count went from 0 to 4, and my missing or empty titles went from 0 to 3. In the warnings section, my missing meta description tags went from a handful to 444 (most of these appear to be archive pages). Down in the notices I have over 2,000 that are blocked by meta robots, meta-robots nofollow, and rel=canonical. I didn't have anywhere near this many prior to the last update. I just wanted to see what I need to do to clean this up, and figure out if I did something to cause all the errors. I'm assuming the red errors are the first things I need to clean up. Any help you guys can provide would be greatly appreciated. Also, if you'd like me to post any additional information, please let me know and I'd be glad to.
Moz Pro | NoahsDad
-
How to force a recrawl of a site?
Hi, I made changes to my site and would like to see the results in the crawl diagnostics. I know the crawl happens every week; however, is there a way to force a re-crawl so I don't have to wait 5 days? Cheers,
Moz Pro | nuxeo
-
Is there any way to view crawl errors historically?
One of the websites we monitor has been getting a high number of duplicate page titles. As we work through the pages, we see changes and the number of duplicate page titles decreases. Lately, however, it went up again and the duplicate page titles have increased. I wanted to ask if there's any way to view the new errors and the old errors separately, or sorted in a way that can help me identify why we are getting new page crawl errors. Any advice would be great. Thanks!
Moz Pro | TheNorthernOffice79
-
What causes Crawl Diagnostics Processing Errors in seomoz campaign?
I'm getting the following error when SEOmoz tries to spider my site:

First Crawl in Progress! Processing Issues for 671 pages Started: Apr. 23rd, 2011

Here is the robots.txt data from the site:

# Disallow all bots for image directories and JPEG files
User-agent: *
Disallow: /stats/
Disallow: /images/
Disallow: /newspictures/
Disallow: /pdfs/
Disallow: /propbig/
Disallow: /propsmall/
Disallow: /*.jpg$

Any ideas on how to get around this would be appreciated 🙂
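One way to sanity-check rules like these is Python's standard-library robots.txt parser: feed it the rules and ask which paths a compliant bot may fetch. A minimal sketch using the rules quoted above (note that `urllib.robotparser` does not implement Google's `*`/`$` wildcard extensions, so the `Disallow: /*.jpg$` line is matched only as a literal prefix here):

```python
# Hypothetical sketch: test the robots.txt rules quoted above with Python's
# standard-library parser to see which paths a compliant bot may fetch.
# Caveat: urllib.robotparser does not support Google's "*" / "$" wildcard
# extensions, so "Disallow: /*.jpg$" is treated as a literal prefix here.
import urllib.robotparser

RULES = """\
User-agent: *
Disallow: /stats/
Disallow: /images/
Disallow: /newspictures/
Disallow: /pdfs/
Disallow: /propbig/
Disallow: /propsmall/
Disallow: /*.jpg$
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("*", "/index.html"))    # allowed: no rule matches this path
print(rp.can_fetch("*", "/images/a.jpg"))  # blocked by Disallow: /images/
```

If the crawler in question honors these rules, anything under the disallowed directories simply won't be fetched, which can show up as processing issues for those pages.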
Moz Pro | cmaddison