Sitemap
-
I have a question about the links in a sitemap. WordPress works with a sitemap index that first links out to the different kinds of pages:
- pagesitemap.xml
- categorysitemap.xml
- productsitemap.xml
- etc. etc.
Those links on the first page are clickable. We have a website whose sitemap also links to the different pages, but the links are not clickable, just flat text. Is this an issue?
-
Those links are turned into hrefs (i.e. made clickable) by an XSL stylesheet that's been applied to the XML. It's purely for user convenience; it doesn't matter to search engines either way.
There is actually a massive benefit to having multiple sub-sitemaps like that, though. Once you've submitted the sitemap index to Google Search Console, it will break out the crawling and indexing report for each sub-sitemap, which means you can monitor and assess each section of your site separately. It's vastly easier to detect and fix crawl errors that way than when everything is lumped into a single sitemap.
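For illustration, here is a minimal Python sketch of how such a sitemap index might be generated; the file names mirror the WordPress-style split above and the stylesheet path is hypothetical. The xml-stylesheet line is what makes browsers render the child links as clickable hrefs, while crawlers ignore it:

```python
import xml.etree.ElementTree as ET

# Hypothetical child sitemaps, mirroring the WordPress-style split.
SUB_SITEMAPS = [
    "https://example.com/pagesitemap.xml",
    "https://example.com/categorysitemap.xml",
    "https://example.com/productsitemap.xml",
]

index = ET.Element("sitemapindex", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in SUB_SITEMAPS:
    ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = url

with open("sitemap_index.xml", "wb") as f:
    f.write(b'<?xml version="1.0" encoding="UTF-8"?>\n')
    # This stylesheet declaration is what makes browsers render the XML as a
    # clickable page; search engine crawlers simply ignore it.
    f.write(b'<?xml-stylesheet type="text/xsl" href="/sitemap.xsl"?>\n')
    f.write(ET.tostring(index))
```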
Paul
-
This isn't an issue, and most sites don't provide clickable links in their sitemap(s).
I would recommend adding your sitemap URLs to Google Search Console, though, to help Google crawl your site more efficiently.
-
Hello,
WordPress has a habit of breaking sitemaps into categories. For most sites this is unnecessary, since a single sitemap can hold up to 50,000 URLs. Each of those categories contains a smaller sitemap that works as it should; if you are worried, you can check the sitemap status in Google Search Console (formerly Webmaster Tools). One last fun fact: there are two kinds of sitemap, one for users (HTML) and one for bots, and the .xml kind is for bots. If you were curious, it's not normal to click through a sitemap, as you can see here: https://moz.com/sitemap.xml (that one also points to other sitemaps, but that's beside the point).
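To make the 50,000-URL point concrete, here's a minimal Python sketch (the sitemap URL is a placeholder) that counts the <loc> entries in a single sitemap file:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# The sitemap protocol caps a single file at 50,000 URLs (and 50MB uncompressed).
locs = root.findall("sm:url/sm:loc", NS)
print(f"{len(locs)} URLs (limit per file: 50,000)")
```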
Hope that helps.
Related Questions
-
Advice for rapidly declining ranking-- can an old indexed sitemap cause this?
Hi everyone, Today I woke up to a dramatic ranking decline (nearly 20 positions) for a client's website (eacoe.org). When I looked in Webmaster Tools, I noticed that the site was just indexed yesterday by Google (a request the webmaster had submitted back in April of this year). Would this re-indexing event have caused the sharp decline? In Webmaster Tools, I don't see many errors (one 404 error that we are planning to fix), and I see no Manual Actions or penalties brought up by Google against our site. My first concern is that the re-indexing led to the rank decline, but I'm not entirely sure whether I should be focusing on something else. And if it is the re-indexing, are there any recommended steps of attack? Thanks for your help! -Bruce
Technical SEO | dynedge0
-
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to work too, but it starts hanging when the sitemap file it is working on becomes too large, so it looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that isn't needed, since it comes with both of the tools mentioned above. I also know about DeepCrawl.com, but it's paid and would be very expensive at this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but that obviously doesn't make sense for me, as it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best and most time-efficient way to work on something like this? Are there any other options? Thanks.
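For what it's worth, a rough stdlib-only Python sketch of the title-scraping step might look like the following; the URL list and worker count are illustrative, and at millions of pages you would still need retries, politeness delays, and persistent storage on top of this:

```python
import re
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def fetch_title(url: str) -> tuple[str, str]:
    """Fetch a page and pull out its <title> text (empty string on failure)."""
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "title-scraper"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            # Titles live near the top of the page, so 64KB is usually enough.
            html = resp.read(65536).decode("utf-8", errors="replace")
        match = TITLE_RE.search(html)
        return url, match.group(1).strip() if match else ""
    except Exception:
        return url, ""

urls = ["https://example.com/", "https://example.com/about"]  # illustrative list
with ThreadPoolExecutor(max_workers=20) as pool:
    for url, title in pool.map(fetch_title, urls):
        print(f"{url}\t{title}")
```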
Technical SEO | blrs120
-
Sitemap issue
How can I create XML as well as HTML sitemaps for my website (both eCommerce and non-eCommerce)? Is there any script or tool that helps me make a perfect sitemap? Please suggest.
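As a minimal sketch, both formats can be generated from the same URL list; the URLs below are hypothetical, and a real site would pull them from a crawl or database:

```python
import xml.etree.ElementTree as ET
from html import escape

urls = ["https://example.com/", "https://example.com/shop/"]  # hypothetical

# XML sitemap for bots.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

# HTML sitemap for users: a plain list of links.
items = "\n".join(f'<li><a href="{escape(u)}">{escape(u)}</a></li>' for u in urls)
with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(f"<ul>\n{items}\n</ul>\n")
```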
Technical SEO | Obbserv0
-
Sitemap errors have disappeared from my Google Webmaster tools
Hi all, A week ago I had 66 sitemap errors related to hreflang in my GWT. Now all the errors are gone and it shows none. We have not done any work to fix them. I wonder if anybody has experienced the same thing: Google suddenly changing the criteria, or the way it reports errors, in Google Webmaster Tools. I would appreciate any insights from the community! Best regards, Peru
Technical SEO | SMVSEO0
-
XML Sitemap and unwanted URL parameters
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it will require time and effort which I currently don't have. I also think that having a sitemap could help us on Bing. So my question is: is it better to submit a so-so sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? Thanks!
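For reference, stripping the ref= parameter from every <loc> is a small scripting job rather than hours of manual cleanup. A hedged Python sketch (the file names are illustrative):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

tree = ET.parse("sitemap.xml")  # illustrative input file

for loc in tree.getroot().iter(f"{{{NS}}}loc"):
    parts = urlsplit(loc.text)
    # Drop the ref= tracking parameter and keep everything else.
    # (Deduplicating URLs that collapse to the same address is left out here.)
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query) if k != "ref"])
    loc.text = urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

tree.write("sitemap_clean.xml", encoding="utf-8", xml_declaration=True)
```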
Technical SEO | jfmonfette0
-
Question about construction of our sitemap URL in robots.txt file
Hi all, This is a Webmaster/SEO question. This is the sitemap URL currently in our robots.txt file: http://www.ccisolutions.com/sitemap.xml As you can see, it leads to a page with two URLs on it. Is this a problem? Wouldn't it be better to list both of those XML files as separate line items in the robots.txt file? Thanks! Dana
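Either form is valid, but for illustration, listing each file on its own Sitemap line might look like the following Python sketch; the second file name is hypothetical, since the actual child sitemap names aren't shown in the question:

```python
# A minimal sketch: each sitemap file gets its own "Sitemap:" line.
# The second file name is hypothetical; substitute the real child sitemaps.
ROBOTS_TXT = """User-agent: *
Disallow:

Sitemap: http://www.ccisolutions.com/sitemap.xml
Sitemap: http://www.ccisolutions.com/sitemap2.xml
"""

with open("robots.txt", "w") as f:
    f.write(ROBOTS_TXT)
```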
Technical SEO | danatanseo0
-
Sitemaps
Hi, I have a doubt about using sitemaps. Ours is a news site and we have thousands of articles in every section. For example, we have a section called Technology with articles going back to 1999! So the question is: how can I make the Google robot index them? Months ago, when you entered the Technology section there was a paginator without limits, but we noticed that this query consumed a lot of CPU every time a user clicked it, so we decided to limit it to 10 pages of 15 records each. Now it performs great, BUT I can see in Google Webmaster Tools that our index count decreased dramatically. The reason is simple: the bot has no way to reach older Technology articles, because we limit the query to 150 records in total. So, how can I fix this? Options: 1) leave the query without limits; 2) create a new "all tech news" button backed by a different, unlimited query paginated at (for example) 200 records per page; 3) create a sitemap that contains all the tech articles. Any ideas? Really, thanks.
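On option 3, a hedged Python sketch of splitting an archive into multiple sitemap files plus an index pointing at them (URLs, counts, and file names are all hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
CHUNK = 10_000  # well under the 50,000-per-file protocol limit

article_urls = [f"https://example.com/tech/article-{i}" for i in range(25_000)]  # hypothetical

index = ET.Element("sitemapindex", xmlns=NS)
for n, start in enumerate(range(0, len(article_urls), CHUNK), start=1):
    # Write one sitemap file per chunk of URLs.
    urlset = ET.Element("urlset", xmlns=NS)
    for u in article_urls[start:start + CHUNK]:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    ET.ElementTree(urlset).write(f"sitemap-tech-{n}.xml", encoding="utf-8", xml_declaration=True)
    # Register the chunk in the sitemap index.
    sm = ET.SubElement(index, "sitemap")
    ET.SubElement(sm, "loc").text = f"https://example.com/sitemap-tech-{n}.xml"

ET.ElementTree(index).write("sitemap-index.xml", encoding="utf-8", xml_declaration=True)
```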
Technical SEO | informatica8100
-
How do I Organize an XML Sitemap for Google Webmaster Tools?
OK, so I used an XML sitemap generator tool, xml-sitemaps.com, for Google Webmaster Tools submission. The problem is that the priorities are all out of whack. How on earth do I organize it with thousands of pages? Should I be spending hours organizing it?
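One hedged approach is to set priorities in bulk from URL depth instead of by hand; the heuristic below is purely illustrative (and note that Google has said it largely ignores the priority field anyway):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

tree = ET.parse("sitemap.xml")  # illustrative: the generator's output

for url in tree.getroot().iter(f"{{{NS}}}url"):
    loc = url.find(f"{{{NS}}}loc").text
    # Heuristic: the homepage gets 1.0, and each path segment drops priority
    # by 0.2, floored at 0.1. Purely illustrative; tune to taste.
    depth = len([seg for seg in urlparse(loc).path.split("/") if seg])
    prio = url.find(f"{{{NS}}}priority")
    if prio is None:
        prio = ET.SubElement(url, f"{{{NS}}}priority")
    prio.text = f"{max(0.1, 1.0 - 0.2 * depth):.1f}"

tree.write("sitemap_prioritized.xml", encoding="utf-8", xml_declaration=True)
```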
Technical SEO | schmeetz0