Sitemap generator only finding part of my website's URLs
-
Hi everyone,
When creating my XML sitemap, the generator is only able to detect a portion of the website. I am missing at least 20 URLs (blog pages plus newly created resource pages). I have checked those missing URLs: all of them are indexed, and none are blocked by robots.txt.
Any idea why this is happening? I need to make sure all the URLs I want end up in the XML sitemap.
Thanks!
-
Gaston,
Interestingly enough, by default the generator located only half of the URLs. I hope that one of those two fields will do the trick.
-
Hi Taysir,
I've never used that service, but I suspect the section you refer to should do the trick.
I assume you know how many URLs there are on the whole site, so you can compare what pro-sitemaps.com finds against your own numbers. Best of luck!
GR -
Thanks for your response, Gaston. These pages are definitely not blocked by the robots.txt file, so I think it is an internal linking problem. I actually subscribed to pro-sitemaps.com and was wondering if I should use this section to add the remaining sitemap URLs that are missing: https://cl.ly/0k0t093f0Y1T
Do you think this would do the trick?
-
Google provides a basic template, so you could even build the sitemap manually if you wished, and Google also lists several dozen open-source sitemap generators.
If Google Webmaster Tools can't fully read the one you generated, an alternate generator may well fix that for you. Good luck!
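For reference, the "basic template" mentioned above is just the sitemaps.org XML format. A minimal, hypothetical sketch in Python that builds one by hand (the URLs are placeholders, not taken from this thread):

```python
# Hypothetical sketch: build a minimal XML sitemap following the
# sitemaps.org protocol. The URLs passed in are placeholder examples.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string for an iterable of absolute URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

print(build_sitemap(["https://example.com/", "https://example.com/blog/post-1"]))
```

The protocol also supports optional tags like `<lastmod>` per URL, but `<loc>` alone is enough for a valid sitemap.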
-
Hi Taysir!
Have you tried any other crawler to check whether those pages can be found?
I'd strongly suggest the Screaming Frog SEO Spider; the free version allows up to 500 URLs. It also has a feature to create sitemaps from the crawled URLs, though I don't know if that's available in the free version.
Here's some info about that feature: XML sitemap generator - Screaming Frog
The usual reasons pages aren't findable are:
- Poor internal linking (orphan pages with no links pointing to them)
- Not having a sitemap yet (which is how gaps like this come to light)
- Resources blocked in robots.txt
- Pages blocked with a robots meta tag
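The internal-linking point above is testable: a sitemap generator can only discover pages reachable by following links, so any page nothing links to will be missed. A hypothetical sketch of that check, using an in-memory dict of pages in place of real HTTP fetches (a real audit would fetch each URL):

```python
# Hypothetical sketch: find pages a link-following crawler cannot reach,
# which is the usual reason a sitemap generator misses URLs.
# The `site` dict stands in for fetched pages; keys are URLs, values are HTML.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def reachable(site, start):
    """Breadth-first search over internal links from `start`; returns the
    set of URLs a link-following crawler can discover."""
    seen, queue = {start}, [start]
    while queue:
        page = queue.pop(0)
        parser = LinkExtractor()
        parser.feed(site.get(page, ""))
        for link in parser.links:
            if link in site and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

site = {
    "/": '<a href="/blog">Blog</a>',
    "/blog": '<a href="/">Home</a>',
    "/orphan": "No links point here",  # unreachable by crawling
}
print(sorted(set(site) - reachable(site, "/")))  # → ['/orphan']
```

Pages that show up in that difference are exactly the ones a crawler-based sitemap generator will silently skip.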
That being said, it's completely normal for Google to have indexed pages that you can't find in an ad-hoc crawl; Googlebot could have found those pages via external links.
Also keep in mind that blocking pages in robots.txt will not prevent them from being indexed, nor will adding blocking rules deindex pages that are already indexed (a robots meta noindex tag does remove pages, but only if the page can still be crawled). Hope it helps.
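As a quick aside, you can spot-check whether robots.txt actually blocks a given URL with Python's standard-library parser; the rules and URLs below are illustrative, not from the site in question:

```python
# Sketch: check robots.txt rules against specific URLs using the stdlib
# parser. The Disallow rule and URLs here are made-up examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/blog/post-1"))    # True: crawlable
print(rp.can_fetch("*", "https://example.com/private/page"))   # False: blocked
```

This only tells you whether a crawler may fetch the page; as noted above, a blocked URL can still end up indexed if external links point at it.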
Best luck
GR