Sitemap generator only finding part of the website's URLs
-
Hi everyone,
When creating my XML sitemap, the generator is only able to detect a portion of the website. I am missing at least 20 URLs (blog pages plus newly created resource pages). I have checked those missing URLs: all of them are indexed, and none of them are blocked by robots.txt.
Any idea why this is happening? I need to make sure every URL I want ends up in the XML sitemap.
Thanks!
-
Gaston,
Interestingly enough, by default the generator located only half of the URLs. I hope one of those two fields will do the trick.
-
Hi Taysir,
I've never used that service, but I suspect the section you refer to should do the trick.
I believe you know how many URLs there are on the whole site, so you can compare how many pro-sitemaps.com finds against your own count.
Best luck!
GR
-
Thanks for your response, Gaston. These pages are definitely not blocked by the robots.txt file; I think it is an internal linking problem. I actually subscribed to pro-sitemaps.com and was wondering if I should use this section to add the remaining URLs that are missing: https://cl.ly/0k0t093f0Y1T
Do you think this would do the trick?
-
Google not only provides a basic template so you could build the sitemap manually if you wished; this link also has Google listing several dozen open-source sitemap generators.
If Google Webmaster Tools can't fully read the one you generated, an alternate generator should fix that for you.
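If you do go the manual route, here is a minimal sketch of what a generator script could look like (assuming Python 3 with the standard library only; the URLs are placeholders, and the XML structure follows the public sitemaps.org protocol):

# Minimal hand-rolled sitemap builder: writes sitemap.xml from a URL list.
# Assumes Python 3 standard library only; the URLs below are placeholders.
from xml.sax.saxutils import escape

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/my-first-post/",
    "https://www.example.com/resources/new-page/",
]

# One <url><loc>...</loc></url> entry per page, with XML special characters escaped.
entries = "\n".join(
    f"  <url>\n    <loc>{escape(url)}</loc>\n  </url>" for url in urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(f"Wrote sitemap.xml with {len(urls)} URLs")

Good luck!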
-
Hi Taysir!
Have you tried any other crawler to check whether those pages can be found?
I'd strongly suggest the Screaming Frog SEO Spider; the free version allows you up to 500 URLs. It also has a feature to create sitemaps from the crawled URLs, though I don't know if that's available in the free version.
Here's some info about that feature: XML Sitemap Generator - Screaming Frog
The usual issues behind pages not being findable are:
- Poor internal linking
- Not having a sitemap (the very thing you're sorting out here)
- Blocked resources in robots.txt
- Blocked pages with robots meta tag
That being said, it's completely normal for Google to have indexed pages that you can't find in an ad hoc crawl; Googlebot could have found those pages through external links. Also keep in mind that blocking pages with robots.txt or a robots meta tag will not prevent them from being indexed, nor will adding blocking rules deindex pages that are already in the index.
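If you want to spot-check a couple of the missing URLs yourself, here is a small sketch (assuming Python 3 with the standard library only; the URL below is a placeholder) that tests both whether robots.txt blocks a URL and whether the page carries a noindex robots meta tag:

# Spot-check a URL: is it blocked by robots.txt, and does it carry a
# noindex robots meta tag? Python 3 standard library only; placeholder URL.
import re
from urllib.parse import urlparse
from urllib.request import Request, urlopen
from urllib.robotparser import RobotFileParser

def check_url(url, user_agent="*"):
    parts = urlparse(url)

    # 1) Check the site's robots.txt rules for this URL.
    rp = RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    allowed = rp.can_fetch(user_agent, url)

    # 2) Fetch the page and look for a noindex robots meta tag.
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urlopen(req, timeout=10).read().decode("utf-8", "ignore")
    noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        html,
        re.IGNORECASE,
    )

    print(f"{url}")
    print(f"  robots.txt allows crawling: {allowed}")
    print(f"  noindex meta tag present: {bool(noindex)}")

check_url("https://www.example.com/blog/missing-page/")

Hope it helps.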
Best luck
GR