How can I best find out which URLs from large sitemaps aren't indexed?
-
I have about a dozen sitemaps containing just over 300,000 URLs in total. These have been carefully built to include only the content that I feel is above a certain quality threshold.
However, Google says it has only indexed 230,000 of those URLs. Now I'm wondering: how can I best go about working out which URLs haven't been indexed? No errors are showing in WMT for these pages.
I can obviously start checking them manually, but surely there's a better way?
-
There's no obvious function in Webmaster Tools, but having a look around, there's this option:
http://www.aspfree.com/c/a/BrainDump/Extracting-Google-Indexed-Web-Site-Pages-Using-MS-Excel/
But Google will only display the first 1,000 URLs for a site: query, so you would need to run it many times over. From the looks of it, there's no easy way.
There may be a tool out there similar to Xenu that also checks index status in Google. I've never had the need for one, so I'm not aware of any, but chances are something like that exists.
Good luck!
-
Any ideas on how to go about exporting indexed URLs?
-
Hi Peter,
I'd attempt some sort of export of both the indexed URLs and the actual sitemap URLs into an Excel file, then try to match and remove the duplicates - whatever is left unmatched hasn't been indexed.
You would need to look into it, but I'm sure there's a way of matching and removing duplicates.
Other than that, I wouldn't know.
Ben
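For what it's worth, the matching step described above doesn't have to happen in Excel. If you can get both lists into files - the sitemap URLs on one side, and whatever indexed URLs you've managed to export on the other - a short script can do the set difference. A rough sketch in Python; the filenames are placeholders, and it assumes plain sitemap files rather than a sitemap index:

```python
# Rough sketch: parse a sitemap file, load an exported list of indexed
# URLs, and print whatever appears in the sitemap but not in the index.
# "sitemap.xml" and "indexed.txt" are placeholder filenames.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(path):
    """Return the set of <loc> URLs listed in a sitemap file."""
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.getroot().iterfind("sm:url/sm:loc", NS)}

def unindexed(sitemap_path, indexed_path):
    """URLs present in the sitemap but missing from the indexed list."""
    with open(indexed_path) as f:
        indexed = {line.strip() for line in f if line.strip()}
    return sorted(sitemap_urls(sitemap_path) - indexed)

# Usage, once both files exist:
#   for url in unindexed("sitemap.xml", "indexed.txt"):
#       print(url)
```

With a dozen sitemaps you would run `sitemap_urls` over each file and union the sets; the hard part remains getting a reliable export of indexed URLs in the first place.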
-
Related Questions
-
Sitemaps, 404s and URL structure
Hi All! I recently acquired a client and noticed over 1,300 404s in Search Console, all starting around late October this year. What's strange is that I can access the pages that are 404ing by cutting and pasting the URLs, and via inbound links from other sites.
I suspect the issue might have something to do with sitemaps. The site has 5 sitemaps, generated by the Yoast plugin. 2 sitemaps seem to be working (pages being indexed); 3 sitemaps seem not to be working (pages have warnings and errors, and nothing shows up as indexed). The pages listed in the 3 broken sitemaps seem to be the same pages giving the 404 errors.
I'm wondering if the auto-generated URL structure might be the culprit here. For example, one sitemap that works is called newsletter-sitemap.xml, and all the URLs listed follow the structure http://example.com/newsletter/post-title. Whereas one sitemap that doesn't work is called culture-event-sitemap.xml; here the URLs follow the structure http://example.com/post-title. Could it be that these URLs are not being crawled / found because they don't follow the structure http://example.com/culture-event/post-title? If not, any other ideas?
Thank you for reading this long post and helping out a relatively new SEO!
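One way to narrow a problem like this down is to fetch every URL in the suspect sitemaps yourself and record the status codes, which shows whether the 404s are real for any client or specific to Googlebot. A rough sketch using only Python's standard library; the sitemap URL in the usage comment is the example one from the question:

```python
# Sketch: pull the URLs out of a sitemap and report any that return 404.
# Standard library only; run against each suspect sitemap in turn.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_from_sitemap(xml_text):
    """Extract the <loc> URLs from sitemap XML."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iterfind("sm:url/sm:loc", NS)]

def status_of(url):
    """Return the HTTP status code a plain GET of url comes back with."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# Usage (needs network access):
#   xml_text = urllib.request.urlopen("http://example.com/culture-event-sitemap.xml").read()
#   for url in urls_from_sitemap(xml_text):
#       if status_of(url) == 404:
#           print("404:", url)
```

If the pages come back 200 here but Search Console still reports 404s, the sitemap entries themselves (old or malformed URLs) become the more likely suspect.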
-
Site's IP showing in WMT 'Links to My Site'
I have been going through, disavowing spam links in WMT, and one of my biggest referral sources is our own IP address:
Site: Covers.com
IP: 208.68.0.72
We have recently fixed a number of 302 redirects, but the number of links actually seems to be increasing. Is this something I should ignore / disavow / fix with a redirect?
-
Google Webmaster Tools: sitemap submitted vs indexed vs Index Status
I'm having an odd error I'm trying to diagnose. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps we have 763 submitted but only 134 indexed. The submitted and indexed counts were virtually the same, around 750, until 15 days ago, when the indexed count dipped dramatically.
Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and running Screaming Frog on the site gave similar results: very few duplicates. Our actual content should be around 950 pages, counting all the category pages. What's going on here?
-
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi Guys, I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools, which I am trying to avoid if possible.
I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure of the best approach. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are 2 approaches I was considering:
1. Include the dynamic product URLs in the same sitemap as the static URLs, using just http://www.xyz.com/products/ - that way spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR
2. Create a separate, automated sitemap that updates whenever a product is updated, with the change frequency set to hourly - that way spiders always have as close to an up-to-date sitemap as possible when they crawl.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
-
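A sketch of what option 2 from the question above might look like: regenerating the product sitemap from whatever products are currently live, with an hourly change frequency. The URLs are the examples from the question; a real version would read them from the product database each time it runs.

```python
# Rough sketch: build a product sitemap with an hourly changefreq.
# Product URLs are illustrative; pull them from the product database.
import xml.etree.ElementTree as ET
from datetime import date

def build_sitemap(urls):
    """Return sitemap XML for the given product URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
        # Hourly, because products appear and sell out every couple of hours.
        ET.SubElement(entry, "changefreq").text = "hourly"
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

products = [
    "http://www.xyz.com/products/product1-is-really-cool",
    "http://www.xyz.com/products/product2-is-even-cooler",
]
print(build_sitemap(products))
```

Note that changefreq is only a hint to crawlers, not a guarantee of hourly recrawling; the real benefit of regenerating is that sold-out URLs drop out of the sitemap instead of accumulating as 404s.
-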
What's the best canonicalization method?
Hi there - is there a canonicalization method that is better than others? Our developers have used the
-
Why Can't I Get on Google?
I've employed many of SEOmoz's suggestions and am getting a Grade "A" on a particular keyword. I'm now #4 on Yahoo and Bing. However, my site hasn't cracked the top 50 on Google. Why? I see a similar pattern with other keywords: many rank on Yahoo and Bing, but only a few of my subpages get to #45-48 on Google. Any ideas? http://www.gospelebooks.net
-
Does it matter that our cached pages aren't displaying styles?
We've got pages that, when I search for them in Google and click on Cache, show NO styles - nothing from the CSS. Is there any way that could affect rankings? I don't think so, but it does fall into the category of showing one thing to the bots and another to the user, which is bad. Also, could blocking /scripts in robots.txt be preventing bots from accessing the CSS? Thanks
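On the robots.txt point: yes, a `Disallow: /scripts` rule can stop Googlebot fetching any CSS stored under that path, which would explain the unstyled cache view. A hypothetical robots.txt that keeps the folder blocked but lets crawlers fetch the stylesheets and scripts (note that `Allow` and the `*` wildcard are extensions honoured by Googlebot, not part of the original robots.txt standard):

```
User-agent: *
# For Googlebot, the longest matching rule wins, so these Allow
# lines override the broader Disallow below for .css and .js files.
Allow: /scripts/*.css
Allow: /scripts/*.js
Disallow: /scripts/
```

The blocked-resources and fetch-and-render reports in Webmaster Tools will confirm whether the CSS is actually being blocked on your site.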
-
I have a penalized site and don't know what the cause is
I have a site which appears to have a Google indexation penalty; according to Google, it's violating the T&Cs. Here is some background on the site:
The site is an online poker + deposit methods site on a .co.uk TLD. It has 30+ uniquely written pages and no advertising at the moment.
On June 10, 2010, I bought this site from a fellow webmaster/affiliate. After the site's ownership changed I tried accessing the server, but I couldn't log into it. I noticed that the host had serious problems and the IP was unreachable. After trying for some time, the previous owner sent me all the content in Word files, and I created a new hosting account and re-launched the site on June 28.
Between a couple of days after June 10 and June 28, the site was unreachable and completely de-indexed from Google. When I re-launched the site, I used the default WordPress Twenty Ten template and created new pages from the Word files I had received. I waited a bit, but noticed the site didn't get re-indexed. So on August 18th I moved the content of domain xxx.com to yyy.co.uk/xxx/ and 301-ed all the former locations, hoping that this might help yyy.co.uk get indexed... but nothing.
On October 28, 2010 I submitted my first reconsideration request, which was processed on November 17th without any change. At that time Google didn't say whether anything was wrong, like it does now, so I just waited... and waited... and waited some more. At some point I was ready to let this one go, as I didn't/don't see any problems with it. In fact, it used to be indexed before.
By now, I have removed all the links pointing to it that I had control of, and there are hardly any left. The site doesn't have any outgoing links left either, so that can't be it. I also removed a near-duplicate, keyword-heavy menu from the sidebar, as well as the widgets from the footer. Finally, I fixed a problem caused by the Yoast WordPress SEO plugin, but I only installed that plugin recently, so it could not have caused the penalty.
So after another reconsideration request, Google again let me know this site still has issues, but I really have no clue which, or how to find out. I don't feel like doing any more work on this site, as there is no guarantee it will ever lose its penalty. What should I do now?