How can I best find out which URLs from large sitemaps aren't indexed?
-
I have about a dozen sitemaps with a total of just over 300,000 URLs in them. These have been carefully created to include only the content that I feel is above a certain quality threshold.
However, Google says it has only indexed 230,000 of these URLs. Now I'm wondering: how can I best work out which URLs they haven't indexed? No errors are showing in WMT related to these pages.
I could obviously start checking them manually, but surely there's a better way?
-
There's no obvious function in Webmaster Tools, but having a look around there's this option:
http://www.aspfree.com/c/a/BrainDump/Extracting-Google-Indexed-Web-Site-Pages-Using-MS-Excel/
However, Google will only display the first 1,000 URLs for a site: query, so you would need to run it many times over. From the looks of it, there's no easy way.
There may be a tool out there similar to Xenu that also checks index status in Google. I've never had the need for one myself, so I'm not aware of any, but chances are something exists.
Good luck!
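If you want to avoid checking by hand, a first step is simply getting all of the sitemap URLs into one list you can work with. A minimal sketch, assuming standard sitemap XML files (the example.com URLs and the inline sample are hypothetical, for illustration only):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Return every <loc> value from a standard sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Example with an inline sitemap fragment (in real use, read each sitemap file):
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/page-a</loc></url>
  <url><loc>http://www.example.com/page-b</loc></url>
</urlset>"""

print(urls_from_sitemap(sample))
```

You could then spot-check batches of these against site: queries, or compare the full list against whatever export of indexed URLs you manage to put together.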
-
Any ideas on how to go about exporting indexed URLs?
-
Hi Peter,
I'd attempt some sort of export of both the indexed URLs and the actual sitemap URLs into an Excel file and remove the duplicates; whatever remains unmatched is what hasn't been indexed.
You would need to look into it, but I'm sure there's a way of matching and removing duplicates.
Other than that, I wouldn't know.
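The matching-and-removing-duplicates step described above is essentially a set difference, which is easy to script if Excel gets unwieldy at 300,000 rows. A rough sketch with hypothetical data; in practice each set would be loaded from an exported file, one URL per line:

```python
# Hypothetical URL lists; in practice, load each from an exported file.
sitemap_urls = {
    "http://www.example.com/page-a",
    "http://www.example.com/page-b",
    "http://www.example.com/page-c",
}
indexed_urls = {
    "http://www.example.com/page-a",
    "http://www.example.com/page-c",
}

# Set difference: URLs present in the sitemaps but missing from the index export.
not_indexed = sorted(sitemap_urls - indexed_urls)
print(not_indexed)
```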
Ben
Related Questions
-
Redirects and sitemap aren't showing
We had a malware hack and spent 3 days trying to get Bluehost to fix things. Since they made changes, 2 things are happening: 1. Our .xml sitemap cannot be created (https://www.caffeinemarketing.co.uk/sitmap.xml); we have tried external tools. 2. We had 301 redirects from the http (www and non-www versions) and the https (non-www version) throughout the whole website to https://www.caffeinemarketing.co.uk/ and subsequent pages. While the redirects seem to be happening, when you go into tools such as https://httpstatus.io, every version of every page returns a 200 code only, whereas before they were showing the 301 redirects. Have Bluehost messed things up? Hope you can help, thanks.
Technical SEO | Caffeine_Marketing
-
What is the best way to block a domain from getting indexed?
We have a website (http://www.example.co.uk/) which leads to another domain (https://online.example.co.uk/) when a user clicks what, in this case, we can assume to be the Apply Now button on one of our pages. We are getting metadata issues in crawler errors from the https://online.example.co.uk/ domain, as we are not targeting any meta content on that particular domain. So we are looking to block this domain from getting indexed to clear these errors. If we use a noindex tag on this domain, does that affect its SERPs?
Technical SEO | Prasadgotteti
-
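One common approach to the question above (a sketch, not a definitive fix, and assuming the subdomain runs on Apache with mod_headers available) is to serve a noindex X-Robots-Tag header from the online.example.co.uk host only:

```apache
# Place in the online.example.co.uk vhost config or .htaccess only,
# never on the main www site (requires mod_headers):
Header set X-Robots-Tag "noindex"
```

Note that a robots.txt Disallow alone blocks crawling, not indexing; for a noindex directive to be seen at all, the pages must remain crawlable.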
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc and the URL that's used in the robots.txt indicates that it's a compressed file. For example, "sitemap.xml.gz" -- thanks!
Technical SEO | jgresalfi
-
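For what it's worth, the same compressed URL is also what you'd reference in a robots.txt Sitemap line; a hypothetical example:

```
# robots.txt at the root of www.example.com (hypothetical host):
Sitemap: http://www.example.com/sitemap.xml.gz
```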
How to handle pages I can't delete?
Hello Mozzers, I am using WordPress and I have a small problem. There are two pages I don't want, but the dev of the theme told me I can't delete them: /portfolio-items/ and /faq-items/. The dev said he can't find a way to delete them because these pages just list FAQ/portfolio posts. I don't have any of these posts, so basically what I have are two pages with just the titles "Portfolio Items" and "FAQ Items". Furthermore, the dev said these pages are auto-generated, so he can't find a way to remove them. I mean, I don't believe that it's impossible, but if it is, how should I handle them? They are indexed by search engines; should I remove them from the index and block them in robots.txt? Thanks in advance.
Technical SEO | grobro
-
Best way to redirect friendly URL in direct mail ?
Hi, when we do direct mail to our customers about a specific product we sell, we usually put a link in the letter so the customer can go directly to the product just by typing a short URL, something like:
www.example.com/blue-widget
This link will then redirect to:
www.example.com/shop/product/brand-name-big-blue-widget-with-green-ends-200m-50diameter.php
We are happy with this at the moment, but I want to check we are doing it correctly in terms of redirects. We currently redirect it using .htaccess like:
Redirect /blue-widget http://www.example.com/shop/product/brand-name-big-blue-widget-with-green-ends-200m-50diameter.php
This redirects as a 302, but should it be done as a 301? I am not sure why we did 302s to start with, but I am thinking they should be 301s; I think it might have been because the URL we were redirecting from was imaginary. Also, should we use the Redirect line in the .htaccess, or should we do each one with a RewriteRule? Thanks, BigJoe
Technical SEO | BigJoe
-
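For reference, a hedged sketch of what the 301 versions might look like, reusing the hypothetical blue-widget URLs from the question (the Redirect form needs mod_alias, the RewriteRule form needs mod_rewrite):

```apache
# mod_alias form: state the 301 explicitly (a bare Redirect defaults to 302)
Redirect 301 /blue-widget http://www.example.com/shop/product/brand-name-big-blue-widget-with-green-ends-200m-50diameter.php

# Equivalent mod_rewrite form:
RewriteEngine On
RewriteRule ^blue-widget$ http://www.example.com/shop/product/brand-name-big-blue-widget-with-green-ends-200m-50diameter.php [R=301,L]
```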
Changes to website haven't been crawled in over a month
We redesigned our website at http://www.aptinting.com a few months ago. We were fully expecting the crawl frequency to be very low because we had redesigned the website from a format that had been very static, and that probably has something to do with the problem we're currently having. We made some important changes to our homepage about a month ago, and the cached version of that page is still from April 2nd. Yet, whenever we create new pages, they get indexed within days. We've made a point to create lots of new blog articles and case studies to send a message to Google that the website should be crawled at a greater rate. We've also created new links to the homepage through press releases, guest blog articles, and by posting to social media, hoping that all of these things would send a message to Google saying that the homepage should be "reevaluated". However, we seem to be stuck with the April 2nd version of the homepage, which is severely lacking. Any suggestions would be greatly appreciated. Thanks!
Technical SEO | Lemmons
-
Forwarding URLs Have No SEO Value?
Good morning from -3°C, no paths gritted, Wetherby, UK 😞 Imagine this scenario: http://www.barrettsteel.com/ has been optimised for "steel suppliers" and "steel stockholders". After running an on-page SEO Moz report, it's recommended that the target terms should be placed in the URL, e.g. www.steel-suppliers.co.uk. Now, the organisation will not change the URL, but they think setting up a forwarding URL, e.g. registering www.steel-suppliers.co.uk to then forward to the main site, will be of benefit from an SEO perspective. But I think not. So my question is: "is a forwarding URL of no value, whereas a permanent URL (struggling for the terminology to describe the URL a site is set up with) such as www.steel-suppliers.co.uk would be of value?" Any insights welcome 🙂
Technical SEO | Nightwing
-
Switching ecommerce CMSs: best way to write URL 301s and sub pages?
Hey guys, what a headache I've been going through the last few days trying to make sure my upcoming move is near-perfect. Right now all my URLs are written like this: /page-name (all lowercase, exact, no forward slash at the end). In the new CMS they will be written like this: /Page-Name/ (with the forward slash at the end). When I generate an XML sitemap in the new ecommerce CMS, it internally lists the category pages with a forward slash at the end, just as they appear throughout the CMS. This seems sloppy to me, but I have no control over it. Is this OK for SEO? I'm worried my PR 4, well-built ecommerce website is going to lose value to small (but potentially large) errors like this. If this is indeed not good practice, is there a resource about not using the forward slash at the end of URLs in sitemaps that I can present to the community at the platform? They are usually quick to make fixes if something is not up to standard. Thanks in advance, First-Time Ecommerce Platform Transition Guy
Technical SEO | Hyrule
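Since the old and new URL styles above differ only in casing and the trailing slash, one way to handle the move is to generate the 301 lines programmatically rather than by hand. A sketch under that assumption (the paths are made up, and `new_url` is a hypothetical helper, not part of any CMS):

```python
def new_url(old_path):
    """Map an old-style path (/page-name) to the new CMS style (/Page-Name/)."""
    slug = old_path.strip("/")
    title_slug = "-".join(part.capitalize() for part in slug.split("-"))
    return "/" + title_slug + "/"

# Hypothetical old paths; in practice, read them from the old sitemap or logs.
old_paths = ["/blue-widgets", "/about-us", "/contact"]
for old in old_paths:
    # Emit one .htaccess line per page so old URLs consolidate to the new format.
    print("Redirect 301 {} {}".format(old, new_url(old)))
```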