How can I get a list of every URL of a site in Google's index?
-
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 are indexed, and a search on Google shows 33,000. I'd like to find out what accounts for the difference.
Is there a way to get an Excel sheet with every URL Google has indexed for a site?
Thanks... Mike
-
If this is still an issue you're facing, have you checked the sitemap settings to see which page types are being included? For example, a site with a few thousand tag pages that are left out of the sitemap but not yet set to noindex could easily produce extra indexed pages like this.
The next step is parameterization. Is anything going on there with search URLs or product URLs? E.g. ?refid=1235134&q=search+term or ?prod=152134&variant=blue
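One quick way to gauge how much parameterization might be inflating the index count is to strip query strings and compare the number of raw URLs to the number of distinct paths. A minimal sketch with the standard library (the URLs below are hypothetical examples, not from the actual site):

```python
from urllib.parse import urlsplit

# Hypothetical crawl/log export: the same pages reachable under many parameter variants
urls = [
    "http://domain.com/product?prod=152134&variant=blue",
    "http://domain.com/product?prod=152134&variant=red",
    "http://domain.com/search?q=search+term&refid=1235134",
    "http://domain.com/search?q=other+term",
    "http://domain.com/about",
]

# Strip query strings to see how many distinct paths the parameters expand into
paths = {urlsplit(u).path for u in urls}
print(len(urls), "URLs collapse to", len(paths), "paths")  # 5 URLs collapse to 3 paths
```

If the collapse ratio is large, parameter handling (canonical tags or URL parameter settings in WMT) is a likely source of the surplus.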
If you really want to scrape through Google, take the URLs in your sitemap and scrape queries like "inurl:domain.com/a", "inurl:domain.com/b", "inurl:domain.com/c", etc. This should let you dig deeper into what Google really has indexed. For subfolders with tons of URLs, like domain.com/product/a, you'll want to do the same thing at the subfolder level instead of the root.
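Generating that query list from the sitemap is easy to automate (the scraping itself is likely against Google's TOS, as other commenters note). A minimal sketch, assuming a flat list of sitemap URLs and the hypothetical domain.com examples above:

```python
from urllib.parse import urlsplit

# Hypothetical sitemap URL list
sitemap_urls = [
    "http://domain.com/apple",
    "http://domain.com/banana",
    "http://domain.com/product/axe",
    "http://domain.com/product/bolt",
]

def inurl_queries(urls, domain="domain.com"):
    """Build one inurl: query per first path character, going one level
    deeper for URLs that live in a subfolder like /product/."""
    prefixes = set()
    for u in urls:
        path = urlsplit(u).path.lstrip("/")
        if "/" in path:
            folder, rest = path.split("/", 1)
            prefixes.add(f"{folder}/{rest[0]}")
        elif path:
            prefixes.add(path[0])
    return sorted(f"inurl:{domain}/{p}" for p in prefixes)

print(inurl_queries(sitemap_urls))
# ['inurl:domain.com/a', 'inurl:domain.com/b',
#  'inurl:domain.com/product/a', 'inurl:domain.com/product/b']
```

Each query then becomes one scrape, which keeps individual result sets small enough to page through completely.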
-
You can do that with a tool like Scrapebox or Outwit. Go slow; otherwise you'll need proxies to keep Google responding. As another commenter mentioned, it's probably against the TOS.
-
You could probably write a macro to do this, although just because you could doesn't mean you should. I don't think it's advisable, because you don't want to violate anyone's terms of use. That never ends well.
-
Yes, the WMT API doesn't have it. The site:xxxx.com search is where I got one of the two too-high numbers. Thanks... Mike
-
Hi Marijn,
Thanks for the suggestions. 2.5 years of GA organic landing pages is 10,000 URLs: half as many as the sitemap and a third as many as Google says are indexed. On scraping Google, do you know of a tool for that?
Thanks... Mike
-
Might be something you can get from the WMT API.
Also, to really see how many pages are indexed, do a site:xxxx.com search, go to the last page, include omitted results, go to the last page again, and add up how many you have. That's probably the most accurate number.
-
Hi Mike,
There are a couple of solutions; neither provides 100% of the data. The best would be to export a list of landing pages from Google Analytics (or your favorite web analytics tool) segmented by organic search / Google. This gives you a list of pages that received search traffic and are therefore indexed. Cross-referencing them with your sitemaps might already help you out a bit. Beyond that, you could crawl and scrape the URLs from a site:xxx.com search.
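The cross-reference itself is a simple set difference once you have both lists. A minimal sketch, assuming the sitemap follows the standard sitemaps.org schema; the inline XML and GA export here are hypothetical stand-ins for the real files:

```python
import xml.etree.ElementTree as ET

# Hypothetical inline sitemap; in practice, fetch and parse your real sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://domain.com/page-a</loc></url>
  <url><loc>http://domain.com/page-b</loc></url>
  <url><loc>http://domain.com/page-c</loc></url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
sitemap_urls = {loc.text for loc in ET.fromstring(SITEMAP_XML).iter(f"{NS}loc")}

# Hypothetical GA organic landing-page export (normally a CSV from Analytics)
ga_landing_pages = {"http://domain.com/page-a", "http://domain.com/tag/extra"}

# Pages that got organic traffic (so are indexed) but aren't in the sitemap:
unexpected = ga_landing_pages - sitemap_urls
# Sitemap pages with no recorded organic landings (possibly not indexed):
unconfirmed = sitemap_urls - ga_landing_pages
print(sorted(unexpected))
print(sorted(unconfirmed))
```

The `unexpected` set is the interesting one for this thread: pages Google is clearly indexing that the sitemap doesn't account for, such as tag or parameter pages.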