How can I get a list of every URL of a site in Google's index?
-
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out where the difference comes from.
Is there a way to get an Excel sheet with every URL Google has indexed for a site?
Thanks... Mike
-
If this is still an issue you're facing, have you checked the sitemap settings to see which page types are being included? For example, a site with a few thousand tag pages that are excluded from the sitemap but not yet set to noindex could easily produce extra indexed pages like this.
The next step is parameterization. Is anything going on there with search URLs or product URLs, e.g. ?refid=1235134&q=search+term or ?prod=152134&variant=blue?
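If it helps to quantify that, here's a minimal Python sketch that tallies which query-parameter combinations appear most often in a list of URLs. The urls.txt input file is an assumption — e.g. a landing-page export from your analytics tool or a list pulled from server logs:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical input: one URL per line, exported from analytics or logs.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Count each combination of query-parameter names, so parameterized
# patterns (?refid=..., ?prod=...&variant=...) stand out immediately.
param_patterns = Counter()
for url in urls:
    params = sorted(parse_qs(urlparse(url).query))
    if params:
        param_patterns[tuple(params)] += 1

for params, count in param_patterns.most_common(20):
    print(count, "+".join(params))
```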
If you really want to scrape through Google, get the list of URLs from your sitemap and scrape queries like "inurl:domain.com/a", "inurl:domain.com/b", "inurl:domain.com/c", etc. This should let you dig deeper into the sitemap and see what Google really has indexed. For subfolders with tons of URLs, like domain.com/product/a, you'll want to do the same thing at the subfolder level instead of at the root.
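As a rough illustration, here's a minimal Python sketch that generates those inurl: queries from a sitemap. The local sitemap.xml path and domain.com are placeholder assumptions:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Standard sitemap namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")  # a local copy of the sitemap

# Bucket URLs by the first character of their path, so each query
# returns a small, manageable slice of the index.
prefixes = set()
for loc in tree.findall(".//sm:loc", NS):
    path = urlparse(loc.text.strip()).path.lstrip("/")
    prefixes.add(path[:1] or "/")

# For crowded subfolders like /product/, repeat this with longer
# prefixes (e.g. product/a, product/b) instead of root-level ones.
for p in sorted(prefixes):
    print(f"inurl:domain.com/{p}")
```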
-
You can do that with a tool like Scrapebox or Outwit. Go slowly, or else you'll need to use proxies to keep Google responding fast enough. As another commenter mentioned, it's probably against Google's TOS.
-
You could probably write a macro to do this, although just because you can doesn't mean you should. I don't think it's advisable, because you don't want to violate anyone's terms of use. That is never a good thing.
-
Yes, the WMT API doesn't have it. The site:xxxx.com search is where I got one of the two too-high numbers. Thanks... Mike
-
Hi Marijn,
Thanks for the suggestions. 2.5 years of GA organic landing pages comes to 10,000 URLs... half as many as the sitemap and a third as many as Google says are indexed. On scraping Google, do you know of a tool for that?
Thanks... Mike
-
Might be something you can get from the WMT API.
Also, to really see how many pages are indexed, do a site:xxxx.com search, go to the last page, include omitted results, go to the last page again, and add up how many you have. That's probably the most accurate number.
-
Hi Mike,
There are a couple of solutions, though neither provides you with 100% of the data. The best would be to export a list of landing pages from Google Analytics (or your favorite web analytics tool), segmented by organic search / Google. That gives you a list of pages that received traffic via search and so are definitely indexed. Cross-referencing them with your sitemaps might already help you out a bit; see the sketch below. Besides that, you could crawl and scrape the URLs from a site:xxx.com search.
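To make the cross-reference step concrete, here's a minimal Python sketch. The file names, the "Landing Page" column header, and the www.example.com domain are all assumptions based on a typical GA CSV export, which lists paths rather than full URLs:

```python
import csv
import xml.etree.ElementTree as ET

# Load every URL listed in the sitemap (local file path is an assumption).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").findall(".//sm:loc", NS)
}

# Load organic landing pages exported from Google Analytics as CSV.
# GA exports paths like /products/widget, so prepend the domain to
# compare like with like.
with open("ga_organic_landing_pages.csv", newline="") as f:
    ga_urls = {
        "http://www.example.com" + row["Landing Page"]
        for row in csv.DictReader(f)
    }

# Pages that received organic traffic (so are indexed) but are
# missing from the sitemap — likely part of Mike's 8,000-13,000 gap.
missing = sorted(ga_urls - sitemap_urls)
print(f"{len(missing)} indexed pages not in the sitemap:")
for url in missing:
    print(url)
```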