How can I make a list of all URLs indexed by Google?
-
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap.
The site should have been 3500-ish pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google.
Anyone?
(I basically want to build a sitemap with all the indexed spider-trap URLs, set up 301 redirects on those, then ping Google with the "defective" sitemap so it can see what the site really looks like and remove those URLs, shrinking the site back to around 3500 pages.)
-
If you can get a developer to create a list of all the pages Google has crawled within a date range, you can use this Python script to check whether each page is indexed:
http://searchengineland.com/check-urls-indexed-google-using-python-259773
The script uses the info: search operator to check the URLs.
You will have to install Python, Tor and Polipo for this to work. It is quite technical, so if you aren't a technical person you may need help.
Depending on how many URLs you have and how long you wait between checks, it can take a few hours.
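For reference, the core of that approach can be sketched in a few lines of Python. These function names are illustrative, not taken from the linked script, and in practice you would route the actual HTTP requests through Tor/Polipo to avoid being blocked:

```python
import urllib.parse

def build_info_query_url(page_url):
    """Build the Google search URL for an info: query on one page."""
    return "https://www.google.com/search?q=" + urllib.parse.quote("info:" + page_url)

def looks_indexed(result_html, page_url):
    """Crude check: the info: result page mentions the URL when it is indexed."""
    return page_url in result_html

# Example: the query URL you would fetch (through a proxy) for one page
print(build_info_query_url("http://example.com/some-page"))
```

The wait-between-checks part matters: without a delay (and a proxy), Google will start serving CAPTCHAs after a handful of requests.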
-
Thanks for your input, guys! I've almost landed on the following approach:
- Use this http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/ to collect batches of URLs (3-600 at a time) based on the various problem URL footprints.
- Make XML "problem sitemaps" based on above URLs
- Implement 301s
- Ping the search engines with the XML "problem sitemaps" so that they discover the changes and see what the site really looks like (ideally reducing the number of indexed pages by about 85%)
- Track SE traffic, as well as the indexed count for each URL footprint, once a week for 6-8 weeks to follow progress
- If progress is not satisfactory, then go the URL Profiler route.
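The "problem sitemap" step can be sketched with the standard library alone. Escaping matters here because spider-trap URLs often contain & parameters:

```python
from xml.sax.saxutils import escape

def build_problem_sitemap(urls):
    """Render a list of trap URLs as a minimal sitemap.xml string."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

print(build_problem_sitemap(["http://example.com/?filter=a&sort=b"]))
```

One sitemap file is limited to 50,000 URLs, so 30K trap URLs would still fit in a single file.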
Any thoughts before I go ahead?
-
URL Profiler will do this, as will the other recommended scraper sites.
-
URL Profiler might be worth checking out:
It does require that you use a proxy, since Google does not like you scraping their search results.
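For what it's worth, routing standard-library requests through a proxy looks like this (the proxy address below is a placeholder for whatever local or paid proxy you use):

```python
import urllib.request

def make_proxied_opener(proxy):
    """Build an opener that sends all HTTP(S) traffic through one proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    return urllib.request.build_opener(handler)

# e.g. a local Polipo instance forwarding to Tor
opener = make_proxied_opener("http://127.0.0.1:8123")
```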
-
I'm sorry to confirm that Google does not want everyone to know what it has in its index. We as SEOs complain about that.
It's hard to believe that you couldn't get all your pages with a scraper (it just searches and collects the SERPs).
-
I tried this and a few others: http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/. It gave me about 500-1000 URLs at a time, but involved a lot of cutting and pasting back and forth.
I imagine there must be a much easier way of doing this...
-
Well, there are some scrapers that can do that job.
To do it the right way you will need proxies and a scraper.
My recommendation is Gscraper or Scrapebox and a list of at least 10 proxies. Then just run a scrape with "site:mydomain.com" and see what you get.
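Once you have the raw site: scrape, filtering it down to just the trap URLs is straightforward. A sketch (the footprint pattern below is a made-up example; substitute your own):

```python
import re

def filter_trap_urls(urls, footprints):
    """Keep unique URLs matching any known spider-trap footprint (regex)."""
    patterns = [re.compile(fp) for fp in footprints]
    return sorted({u for u in urls if any(p.search(u) for p in patterns)})

trap = filter_trap_urls(
    ["http://example.com/product/1",
     "http://example.com/product/1?sort=price&page=97",
     "http://example.com/product/1?sort=price&page=98"],
    [r"[?&]page=\d+"],  # hypothetical footprint: paginated duplicates
)
print(trap)  # only the paginated variants survive the filter
```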
(Before buying proxies or a scraper, check whether you can get what you need with the free options.)
-
I used Screaming Frog to discover the spider trap (and more), but as far as I know I can't use it to import all the URLs that Google actually has in its index (or can I?).
A list of the URLs actually in Google's index is what I'm after.
-
Hi Sverre,
Have you tried Screaming Frog SEO Spider? Here's a link to it: https://www.screamingfrog.co.uk/seo-spider/
It's really helpful for crawling all the pages you have accessible to spiders. You might need the premium version to crawl more than 500 pages.
Also, have you checked for the common duplicate-content issues? Here's a Moz tutorial: https://moz.com/learn/seo/duplicate-content
Hope it helps.
GR.