Google Search Results...
-
I'm trying to download all the Google search results for my company (site:company.com), but the most I can get is 100. I tried using SEOquake, but it also caps at 100.
The reason for this? I would like to see which pages are indexed. The www pages and subdomain pages should only make up about 7,000, but the search results show 23,000. I would like to see what the rest of the 23,000 are.
Any advice on how to go about this? I can check subdomains individually (site:www.company.com, site:static.company.com), but I don't know all the subdomains.
Has anyone cracked this? I tried using a scraper tool, but it was only able to retrieve 200.
-
I see. If you have some idea of which sections of your site might be in there that you don't want, you can use site:company.com inurl:whatever to narrow it down. You should know the file path or query parameter used for your search and shop pages, and you can put that name after the inurl: modifier.
-
The goal is to identify which pages Google is indexing and whether there are any it shouldn't be. (We don't index search pages, and we don't index basket or checkout pages.)
I don't know all of the subdomains, and searching them individually doesn't add up to the total result count I get for site:company.com.
My Moz reports don't show duplicate pages, so it can't be that. If I were able to download the full Google search results into a spreadsheet, I could quickly filter and see which pages are being indexed that shouldn't be.
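Once you do have an export of indexed URLs (from a scraper tool, or a Search Console export), filtering for pages that shouldn't be indexed is quick. A minimal sketch, assuming a plain list of URLs and the disallowed sections mentioned above (search, basket, checkout); the file name and patterns are placeholders to adapt:

```python
import csv

# Path patterns that should never be indexed (assumed from the thread:
# search, basket, and checkout pages -- adjust to your own site).
DISALLOWED = ("/search", "/basket", "/checkout")

def unexpected_urls(urls):
    """Return the URLs that match a disallowed pattern."""
    return [u for u in urls if any(p in u for p in DISALLOWED)]

# In real use, read one URL per row from a hypothetical export file:
# with open("indexed_urls.csv") as f:
#     urls = [row[0] for row in csv.reader(f)]
urls = [
    "https://www.company.com/products/widget",
    "https://www.company.com/search?q=widget",
    "https://www.company.com/basket",
]
print(unexpected_urls(urls))
# -> ['https://www.company.com/search?q=widget', 'https://www.company.com/basket']
```

From there a spreadsheet filter does the same job; the point is that the comparison is mechanical once the export exists.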
-
OK, but what's your goal with this? And why don't you know the subdomains you've created yourself? It seems like you could work backwards from a better starting point with that information.
-
My GA is focused on a single domain only, as the subdomains hold just PDFs, images, etc. Traffic reports from GA are focused on www.company.com pages.
The only way I can know exactly which URLs have been indexed seems to be going through the Google search results, but they cap out after seven pages.
-
Hi Cyto. Why don't you try exporting pages receiving google/organic visits from Google Analytics, using Landing Page as a secondary dimension? It won't be all-inclusive, but it will give you a good idea of which pages are indexed and drawing in visitors. You can then compare that data against your sitemaps.
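To make that comparison concrete: pull the <loc> entries out of a sitemap and diff them against the landing-page export. A standard-library sketch, with hypothetical in-memory data standing in for the real sitemap fetch and GA export:

```python
import xml.etree.ElementTree as ET

# Sitemap elements live in this XML namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> entries from a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def not_in_sitemap(landing_pages, sitemap):
    """Landing pages drawing organic visits that the sitemap doesn't list."""
    return sorted(set(landing_pages) - sitemap)

# Hypothetical data standing in for the real files:
xml_text = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.company.com/</loc></url>
  <url><loc>https://www.company.com/about</loc></url>
</urlset>"""
ga_landing_pages = ["https://www.company.com/", "https://www.company.com/old-page"]
print(not_in_sitemap(ga_landing_pages, sitemap_urls(xml_text)))
# -> ['https://www.company.com/old-page']
```

Anything that shows up in this diff is a page getting organic visits that you never submitted, which is exactly the mystery inventory the thread is trying to surface.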
Related Questions
-
Silo structure in the eyes of Google?
Does a silo structure have a positive effect on Google rankings or not? How important is internal linking, and how does Google treat content with heavy internal linking compared to content with less? I'm running an experiment: I do a lot of internal linking on Website Unionwell compared to Website B (which has noticeably fewer internal links). In your SEO experience, which site will gain traffic more rapidly?
Intermediate & Advanced SEO | saimkhanna
After a hack and remediation, thousands of URLs still appear as "Valid" in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but the Search Console reports give me no way to find out why the jump happened or which new URLs were added; the only sort mechanism is "last crawled," and they don't show up there. How long can I expect it to take for these remaining URLs to be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | rickyporco
Change of Address in Google Search Console
I have merged domains before, and it went rather smoothly following the Moz guide: https://moz.com/blog/save-your-website-with-redirects. I've got a new challenge ahead of me, though, in that a client is buying the blog subdirectory associated with another domain. It's the blog only, not the complete domain, so a change of address for just a site section doesn't exist. I believe the course of action will be the same, except we'll skip the change-of-address step, since the original owner wants to maintain the TLD. Part of the contract is that we'll get the content, which will be ported over to our domain, and he'll maintain the 301s as requested and in perpetuity. Our domain is not brand new and has some credible links. Has anyone encountered a transition of a partial domain before? Thanks for your help/suggestions.
Intermediate & Advanced SEO | seoaustin
Mobile Search Results Include Pages Meant Only for Desktops/Laptops
When I put in site:www.qjamba.com on a mobile device, it comes back with some of my mobile-friendly pages for that site (same URL for mobile and desktop, just different formatting), and that's great. HOWEVER, it also shows a whole bunch of pages (not identified by Google as mobile-friendly) that are fine for desktop users but aren't supposed to exist for mobile users, because they are too slow. Until a few days ago those pages were being redirected to the home page for mobile users; I have since changed that to 404 Not Founds. Do we know whether Google keeps a mobile index separate from the desktop index? If so, I would think the 404s should work. How can I test whether the 404 Not Founds will remove a URL so it DOESN'T appear on a mobile device (via site:www.qjamba.com or a user search) but DOES appear on a desktop for the same command?
Intermediate & Advanced SEO | friendoffood
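One way to see roughly what Google's mobile crawler sees is to request the page yourself with Googlebot's smartphone user-agent and check the status code. A standard-library sketch; the UA string follows the form Google has documented for its smartphone crawler, but verify it against their current documentation before relying on it:

```python
import urllib.error
import urllib.request

# Googlebot's smartphone user-agent (as documented by Google; the Chrome
# version component changes over time).
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def mobile_request(url):
    """Build a request that identifies itself as the mobile crawler."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

def status_for(url):
    """Fetch a URL as the mobile crawler and report the HTTP status code."""
    try:
        with urllib.request.urlopen(mobile_request(url)) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 404s arrive here rather than as a normal response

# Example (needs network access, so left commented out):
# print(status_for("http://www.qjamba.com/some-desktop-only-page"))
```

If the desktop-only pages return 404 to this user-agent, Google will eventually see the same thing on its mobile crawls; this only confirms what is served, not how quickly the index updates.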
Search Refinement URLs
My site uses search refinement, and I am concerned about the URL gaining additional parameters when results are refined. My current URL is http://www.autopartscheaper.com/Air-Conditioning-Heater-Parts-s/10280.htm, and when someone chooses their specific year, make, and model, it changes to http://www.autopartscheaper.com/Air-Conditioning-Heater-Parts-s/10280.htm?searching=Y&Cat=10280&RefineBy_7371=7708. Will this negatively affect SEO for this URL? Will the URL be counted twice? Any help would be great!
Intermediate & Advanced SEO | BrandLabs
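If the refined variants should consolidate to the base page, the usual remedy is a rel="canonical" tag pointing at the parameter-free URL. A small sketch of deriving that base URL from a refined one (illustrative only; the canonical tag itself would live in the page templates, not in Python):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_refinements(url):
    """Drop the query string so refined URLs collapse to the base page."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

refined = ("http://www.autopartscheaper.com/Air-Conditioning-Heater-Parts-s/"
           "10280.htm?searching=Y&Cat=10280&RefineBy_7371=7708")
print(strip_refinements(refined))
# -> http://www.autopartscheaper.com/Air-Conditioning-Heater-Parts-s/10280.htm
```

With a canonical in place, the parameterized variants are treated as duplicates of the base page rather than competing with it.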
How to Improve or Recover Google Image Search Performance (Queries, Impressions, Clicks)?
I want to improve or recover Google Image Search performance for my eCommerce website. My website was performing well in Google Image Search, but performance has dropped since I implemented a CDN for all images. Before the CDN, my image path was: http://www.vistastores.com/media/catalog/product/cache/1/image/265x/9df78eab33525d08d6e5fb8d27136e95/1/0/10133_1.jpg. After the CDN, my image path is: http://lghttp.11720.nexcesscdn.net/805298/images/media/catalog/product/cache/1/image/900x800/9df78eab33525d08d6e5fb8d27136e95/1/0/10133_1.jpg. I can see that Google Image Search performance has gone down since the CDN was set up, because all images are now served from an external server. So, how can I recover Google Image Search performance after moving to the CDN, or does anyone have ideas to improve it?
Intermediate & Advanced SEO | CommercePundit
Alexa site title shows as "302 Found" on search result pages
If you search for the site "ixl.com" in Alexa, for some reason it shows the site as "302 Found" instead of showing the website name, IXL. If you drill into that, it shows the site as ixl.com, but underneath that it says "302 Found" again. Every other site I search for seems to show the site's name properly. I have no idea where it's getting this "302 Found" from. Does anyone know how to fix this? Here's a link directly to the search results page: http://www.alexa.com/search?q=ixl.com
Intermediate & Advanced SEO | john4math
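A "302 Found" title usually means a crawler hit a 302 redirect and recorded the redirect's default status page rather than the final document. Walking the redirect chain hop by hop shows what such a crawler sees. This sketch takes an injectable fetch function so it can run without the network; the responses below are hypothetical and stand in for real HTTP calls made with auto-following disabled:

```python
def redirect_chain(url, fetch, max_hops=10):
    """Follow redirects one hop at a time.

    `fetch` takes a URL and returns (status_code, location_header_or_None);
    in real use it would wrap an HTTP client configured NOT to auto-follow
    redirects, so each intermediate status is visible.
    """
    hops = []
    for _ in range(max_hops):
        status, location = fetch(url)
        hops.append((url, status))
        if status in (301, 302, 303, 307, 308) and location:
            url = location
        else:
            break
    return hops

# Hypothetical responses illustrating a 302 in front of the homepage:
responses = {
    "http://ixl.com/": (302, "http://www.ixl.com/"),
    "http://www.ixl.com/": (200, None),
}
print(redirect_chain("http://ixl.com/", responses.__getitem__))
# -> [('http://ixl.com/', 302), ('http://www.ixl.com/', 200)]
```

If the bare domain answers with a 302 rather than a 301 (or rather than serving the page), that intermediate response is the likely source of the stray title.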
De-indexing search results: noindex, follow or noindex, nofollow?
If search results pages were not originally blocked with robots.txt and now need to be de-indexed, is it better to use noindex, follow or noindex, nofollow?
Intermediate & Advanced SEO | nicole.healthline
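Whichever directive you choose, it helps to verify what your search pages actually serve. A minimal standard-library sketch (the page markup below is hypothetical) that extracts the meta robots directive from a page's HTML:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of <meta name="robots" ...> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def robots_directives(html):
    """Return every meta robots directive found in the HTML."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

# A search-results page tagged for de-indexing while still passing link equity:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(robots_directives(page))
# -> ['noindex, follow']
```

Note that the tag only works if crawlers can actually fetch the page, which is why the robots.txt question in the post matters: a page blocked in robots.txt never has its noindex read.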