Download all GSC crawl errors: Possible today?
-
Hey guys:
I tried to download all the crawl data from Google Search Console using the API and solutions like this one: https://github.com/eyecatchup/php-webmaster-tools-downloads, but it seems it's no longer working (or I did something wrong; I just get a blank page after some load time when running the PHP file)... The last time I needed to download more than 1,000 URLs was a long time ago, so I haven't tried this method since then.
Is there any other solution using the API to grab all the crawl errors, or is this no longer possible?
Thanks!
-
Hi Antonio,
Not sure which language you prefer, but you can find some sample code here: https://developers.google.com/webmaster-tools/v3/samples. I tried the Python example, which was quite well documented inside the code; I guess it's the same for the other languages. If I have some time I could give it a try, but it won't be before the end of next week (and it would be based on Python).
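For reference, a minimal sketch against the v3 Webmasters API might look like the following. This is untested and makes some assumptions: you already have OAuth2 credentials and the google-api-python-client package set up, and the response field names (`urlCrawlErrorSample`, `pageUrl`, `last_crawled`) should be double-checked against the API reference.

```python
def flatten_error_samples(response):
    """Turn a urlcrawlerrorssamples.list() response into (url, last_crawled) rows."""
    rows = []
    for sample in response.get("urlCrawlErrorSample", []):
        rows.append((sample.get("pageUrl"), sample.get("last_crawled")))
    return rows

def fetch_not_found_samples(site_url, creds):
    """Sketch of the API call itself; needs google-api-python-client and OAuth2 creds."""
    from googleapiclient.discovery import build  # third-party package
    service = build("webmasters", "v3", credentials=creds)
    response = service.urlcrawlerrorssamples().list(
        siteUrl=site_url,      # your verified property, e.g. "http://www.example.com/"
        category="notFound",   # other categories exist (serverError, soft404, ...)
        platform="web",
    ).execute()
    return flatten_error_samples(response)
```

Note that the API returns a capped sample per category/platform rather than literally every error URL, so check the counts against what the GSC UI shows.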
Dirk
-
Thanks Dirk. At the moment I couldn't find any alternative, so maybe it would be a good idea to get hands-on with this.
If anyone else has solved this, it would be great if you could share the solution with us.
-
The script worked for the previous version of the API - it won't work on the current version.
You could search to check whether somebody else has created the same thing for the new API, or build something yourself - the API is quite well documented, so it shouldn't be too difficult to do. I built a Python script for the Search Analytics part in less than a day (without previous knowledge of Python), so it's certainly feasible.
Rgds
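The Search Analytics part mentioned above boils down to a single query call. A sketch under the same assumptions (google-api-python-client installed, OAuth2 already handled elsewhere); the body fields are the documented ones, but the dates and dimensions here are just placeholders:

```python
def build_query_body(start_date, end_date, dimensions, row_limit=1000):
    """Build the request body for searchanalytics().query()."""
    return {
        "startDate": start_date,    # "YYYY-MM-DD"
        "endDate": end_date,
        "dimensions": list(dimensions),  # e.g. ["query"], ["query", "page"]
        "rowLimit": row_limit,
    }

def top_queries(service, site_url, body):
    """Run the query and return (keys, clicks) pairs; not executed here."""
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return [(row["keys"], row["clicks"]) for row in response.get("rows", [])]
```

Usage would be something like `top_queries(service, "http://www.example.com/", build_query_body("2016-01-01", "2016-01-31", ["query"]))`.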
Dirk
Related Questions
-
Will google be able to crawl all of the pages given that the pages displayed or the info on a page varies according to the city of a user?
So the website I am working for asks for a location before displaying the product pages. There are two cities with multiple warehouses. Based on the users' location, the product pages available in the warehouse serving only in that area are shown. If the user skips location, default warehouse-related product pages are shown. The APIs are all location-based.
Intermediate & Advanced SEO | Airlift
-
Google Mobile site crawl returns poorer results on 100% responsive site
Has anyone experienced an issue where Google Mobile site crawl returns poorer results than their Desktop site crawl on a 100% responsive website that passes all Google Mobile tests?
Intermediate & Advanced SEO | MFCommunications
-
Can Google Crawl & Index my Schema in CSR JavaScript
We currently only have one option for implementing our schema: it is populated in the JSON, which is rendered by JavaScript on the client side. I've heard tons of mixed reviews about whether this will work. So, does anyone know for sure if this will or will not work? Also, how can I build a test to see whether it does?
Intermediate & Advanced SEO | MJTrevens
-
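One way to build the test asked about above: grab the rendered HTML (e.g. from Fetch and Render / the rendered-DOM view in Search Console, or a headless browser) and check whether the JSON-LD actually made it into the DOM. A stdlib-only sketch, with an illustrative HTML string standing in for the fetched page:

```python
import json
import re

def extract_json_ld(html):
    """Return every parsed JSON-LD block found in an HTML string."""
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    return [json.loads(m) for m in re.findall(pattern, html, re.DOTALL)]

# Illustrative rendered DOM; in practice, paste in what Google's renderer saw.
rendered = '<html><head><script type="application/ld+json">{"@type": "Product", "name": "Widget"}</script></head></html>'
blocks = extract_json_ld(rendered)
print(blocks[0]["@type"])  # → Product
```

If the schema shows up in the raw page source but not in the rendered DOM (or vice versa), that tells you whether the client-side rendering step is the problem.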
Crawl budget
I am a believer in this concept: showing Google fewer pages will increase their importance. Here is my question: I manage a website with millions of pages and high organic traffic (lower than before). I believe that too many pages are crawled. There are pages that I do not need Google to crawl and follow; noindex,follow does not save on the mentioned crawl budget, and deleting those pages is not possible. If I disallow those pages, I miss out on pages that help my important pages. Any advice will be appreciated.
Intermediate & Advanced SEO | ciznerguy
-
Wordpress to HubSpot CMS - I had major crawl issues post launch and now traffic is down 400%
Hi there good looking person! Our traffic went from 12k visitors in July to 3k visitors in July (www.thedsmgroup.com). When we moved our site from WordPress to the HubSpot COS (their CMS system), I didn't submit a new sitemap to Google Webmaster Tools. I didn't know that I had to... and to be honest, I've never submitted or re-submitted a sitemap to GWT. I have always built clean sites with fresh content and good internal linking and never worried about it. Yoast kind of took care of the rest, as all of my sites and our clients' sites were always on WordPress. Well, lesson learned.
I got this message on June 27th in GWT for http://www.thedsmgroup.com/: "Increase in not found errors. Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site's pages."
One month after our site launched we had 1,000 404s on our website. Ouch. Google thought we had a 1,200-page website with only 200 good pages and 1,000 error pages. Not very trustworthy... We never had a 404 before this, as we added a plugin to WordPress that would 301 any 404 to the homepage, so we never had a broken link on our site; that's not ideal for UX, but as far as Google was concerned, our site was always clean. Obviously I submitted a new sitemap to GWT a few weeks ago, and we are moving in the right direction... but have I taken care of everything I need to? I'm not sure. Our traffic is still around 100 visitors per day, not 400 per day as it was before we launched the new site. Thoughts? I'm not totally freaking out or anything, but a month ago we ranked #1 and #2 for "marketing agency nj", and now we aren't in the top 100. I've never had a problem like this.
I added a few screen grabs from Google Webmaster Tools that should be helpful. Bottom line: have I done everything I need to, or do I need to do something with all of these "not found" error details that I have in GWT? None of these "not found" pages have any value and I'm not sure how Google even found them... For example: http://www.thedsmgroup.com/supersize-page-test/screen-shot-2012-11-06-at-2-33-22-pm
Help! -Jason
Intermediate & Advanced SEO | Charlene-Wingfield
-
Are Incorrectly Set Up URL Rewrites a Possible Cause of Panda
On a .NET site, a URL rewrite was done about 2 years ago. From a visitor's perspective it seems fine, as the URLs look clean. But Webmaster Tools reports 500 errors from time to time showing /modules/categories... and /modules/products..., which are templates and how the original URLs were structured. While the developer made it look clean, I am concerned that he could have set it up incorrectly. He acknowledged that IIS 7 on a Windows server allows URL rewrites to be set up, but the site was done in another way that forces the URLs to change to their product name, so he has believed it to be okay. However, the site dropped significantly in ranking in July 2013, which appears to be a Panda penalty. In trying to figure out whether this could be a factor in why the site has suffered, I would like to know other webmasters' opinions. We have already killed many pages, removed 2/3 of the index that Google had, and are trying to understand what else it could be. Also, in doing a header check, I see that the /modules/products... page returns a 301 status. I assume this is okay, but wanted to see what others have to say about it. When I look at the source code of a product page, I see a reference to /modules/products... I'm not sure if any of this pertains, but wanted to mention it in case you have insight. I hope to get good feedback and direction from SEOs and technical folks.
Intermediate & Advanced SEO | ABK717
-
Duplicate site (disaster recovery) being crawled and creating two indexed search results
I have a primary domain, toptable.co.uk, and a disaster recovery site for it named uk-www.gtm.opentable.com. In the event of a disaster, toptable.co.uk would get CNAMEd (DNS alias) to the .gtm site. Naturally, the .gtm disaster recovery domain is an exact match of the toptable.co.uk domain. Unfortunately, Google has crawled the uk-www.gtm.opentable site, and it's showing up in search results. In most cases the gtm URLs don't get redirected to toptable; they actually appear as an entirely separate domain to the user. The strong feeling is that this duplicate content is hurting toptable.co.uk, especially as .gtm.ot is part of the .opentable.com domain, which has significant authority. So we need a way of stopping Google from crawling gtm. There seem to be two potential fixes. Which is best for this case? 1) Use robots.txt to block Google from crawling the .gtm site, or 2) canonicalize the gtm URLs to toptable.co.uk. In general Google seems to recommend a canonical, but in this special case a robots.txt change could be best. Thanks in advance to the SEOmoz community!
Intermediate & Advanced SEO | OpenTable
-
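For what it's worth, the two fixes weighed in the question above look roughly like this (host names are from the question; the page path is a placeholder). A robots.txt disallow stops crawling but can leave already-indexed URLs lingering in results, while a cross-domain canonical lets Google consolidate signals onto toptable.co.uk but only works if the duplicate pages remain crawlable:

```text
# Option 1 - robots.txt served at uk-www.gtm.opentable.com/robots.txt:
User-agent: *
Disallow: /

# Option 2 - cross-domain canonical in the <head> of each .gtm page:
<link rel="canonical" href="http://www.toptable.co.uk/some-page" />
```

Note the two options conflict: if the .gtm site is blocked by robots.txt, Google can't crawl the pages to see the canonical tags, so it's one or the other.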
Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
Hi, We are working as a middle man between our client (Website A) and another website (Website B), where Website B is going to host a section around Website A's products etc. The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them. As the middle man, we are in charge of monitoring the number of unique visitors sent through, and we are going to do this by monitoring Website A's analytics account and checking the number of unique visitors sent. The deal is worth quite a lot of money, and as the middle man we are responsible for making sure that no funny business goes on (i.e. false visitors etc.). So, to make sure we have things covered, what I would like to know is: 1) Is it actually possible to fool analytics into reporting falsely high unique visitors sent from Website B to Website A (and if so, how could they do it)? 2) What could we do to spot any potential abuse (i.e. is there an easy way to spot that these are spoofed visitors)? Many thanks in advance
Intermediate & Advanced SEO | James77