Setting A Custom User Agent in Screaming Frog
-
Hi all,
Probably a dumb question, but I wanted to make sure I get this right.
How do we set a custom user agent in Screaming Frog? I know it's in the configuration settings, but what do I have to do to create a custom user agent specifically for a website?
Thanks much!
- Malika
-
Setting a custom user agent determines things like whether HTTP/2 gets used, so there can be a big difference if you change it to something that doesn't take advantage of HTTP/2.
Apparently, HTTP/2 support is coming to Pingdom very soon, just as it is to Googlebot:
http://royal.pingdom.com/2015/06/11/http2-new-protocol/
This is an excellent example of how a user agent can change the way your site is crawled, as well as how efficient that crawl is:
https://www.keycdn.com/blog/https-performance-overhead/
From that post: "It is important to note that we didn’t use Pingdom in any of our tests because they use Chrome 39, which doesn’t support the new HTTP/2 protocol. HTTP/2 in Chrome isn’t supported until Chrome 43. You can tell this by looking at the User-Agent in the request headers of your test results."
Note: WebPageTest uses Chrome 47, which does support HTTP/2.
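If you want to check this for your own site, here is a rough sketch (assuming the third-party httpx library, which can speak HTTP/2; the URL and the user-agent string are just placeholders) that sends a custom User-Agent and reports which protocol the server negotiated:
```python
# pip install "httpx[http2]"  -- third-party library, assumed here for HTTP/2 support
import httpx

CUSTOM_UA = "MyTestCrawler/1.0 (+https://example.com/bot-info)"  # placeholder string

with httpx.Client(http2=True, headers={"User-Agent": CUSTOM_UA}) as client:
    response = client.get("https://example.com/")  # replace with the site you're testing
    # http_version is "HTTP/2" if the server negotiated it, otherwise "HTTP/1.1"
    print(response.http_version, response.status_code)
```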
Hope that clears things up,
Tom
-
Hi Malika,
Think about what Screaming Frog has to detect: to do that correctly it needs the correct user agent syntax, otherwise it won't be able to produce a crawl that would satisfy anyone.
Using proper syntax for a user agent is essential. I have tried to keep this explanation non-technical, and I hope it works.
The reason Screaming Frog needs the user agent is that the User-Agent header was added to HTTP to help web application developers deliver a better user experience. By respecting the syntax and semantics of the header, we make it easier and faster for header parsers to extract useful information that we can then act on.
Browser vendors are motivated to make websites work no matter what specification violations are made. When the developers building web applications don’t care about following the rules, the browser vendors work to accommodate that. It is only by us application developers developing a healthy respect for the standards of the web that the browser vendors will be able to start tightening up their codebases, knowing that they don’t need to account for non-conformances.
With client libraries that do not enforce the syntax rules, you run the risk of using invalid characters that many server-side frameworks will not detect. It is possible that only certain users, in particular environments, would ever hit the syntax violation, which can lead to bugs that are very difficult to track down.
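As a rough illustration (a quick pre-flight sanity check of my own, not the full RFC grammar and not something Screaming Frog or any particular library enforces), you can catch the most common invalid characters before they ever reach a server:
```python
def ua_has_obviously_invalid_chars(ua: str) -> bool:
    """Rough sanity check for a User-Agent value: flags control characters
    (including CR/LF) and non-ASCII characters, the usual culprits behind
    hard-to-trace header bugs. This is NOT a full RFC grammar check."""
    return any(ord(ch) < 0x20 or ord(ch) > 0x7E for ch in ua)

print(ua_has_obviously_invalid_chars("MyCrawler/1.0 (test run)"))     # False
print(ua_has_obviously_invalid_chars("MyCrawler/1.0\r\nX-Bad: yes"))  # True - header injection risk
```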
I hope this is a good explanation; I've tried to keep it to the point.
Respectfully,
Thomas
-
Hi Thomas,
Would you have a simpler tutorial for me? I am struggling a bit.
Thanks heaps in advance
-
I think I need something dumbed down to my level. The above tutorials are great, but not being a full-time coder, I get lost while reading them.
-
Hi Matt,
I haven't had any luck with this one yet.
-
Hi Malika! How'd it go? Did everything work out?
-
Happy I could be of help. Let me know if there's any issue and I will try to help with it. All the best.
-
Hi Thomas,
That's a lot of useful information there. I will have a go at it and let you know how it goes.
Thanks heaps!
-
Please let me know if I did not answer the question or if you have any other questions.
-
This gives you a very clear breakdown of user agents and their syntax rules; please read it:
http://www.bizcoder.com/the-much-maligned-user-agent-header
It includes the following valid example of a User-Agent that is full of special characters:
user-agent: foo&bar-product!/1.0a$*+ (a;comment,full=of/delimiters
Some more references, but pay particular attention to the first URL:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Gecko_user_agent_string_reference
Mozilla/5.0 (X11; Linux i686; rv:10.0) Gecko/20100101 Firefox/10.0
http://stackoverflow.com/questions/15069533/http-request-header-useragent-variable
-
If you formatted it correctly, see the grammar below:
User-Agent = product *( RWS ( product / comment ) )
If it was received by your headers in that form, then yes, you could fill in the blanks and test it (a quick way to check the format is sketched after the links below).
https://mobiforge.com/research-analysis/webviews-and-user-agent-strings
http://mobiforge.com/news-comment/standards-and-browser-compatibility
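To make that concrete, here is a rough format check in Python (my own simplification of the grammar above; it ignores nested comments, quoted-pairs, and tabs, and the sample strings are just placeholders):
```python
import re

# Simplified check against: User-Agent = product *( RWS ( product / comment ) )
# product = token ["/" token]; comment = "(" ... ")"
TCHAR = r"[!#$%&'*+\-.^_`|~0-9A-Za-z]"
PRODUCT = rf"{TCHAR}+(?:/{TCHAR}+)?"
COMMENT = r"\([^()]*\)"  # simplified: no nested comments, no quoted-pairs
UA_RE = re.compile(rf"^{PRODUCT}(?:\s+(?:{PRODUCT}|{COMMENT}))*$")

def looks_like_valid_ua(ua: str) -> bool:
    """True if the string fits the simplified product/comment grammar."""
    return UA_RE.fullmatch(ua) is not None

print(looks_like_valid_ua("Crawler-access/2.0 (custom test)"))  # True
print(looks_like_valid_ua("Crawler access V2"))                 # True - three bare product tokens
print(looks_like_valid_ua("bad ua with spaces, and commas"))    # False - comma is not a token character
```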
-
No, you cannot just put anything in there. The site has to be able to recognize it, or it will wonder why you are doing this.
Below I have listed how to build a user agent, some already-built ones, and what your own browser creates (which you can see via useragentstring.com).
It must be formatted correctly and work as a header; it is not as easy as it sometimes seems, but not that hard either. There is a short sketch for building your own after the links below.
You can use this to make your own from your Mac or PC:
http://www.useragentstring.com/
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2747.0 Safari/537.36
how to build a user agent
- https://developer.mozilla.org/en-US/docs/Web/HTTP/Gecko_user_agent_string_reference
- https://developer.mozilla.org/en-US/docs/Setting_HTTP_request_headers
- https://msdn.microsoft.com/en-us/library/ms537503(VS.85).aspx
Lists of user agents
https://support.google.com/webmasters/answer/1061943?hl=en
https://msdn.microsoft.com/en-us/library/ms537503(v=vs.85).aspx
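As a small illustration (my own sketch; the product name, version, and contact URL are placeholders, not anything Screaming Frog requires), building a string that follows the product/version (comment) format could look like this:
```python
def build_user_agent(product: str, version: str, comment: str = "") -> str:
    """Compose a User-Agent string in the product/version (comment) format.
    Keep product and version to token characters (letters, digits, ., -, _)."""
    ua = f"{product}/{version}"
    if comment:
        ua += f" ({comment})"
    return ua

# Example: a custom crawler identity with a contact URL in the comment
print(build_user_agent("MalikaCrawler", "1.0", "+https://example.com/about-this-bot"))
# -> MalikaCrawler/1.0 (+https://example.com/about-this-bot)
```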
-
Hi Thomas,
Thanks for responding, much appreciated!
Does that mean, if I type in something like -
HTTP request user agent -
Crawler access V2
&
Robots user agent
Crawler access V2
This will work too?
-
To crawl using a different user agent, select ‘User Agent’ in the ‘Configuration’ menu, then select a search bot from the drop-down or type in your desired user agent strings.
http://i.imgur.com/qPbmxnk.png
&
Video http://cl.ly/gH7p/Screen Recording 2016-05-25 at 08.27 PM.mov
Also see:
http://www.seerinteractive.com/blog/screaming-frog-guide/
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#user-agent
https://www.screamingfrog.co.uk/seo-spider/user-guide/
https://www.screamingfrog.co.uk/seo-spider/faq/
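If you want to confirm that your custom string is actually what gets sent, one option (just a sketch of my own; the port number is arbitrary) is to run a tiny local server that prints the User-Agent of every request, then point a crawl at http://localhost:8000/:
```python
# Minimal local endpoint that echoes the User-Agent header of each request.
# Run it, then crawl http://localhost:8000/ and watch the console output.
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoUA(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "<none>")
        print(f"Received User-Agent: {ua}")
        body = f"You sent: {ua}".encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), EchoUA).serve_forever()
```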