Does anybody know of a good bulk-import HTTP response checker? The one I was using has disappeared, and I can only find checkers that take one URL at a time.
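If nothing off the shelf turns up, a few lines of Python can do a rough version of this. Below is a minimal sketch, assuming the third-party requests package is installed and that urls.txt (a placeholder filename) holds one URL per line; it issues a HEAD request per URL and prints the status code without crawling anything.

```python
# Minimal bulk status-code check sketch (not a finished tool).
# Assumes: `pip install requests` and a urls.txt file with one URL per line.
import requests

with open("urls.txt") as fh:
    urls = [line.strip() for line in fh if line.strip()]

for url in urls:
    try:
        # HEAD usually returns the status code without downloading the body
        resp = requests.head(url, allow_redirects=False, timeout=10)
        print(url, resp.status_code)
    except requests.RequestException as exc:
        print(url, "ERROR", exc)
```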
-
An example of the kind of thing I'm looking for, except that I need to check multiple URLs at once: http://web-sniffer.net/
-
Screaming Frog does not seem to export the whole chain, though, when there is more than one redirect in a chain. Did you experience the same thing, or could it be a glitch on my side?
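For what it's worth, one way to double-check a chain outside Screaming Frog is a small script. The sketch below is only an illustration (Python with the third-party requests package; the example URL is hypothetical): response.history keeps every intermediate hop, so multi-step chains show up end to end.

```python
# Sketch: print every hop in a redirect chain, not just the final response.
# Assumes: `pip install requests`; the URL below is a made-up example.
import requests

def redirect_chain(url):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history]  # each intermediate redirect
    hops.append((resp.status_code, resp.url))               # final destination
    return hops

for status, url in redirect_chain("http://example.com/old-page"):
    print(status, url)
```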
-
Jeremy - it's available in the free version: if you click Mode > List, you can upload a file of URLs. I'm crawling 75K URLs and had to manually increase the memory allocation so it would crawl faster.
Instructions on that here:
http://www.screamingfrog.co.uk/seo-spider/user-guide/general/#5
Good luck!
-
I looked at Screaming Frog, but I didn't see anything that indicated how you would provide a file of thousands of URLs on the same domain to be checked. And I don't want to crawl the website; I just want it to check one URL, record the response, and then move on to the next one. Where would one find this feature in Screaming Frog?
-
As the thread discusses, a while back we started using Screaming Frog and love it. Recommended.
-
Jeremy/Kurt - I'm in the same boat. Have 75K URLs I'm trying to check as part of a link pruning project.
There's actually another Moz Forum post (http://moz.com/community/q/mass-404-checker) with a very similar ask, and the responses are largely the same.
Below is a link that was mentioned in this post, but I've tried it a few times and it keeps getting hung up on any list over 100 URLs.
http://www.tomanthony.co.uk/tools/bulk-http-header-compare/
Below is the only other one I've found, but again it's limited to 100 or fewer URLs.
http://tools.seobook.com/server-header-checker/?page=bulk&typeProtocol=11&useragent=1
I feel like this is such a common problem for people trying to do any kind of mass backlink research project. I'm also looking for something that checks follow vs. nofollow so I can further narrow my list.
Maybe we should just create the tool? : )
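In the meantime, here is a rough sketch of what a home-made version could look like in Python. It is only an illustration, assuming the third-party requests and beautifulsoup4 packages plus hypothetical urls.txt and results.csv filenames: it fetches each backlink URL concurrently, records the status code and final URL, and flags whether any link to your domain carries rel="nofollow". Threading is the main design choice here, since checking 75K URLs one at a time is painfully slow.

```python
# Sketch of a bulk backlink checker (not a finished tool).
# Assumes: `pip install requests beautifulsoup4`, a urls.txt input file with
# one backlink URL per line, and that YOUR_DOMAIN is replaced with your site.
import csv
import concurrent.futures

import requests
from bs4 import BeautifulSoup

YOUR_DOMAIN = "example.com"  # hypothetical - the domain whose backlinks you are pruning
HEADERS = {"User-Agent": "Mozilla/5.0 (bulk status check sketch)"}

def check(url):
    try:
        resp = requests.get(url, headers=HEADERS, timeout=10)
    except requests.RequestException as exc:
        return [url, "ERROR", str(exc), ""]
    nofollow = ""
    if resp.ok and "text/html" in resp.headers.get("Content-Type", ""):
        soup = BeautifulSoup(resp.text, "html.parser")
        # Collect anchors that point at your domain and note any rel="nofollow"
        anchors = [a for a in soup.find_all("a", href=True) if YOUR_DOMAIN in a["href"]]
        if anchors:
            nofollow = any("nofollow" in (a.get("rel") or []) for a in anchors)
    return [url, resp.status_code, resp.url, nofollow]

with open("urls.txt") as fh:
    urls = [line.strip() for line in fh if line.strip()]

with open("results.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status", "final_url_or_error", "link_is_nofollow"])
    # 20 threads keeps a 75K-URL list moving without hammering any single host
    with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
        for row in pool.map(check, urls):
            writer.writerow(row)
```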
-
Not sure if we are attempting to do the same thing, but looking at the tools recommended here, I'm not sure they do what we are looking for. We would like to take a list of 100,000 URLs and have some software check the response code for each one. We don't want it to crawl a site; we just want it to check each individual URL provided and then go on to the next one in the list.
Anyone know of a tool that can do this?
Thanks,
Jeremy
-
Hi Anthony - we found Screaming Frog a few weeks ago. Definitely a solid tool.
-
Check out Screaming Frog - it's an SEO spider that crawls your site pretty quickly, giving a multitude of useful SEO data, including response headers. Two drawbacks: 1) it costs roughly $130, and 2) it has difficulty crawling sites with more than 100K pages. I believe Audette Media is coming out with an industrial-strength crawler this summer, but I don't have a lot of details yet. Hope this helps.
-
Hey,
The Crawl Test includes the HTTP status code for each page, if that's what you're after. You can then download the results as a .CSV.
DD
EDIT - Didn't realise this was a PRO feature, but you could always do the free trial.
-
So far, no luck!
-
Did the SEObook tool linked in this thread solve your problem, or are you still looking for a tool?