"Does not respond to web requests" error
-
When trying to set up a new campaign I get the following message:
"Roger has detected a problem: We have detected that the domain www.chicagofinancialadvisers.com does not respond to web requests. Using this domain, we will be unable to crawl your site or present accurate SERP information." Can someone please tell me what I need to do on my site to make this work? I haven't seen this before and have done many other campaigns. Thanks a lot!
-
Thanks Ryan. That worked!
-
Hello Brien,
I noticed your robots.txt file currently contains only `User-agent: *`. That is not a properly formatted robots.txt file: a `User-agent` group needs at least one rule line after it. Try adjusting it to:
`User-agent: *`
`Disallow:`
This change MAY resolve the issue. (The empty `Disallow:` line means "block nothing", so all crawlers stay allowed.) If not, I would suggest checking your server firewall settings to see whether any ports or crawler user agents are being blocked.
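One quick way to sanity-check the fixed file: Python's standard-library robot parser can confirm that the two-line version allows crawlers. This sketch parses the rules directly rather than fetching them from the live site, so no network access is involved:

```python
# Parse the recommended two-line robots.txt and check crawler access.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow line blocks nothing, so any crawler may fetch any path.
print(rp.can_fetch("rogerbot", "https://www.chicagofinancialadvisers.com/"))
```

If this prints `False`, the robots.txt rules themselves are the problem; if it prints `True` but the Moz crawler still fails, the block is at the server or firewall level instead.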
Related Questions
-
Crawl Diagnostics - 350 Critical errors? But I used rel-canonical links
Hello Mozzers, We launched a new website on Monday and had our first Moz crawl on 01/07/15, which came back with 350+ critical errors. The majority of these were for duplicate content. We had a situation like this for each gym class: GLOBAL YOGA CLASS (canonical link / master record), YOGA CLASS BROMLEY, YOGA CLASS OXFORD, YOGA CLASS GLASGOW, etc. All of these local yoga pages had the canonical link deployed. So why is this regarded as an error by Moz? Should I have added robots NOINDEX instead? Would that help? Very scared our rankings are going to get affected 😞 Ben
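For reference, the setup described above usually looks like this in the `<head>` of each local class page (domain and path here are hypothetical stand-ins): every Bromley/Oxford/Glasgow variant points its canonical at the single global page.

```html
<!-- Sketch only: each local yoga page carries the same canonical,
     pointing at the global master record. -->
<head>
  <link rel="canonical" href="https://www.example.com/classes/yoga/" />
</head>
```

Note that a canonical is a hint rather than a directive, which is one reason a crawler may still report the variants as duplicates.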
Moz Pro | Bendall
-
Web Site Migration Testing and SEO-QA Automation?
Hey Mozzers, Are there any good migration SEO-QA tools out there? Given a prioritized list of URLs and a prioritized list of keywords, is there a tool that can compare basic SEO factors, old URL vs. new URL, and identify all the specific gaps that need to be fixed? Here is a basic SEO-QA acceptance checklist for porting any website. Until the porting work is completed we cannot accept the new website.

Givens:
1. A list of the top 100 URLs from the old site, prioritized by conversion rates, landing-page traffic, and inbound links.
2. A list of the planned 404-mapped URLs, old site to new site, from the porting team.
3. A list of the current top 200 keywords, prioritized.
4. A good amount of SEO work has already been done for the current (old) site, by several professionals.

How do we evaluate whether the new site will be acceptable to Google? Check on-page SEO factors: the new site must be as good as (or better than) the current (old) site in the eyes of Google, to preserve the on-page SEO work already done.

Criteria:
- URLs ok? :: Is the URL mapping ok, old to new, best web page?
- LINKS ok :: Are all internal links and keyword anchor text ported?
- TEXT ok :: On-page content, text and keywords ok?
- TITLE ok :: HTML title and title keywords ok?
- DESCRIPTION ok :: HTML meta description ok?
- H1, H2 ok :: HTML H1, H2 and keywords ok?
- IMG kwds :: HTML IMG and ALT keywords ok?
- URL kwds :: Keywords in new URLs ok?

Potential porting defects:
- Keywords in URL missing
- Keywords in HTML title missing
- Keywords in meta description missing
- Any internal links or link anchor text missing
- Keywords in page text missing
- H1, H2 missing keywords
- HTML IMG alt text, IMG file URLs, any missing keywords

Notes: Until the porting work is completed we cannot accept the new site, or set a target date for potential cutover. There are eight (8) data items per URL and about one hundred (100) URLs to be considered for SEO-QA before going live. We were expecting to cut over before the end of February, at the latest. There is no point in doing full QA acceptance tests until the porting work is completed. QA spot-checks have found far too many defects. About 60% of the landing-page traffic comes via the top 40 URLs. With over 100 URLs to look at, it can take more than a week or two just to do SEO-QA in detail, manually, item by item, page by page, side by side, old vs. new. Spot-checks indicate a business disaster would occur unless the porting defects are fixed before going live.

Any migration-QA tools? Given a prioritized list of URLs and a prioritized list of keywords, is there a tool that can compare basic on-page SEO factors, old URL vs. new URL, and identify most of the specific gaps that need to be fixed before going live with the new site?

Edit: Any comments on the SEO criteria, tools, or methods will be appreciated!
Moz Pro | George.Fanucci
-
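The side-by-side comparison that question asks for can be roughed out with the Python standard library alone. This sketch parses two HTML documents and reports which basic on-page factors (title, meta description, H1) differ; the hard-coded page strings are hypothetical stand-ins for a fetched old/new URL pair:

```python
# Minimal old-vs-new on-page SEO diff using only the standard library.
from html.parser import HTMLParser

class SEOFactors(HTMLParser):
    """Collects a few basic on-page factors: <title>, meta description, <h1>."""
    def __init__(self):
        super().__init__()
        self.factors = {"title": "", "description": "", "h1": ""}
        self._current = None  # tag whose text we are currently capturing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.factors["description"] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.factors[self._current] += data.strip()

def extract(html):
    parser = SEOFactors()
    parser.feed(html)
    return parser.factors

def diff_factors(old_html, new_html):
    """Return only the factors that changed (or vanished) between old and new."""
    old, new = extract(old_html), extract(new_html)
    return {k: (old[k], new[k]) for k in old if old[k] != new[k]}

# Hypothetical sample pages standing in for one old/new URL pair.
old_page = ('<html><head><title>Yoga Classes | Example Gym</title>'
            '<meta name="description" content="Yoga classes near you."></head>'
            '<body><h1>Yoga Classes</h1></body></html>')
new_page = '<html><head><title>Classes</title></head><body><h1>Yoga Classes</h1></body></html>'

print(diff_factors(old_page, new_page))
```

Running `diff_factors` over each mapped URL pair would flag the title and description gaps while leaving the unchanged H1 out of the report; extending it to links, IMG alt text, and body keywords follows the same pattern.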
I did a redirect and now I'm getting duplication errors.
I was told by SEO Moz to do a redirect so that our website would be crawled with and without the www in front of the address. I did and now I'm getting duplicate page and title errors because the crawler is seeing www.oursitename.com and its underpages and oursitename.com and its underpages and giving me duplicate page content errors and duplicate page title errors. Makes sense, but how do I make it stop? Anyone else have this problem?
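The usual fix for the situation described above is to pick one canonical hostname and 301-redirect the other to it, so only one version gets crawled. A hedged sketch, assuming an Apache server with mod_rewrite and that the www version is the preferred one:

```apache
# Hypothetical .htaccess sketch: 301-redirect the bare domain to www
# so crawlers only ever see one hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^oursitename\.com$ [NC]
RewriteRule ^(.*)$ https://www.oursitename.com/$1 [R=301,L]
```

With a redirect like this in place (rather than both hostnames serving the same pages), the duplicate title and content warnings should clear on the next crawl.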
Moz Pro | THMCC
-
About how long does it take Google WMT to refresh stats on "Links to Your Site"?
About how long does it take Google WMT to refresh stats on "Links to Your Site"? We're dealing with an unnatural link/anchor phrase issue and I'm curious as to the "typical" time it takes for Google to recognize the link removal or anchor text change. Any refresh-time ideas on OpenSiteExplorer or AHREFS as well would be a plus... Thanks! Dan. I'm using this guide (very helpful, thanks SEOmoz!): http://www.seomoz.org/blog/identifying-link-penalties-in-2012
Moz Pro | MTteam
-
High level of 404 client errors
My client's website is an e-commerce site, where customers can go on and buy products from the website. I placed the website into SEOmoz and it came back with something like 18,000 errors, mostly 404 client errors. When I checked to see where the URLs came from, each was a summary of an order for a client who had just purchased something from the website; this was the case for a lot of the errors. So I am wondering: will this harm the site's optimisation or any other part of it? And how can I get rid of these errors? Many Thanks Charlene
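One common approach to the situation above is to stop crawlers from requesting the order-summary URLs at all, since those pages are per-customer and should never be indexed. A sketch of the robots.txt addition, where `/order-summary/` is a hypothetical stand-in for wherever the order pages actually live:

```
# Hypothetical robots.txt rule: keep crawlers out of per-customer
# order pages so they stop generating 404 crawl errors.
User-agent: *
Disallow: /order-summary/
```

This hides the errors from crawl reports rather than fixing the underlying expired URLs, which may be fine for pages that are intentionally temporary.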
Moz Pro | Louise99
-
Crawler reporting incorrect URLs, resulting in false errors...
The SEOmoz crawler is showing 236 Duplicate Page Titles. When I go in to see what page titles are duplicated, I see that the URLs in question are incorrect and read "/about/about/..." instead of just "/about/". The page duplicates shown are the result of the crawler ending up on the "Page not found" page. Could it be the result of using relative links on the site? Anything I can do to remedy this? Thanks for your help! -Frank
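Relative links can indeed produce exactly this pattern: a path-relative link resolves against the page the crawler is currently on, so each hop appends another path segment. A small sketch with Python's standard URL resolver (the domain is a hypothetical stand-in):

```python
# How a path-relative link compounds into /about/about/, versus a
# root-relative link that always resolves from the site root.
from urllib.parse import urljoin

base = "https://www.example.com/about/"  # page the crawler is currently on
print(urljoin(base, "about/"))   # path-relative: resolves under the current path
print(urljoin(base, "/about/"))  # root-relative: resolves from the site root
```

Switching the site's internal links to root-relative (leading `/`) or absolute URLs would stop the compounding, and the duplicate titles from the resulting 404 page should clear on the next crawl.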
Moz Pro | Clements
-
Urgent Feature Request
Please review my request here: http://www.screencast.com/t/4IBtbBFC and let me know. I think it would make the program even more robust and beneficial to many members. Please advise. Thanks Sean
Moz Pro | montage
-
"no urls with duplicate content to report"
Hi there, I am trying to clean up some duplicate content issues on a website. The crawl diagnostics say that one of the pages has 8 other URLs with the same content. When I click on the number "8" to see the pages with duplicate content, I get to a page that says "no urls with duplicate content to report". Why is this happening? How do I fix it?
Moz Pro | fourthdimensioninc