Search Engine blocked by robots.txt
-
I am getting this error when I try to crawl http://photosales.belfasttelegraph.co.uk/, but my robots.txt file does not block any bots.
-
Hi Judith,
If you're still having problems, send an email to help@seomoz.org and they'll be able to help you figure out why Roger doesn't want to crawl your site.
-
I am using the Bing SEO Toolkit; it worked fine for me.
Try removing robots.txt for a test.
-
The SEOmoz crawler.
-
What are you using to crawl?
-
I'm not sure why it let you crawl and not me?
-
I just crawled it OK.
You don't need to put the user agent in twice, though; the version below will do.
It was blocking a lot of pages, but it crawled the ones you allowed:
User-agent: *
Disallow: */wo/
Disallow: */ajax/
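As a quick sanity check on rules like these, Python's built-in urllib.robotparser can report whether a given URL would be blocked. This is only a sketch: the example URLs are made up, and Python's parser does not understand the leading "*" wildcard used in the thread's rules, so simplified prefix rules are used here instead.

```python
from urllib.robotparser import RobotFileParser

# Simplified rules: Python's RobotFileParser treats Disallow values as
# plain path prefixes, so the leading "*/" from the thread is dropped.
rules = """\
User-agent: *
Disallow: /wo/
Disallow: /ajax/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Hypothetical URLs for illustration only.
print(rp.can_fetch("*", "http://photosales.belfasttelegraph.co.uk/wo/page1"))
print(rp.can_fetch("*", "http://photosales.belfasttelegraph.co.uk/gallery"))
```

A crawler that honors these rules should skip the first URL and fetch the second; if a crawler reports the whole site as blocked, comparing its verdict against a check like this can show whether the problem is the file or the crawler.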