Unsolved Moz can't crawl my site
-
Moz is being blocked from crawling the following site: https://www.cleanchain.com. Looking at robots.txt, the following rules are disallowing access, but I don't know whether they are preventing Moz from crawling too:
User-agent: *
Disallow: /adeci/
Disallow: /core/
Disallow: /connectors/
Disallow: /assets/components/
Could something else be preventing the crawl?
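To check what those directives actually block for a given crawler, Python's built-in robots.txt parser can be used (a minimal sketch; the site URL and paths are taken from the question, and rogerbot is the user agent Moz's crawler uses, as noted in the answers below):

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.cleanchain.com"

# Fetch and parse the live robots.txt
parser = RobotFileParser(f"{SITE}/robots.txt")
parser.read()

# Check a few paths for rogerbot (Moz) and for generic crawlers
for agent in ("rogerbot", "*"):
    for path in ("/", "/adeci/", "/assets/components/"):
        allowed = parser.can_fetch(agent, f"{SITE}{path}")
        print(f"{agent:8} {path:22} allowed={allowed}")
```

If only the four listed paths come back as disallowed and "/" is allowed, robots.txt is not what is stopping the crawl.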
-
@danhart2020 It doesn't look like your robots.txt should be blocking it. It looks like the site is giving rogerbot (the user agent Moz uses) a 403 error, so the block might be at the server level. Is there anything in your .htaccess file or equivalent, depending on your server setup?
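One way to check that yourself is to request the page with a rogerbot-style user agent and see whether the server returns a 403 (a minimal sketch; the short "rogerbot" token here is illustrative, and the full user agent string Moz actually sends may differ):

```python
import urllib.request
from urllib.error import HTTPError

# Request the homepage identifying as Moz's crawler. If this prints 403
# while the site loads fine in a browser, the block is at the server or
# firewall level (e.g. .htaccess rules or a CDN), not in robots.txt.
url = "https://www.cleanchain.com/"
req = urllib.request.Request(url, headers={"User-Agent": "rogerbot"})

try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("HTTP", resp.status)
except HTTPError as err:
    print("HTTP", err.code)
```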
You can test your robots.txt file here and select the user agent to use:
https://technicalseo.com/
-
@danhart2020 Hello, you can check your .htaccess file too; sometimes the .htaccess file is used to block crawling.
Related Questions
-
Unsolved Rogerbot blocked by Cloudflare and not displaying full user agent string.
Hi, we're trying to get Moz to crawl our site, but when we use Create Your Campaign we get the error: "Ooops. Our crawlers are unable to access that URL - please check to make sure it is correct. If the issue persists, check out this article for further help." robots.txt is fine, and we can actually see Cloudflare is blocking it with Bot Fight Mode. We've added some rules to allow rogerbot, but these seem to be getting ignored. If we use a robots.txt test tool (https://technicalseo.com/tools/robots-txt/) with rogerbot as the user agent, it gets through fine and we can see our rule has allowed it. When viewing the Cloudflare activity log (attached), it seems Create Your Campaign is trying to crawl the site with the user agent simply set as rogerbot 1.2, but the robots.txt testing tool uses the full user agent string rogerbot/1.0 (http://moz.com/help/pro/what-is-rogerbot-, rogerbot-crawler+shiny@moz.com), albeit version 1.0. So it seems as if Cloudflare doesn't like the simple user agent. Is it correct that when Moz tries to crawl the site it now uses the simple string of just rogerbot 1.2? Thanks,
Ben
[Attachment: Cloudflare activity log, showing differences in user agent strings]
Moz Pro | | BB_NPG
-
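To see whether a firewall treats those two strings differently, one option is to request the site with each user agent from the question above and compare the response codes (a minimal sketch; https://www.example.com/ is a placeholder for your own domain, and both strings are copied from the question rather than from Moz's documentation):

```python
import urllib.request
from urllib.error import HTTPError

URL = "https://www.example.com/"  # placeholder: your own site behind Cloudflare

# The short string the crawl appeared to use vs. the full string the
# robots.txt testing tool used, both as reported in the question above.
USER_AGENTS = {
    "short": "rogerbot 1.2",
    "full": ("rogerbot/1.0 (http://moz.com/help/pro/what-is-rogerbot-, "
             "rogerbot-crawler+shiny@moz.com)"),
}

for label, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{label}: HTTP {resp.status}")
    except HTTPError as err:
        print(f"{label}: HTTP {err.code}")
```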
Unsolved How do I cancel this crawl?
The latest crawl on my site was on the 4th of Jan, with the current crawl 'in progress'. How do I cancel this crawl and start a new one? I've been getting keyword rankings etc. but no new issues are coming through. [Attachment: Screenshot 2022-05-31 083642.jpg]
Moz Tools | | ClaireU
-
Unsolved Halkdiki Properties
Hello,
I have a question about the Site Crawl: Content Issues segment. I have an e-shop and Moz is showing me a problem because my URLs are too similar and my H1s are the same:
<title>Halkdiki Properties
https://halkidikiproperties.com/en/properties?property_category=1com&property_subcategory=&price_min_range=&price_max_range=&municipality=&area=&sea_distance=&bedroom=&hotel_bedroom=&bathroom=&place=properties&pool=&sq_min=&sq_max=&year_default=&fetures=&sort=1&length=12&ids=
<title>Halkdiki Properties
https://halkidikiproperties.com/en/properties?property_category=2&property_subcategory=&price_min_range=0&price_max_range=0&municipality=&area=&sea_distance=&bedroom=0&hotel_bedroom=0&bathroom=0&place=properties&pool=0&sq_min=&sq_max=&year_default=&fetures=&sort=1&length=12&ids=
Can someone help? Is this a big problem, or should I ignore it? Thank you
Moz Pro | | TheoVavdinoudis
-
Spammy inbound links: Don't Fix It If It's Not Broken?
Hi Moz community, Our website is nearing the end of a big redesign to be mobile-responsive. We decided to delay any major changes to text content so that if we do suffer a rankings drop upon launch, we'll have some ability to isolate the cause. In the meantime I'm analyzing our current SEO strengths and weaknesses.
There is a huge discrepancy between our rankings and our inbound link profile. Specifically, we do great on most of our targeted keywords and in fact had a decent surge in recent months. But Link Profiler turned up hundreds of pages of inbound links from spammy domains, many of which don't even display a webpage when I click there (shown in the uploaded image). "Don't fix it if it's not broken" is conflicting with my natural repulsion to these sorts of referrals.
Assuming we don't suffer a rankings drop from the redesign, how much of a priority should this be? There are too many, and most are too spammy to contact the webmasters, so we'll need to do it through a Disavow. I couldn't even open the one at the top of the list because our business web proxy identified it as adult content. It seems like a common conception is that if Google hasn't penalized us for it yet, they will eventually. Are we talking about the algorithm just stumbling upon these links and hurting us, or would this be something we would find in Manual Actions? (Or both?)
How long after the launch should we wait before attacking these bad links? Is there a certain spam score that you'd say is a threshold for "Yes, definitely get rid of it"? And when we do, should we disavow one domain at a time to monitor any potential drops, or all at once? (This seems kind of obvious, but if the spam score and domain authority alone are enough of a signal that it won't hurt us, we'd rather get it done asap.)
How important is this compared to creating fresh new content on all the product pages? Each one will have new images as well as product reviews, but the product descriptions will be the same ones we've had up for years. I have new content written but it's delayed pending any fallout from the redesign. Thanks for any help with this!
[Attachment: d1SB2JP.jpg]
Moz Pro | | jcorbo
-
Moz crawling doesn't show all of my Backlinks
Hello, I'm trying to make an SEO backlinks report for my website. When using Link Explorer, I see only a few backlinks, while I have many more backlinks to this website. Does anyone have an idea how to fix this issue? How can I check and correct this? My website is www.signsny.com.
Moz Pro | | signsny
-
Can anyone offer an example of a site or page that gets 100% or even close to that on Search Visibility?
I have a couple of sites that I manage that kill it in the SERPs, and yet they get a low search visibility score from Moz Pro - I am talking 18%-19% - and another that ranks well has a search visibility score of 8.71%. I know there are factors that go into calculating the score; I am just curious if anyone is really up there.
Moz Pro | | -b.graves-
-
How long is a full crawl?
It's now been over 3 days that the dashboard for one of our campaigns has shown "Next Crawl in Progress!". I am not complaining about the length... but I have to agree that SEOMoz is quite addictive, and it's quite frustrating to see that every day 🙂 Thanks
Moz Pro | | jgenesto
-
If my keywords aren't driving any traffic to my site, why am I still ranking for them?
In several of our campaigns we have watched our keywords steadily climb the rankings without ever registering so much as a blip in the traffic data column. If these keywords aren't driving any traffic to our site, how are we still ranking for them?
Moz Pro | | MackenzieFogelson