We are using Hotlink Protection on our server, mostly for JPGs. What is the moz.com address to allow crawl access?
-
We are using Hotlink Protection on our server, mostly for JPGs. What is the moz.com crawl URL, so we can add it to our list of allowed domains? The reason I ask is that our crawl statistics show a ton of 403 Forbidden errors.
Thanks.
-
Hi there!
Thanks for reaching out to us! I can certainly understand your need to have our crawler accepted by your hotlink protection setup. Unfortunately, our crawler doesn't operate from a single URL when it crawls your site; we use a collection of IP addresses behind the scenes to mimic a search engine crawler and provide the most accurate diagnostics we can. That said, a lot of our customers have had success allowing our crawler at the server level, through their .htaccess file or other methods (one possible approach is sketched after this reply). Unfortunately, I'm not a web developer or a server admin, so I can't tell you exactly how to implement it. I'd recommend posting a new question here asking about a workaround for your particular software.
Thanks for your time, I hope that helps.
Peter
Moz Help Team.
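
For anyone looking for a concrete starting point, the server-level exception mentioned above is often done with Apache's mod_rewrite in an .htaccess file. The sketch below is only an illustration: it assumes the hotlink protection is referer-based, that the crawler identifies itself with the "rogerbot" user agent (Moz's crawler), and it uses example.com as a placeholder domain. Since the reply notes the crawler works from a pool of IP addresses, confirm the exact user-agent string or IP ranges with Moz before relying on this.

    # Sketch only: referer-based hotlink protection for JPGs, with an
    # exception for a crawler's user agent ("rogerbot" is assumed here)
    RewriteEngine On

    # Let the crawler fetch images regardless of referer
    RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
    RewriteRule \.(jpe?g)$ - [L]

    # Block hotlinking: forbid JPG requests whose referer is set
    # and is not this site (example.com is a placeholder)
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(jpe?g)$ - [F]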
Related Questions
-
Should we use Cloudflare?
Hi all, we want to speed up our website (hosted on WordPress, around 450,000 page views monthly), and we use lots of images. We're wondering about setting it up on Cloudflare; however, after searching a bit on Google I have seen some people say that the change in IP, or possibly sharing IPs with bad neighbourhoods, can really hit search rankings. So I was wondering what the latest thinking is on this subject: would the increased speed and local server locations be more of a boost for SEO than any potential loss of rankings from changing IP? Thanks!
Technical SEO | | tiromedia1 -
What do you use to come up with content ideas?
BuzzSumo charges, and I'm not sure it's worth it. I didn't find Quora helpful. Any others?
Technical SEO | | SwanseaMedicine2 -
Anyone have experience with GrowTeam or the platform they use?
We are being pitched by GrowTeam to improve our keyword rankings. They tell us they work with an SEO technology company that does A/B testing of title tags on an engine that mimics Google's algorithms. Is this possible? I am not familiar with any platform where someone could do A/B testing on meta titles.
Technical SEO | | MikeAA0 -
Using a single sitemap for multiple domains
We have a possible duplicate content issue based on the fact that we run a number of websites from the same code base across .com / .co.uk / .nl / .fr / .de and so on. We want to update our sitemaps, alongside using hreflang tags, to ensure Google knows we've got different versions of essentially the same page to serve different markets. Google has written an article on tackling this (https://support.google.com/webmasters/answer/75712?hl=en), but my question remains whether having a single sitemap accessible from all the international domains is the best approach here, or whether we should have individual sitemaps for each domain. (A rough sketch of the single-sitemap approach follows below.)
Technical SEO | | jon_marine0 -
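
For reference, Google's hreflang-in-sitemaps format lets a single sitemap declare the language and regional alternates for each URL. A minimal sketch, with placeholder URLs standing in for the real .com / .co.uk / .nl domains, might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.com/widgets</loc>
        <!-- every localized version, including the page itself, is listed -->
        <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets"/>
        <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets"/>
        <xhtml:link rel="alternate" hreflang="nl" href="https://www.example.nl/widgets"/>
      </url>
      <!-- repeat a <url> entry with the same set of alternates for each localized URL -->
    </urlset>

Note that hreflang annotations are expected to be reciprocal, so each localized URL needs its own entry pointing back at the others; and if one sitemap is meant to cover URLs on several domains, it is worth confirming in Google's documentation that all of those domains are verified under the same Search Console account.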
To avoid errors in our Moz crawl, we removed subdomains from our host. (First we tried 301 redirects, which were also listed as errors.) Now we have backlinks all over the web that are broken. How bad is this, from a PageRank standpoint?
Our Moz crawl kept telling us we had duplicate page content even though our subdomains were redirected to our main site. (Pages from wineracks.vigilantinc.com were 301 redirected to vigilantinc.com/wineracks.) Now, to solve that problem, we have removed the wineracks.vigilantinc.com subdomain. The error report is better, but now we have broken backlinks, thousands of them. Is this hurting us worse than the duplicate content problem did? (A sketch of the kind of 301 mapping described here follows below.)
Technical SEO | | KristyFord0 -
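
For context, the subdomain-to-subfolder mapping described in the question (wineracks.vigilantinc.com/... to vigilantinc.com/wineracks/...) is usually implemented with a couple of mod_rewrite lines like the sketch below. This is an illustration of that kind of rule, not the configuration the poster actually had, and the scheme (http vs. https) is assumed:

    # Sketch: 301-redirect every URL on the subdomain to the matching
    # path under /wineracks/ on the main domain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^wineracks\.vigilantinc\.com$ [NC]
    RewriteRule ^(.*)$ http://vigilantinc.com/wineracks/$1 [R=301,L]

The effect of a rule like this is that links pointing at the old subdomain URLs resolve to the new locations rather than breaking.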
Subdomain hosted on a different server vs. subfolder on the main server
We have a website developed in ColdFusion, on a server that does not support PHP. We have a blog for the site using WordPress (PHP), hosted on a different server, with a subdomain as the URL (for example, blog.website.com). I've heard that search engines treat subdomains as completely different websites from the main domain, so they could actually compete with it for rankings in the search engines; is that correct? I'm also under the impression that traffic to the blog will not show as traffic to the main website, because it is hosted on a different server; is that right? If I'm correct, I assume the best solution would be to install PHP on our main server and put the blog in a subfolder, or would the subdomain be OK as long as the blog is hosted on the main server? Thanks!
Technical SEO | | vermont0 -
Server Connectivity
Hey there! When we go to our Webmaster Tools there is an orange triangle. The issue is that Google's robot cannot access our site. Does anyone know why this could be? Thanks!
Technical SEO | | Comunicare0 -
How to handle 'Not found' crawl errors?
I'm using Google Webmaster Tools and can see 'Not found' crawl errors. I have set up a custom 404 page (http://www.vistastores.com/404) for all broken links. But I have a question about it: do I also need to set up 301 redirects for the broken links found in Google Webmaster Tools? (A rough sketch of the usual pattern follows below.)
Technical SEO | | CommercePundit0
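
For reference, the setup the question describes usually combines a catch-all custom 404 with targeted 301 redirects only for broken URLs that have an obvious replacement page; URLs with no good equivalent are generally left to return the 404. A minimal .htaccess sketch follows; the redirect path is a made-up example, not a URL from the poster's report:

    # Serve the custom 404 page for requests with no better destination
    ErrorDocument 404 /404

    # 301-redirect individual broken URLs that map to an existing page
    # (hypothetical path, for illustration only)
    Redirect 301 /old-product-page http://www.vistastores.com/new-product-page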