Does the SEOmoz crawler that generates the on-page reports have a set IP?
-
I would like to test my site, but it's not launched yet and I don't want anybody to see it. However, I can allow myself and others to view the site if I have their IP addresses.
So does the SEOmoz crawler have a static IP or a range?
James
-
Thanks Istvan.
-
Hi James,
For this question I would contact the Help Desk team.
You can go to https://seomoz.zendesk.com/home and submit a ticket, or contact them directly via help@seomoz.org.
I am sure they will come up with an answer or advise you on how to resolve your issue.
Good luck,
Istvan
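In the meantime, if your server runs Apache, a common way to keep an unlaunched site private is an IP allowlist in `.htaccess`. This is only a sketch: the addresses below are placeholder documentation IPs, not the real crawler range — you would substitute your own address and whatever range the Help Desk confirms for the crawler.

```apache
# Deny everyone, then allow specific addresses (Apache 2.2 syntax).
Order deny,allow
Deny from all
# Your own IP address (placeholder - replace with your real one):
Allow from 203.0.113.5
# The crawler's IP or range, once the Help Desk confirms it (placeholder):
Allow from 198.51.100.0/24
```

Anyone outside the allowlist gets a 403, so the site stays invisible until launch.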
Related Questions
-
Duplicate home page URL on crawl test
Hi, I just recently ran a crawl test, but before doing that I made sure I had no more duplicates on my site. I am using Joomla, and as of now I only have 11 links on my site, but when my crawl test finished I saw a duplicate URL of my homepage. The duplicate URL has a trailing slash, so basically I have all 11 links + 1 duplicate URL: http://mangthomas.com and http://mangthomas.com/ Can you give advice on how I can remove the duplicate? I don't even know which one to retain. Thanks a lot, Cris
On-Page Optimization | crisbasma
-
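One common way to collapse the two reported homepage URLs is a rel=canonical tag in the homepage template, so crawlers treat both forms as one page. This is a sketch with the trailing-slash form chosen arbitrarily — either form works as long as you pick one and stay consistent:

```html
<!-- In the <head> of the homepage template: declare one form as canonical -->
<link rel="canonical" href="http://mangthomas.com/" />
```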
How to schedule the on-page reports myself
The on-page reports are scheduled on Mondays, but is there a way to schedule them myself?
On-Page Optimization | JoostBruining
-
Handling a Huge Amount of Crawl Errors
Hi all, I am faced with a crawl-errors issue on a huge site (>1 million pages) for which I am doing an on-page audit.
404 errors: >80,000
Soft 404 errors: 300
500 errors: 1,600
All of the above are reported in GWT. Many of the error links are simply not present on the pages they are "linked from". I investigated a sample of pages (and their source) looking for the error links' footprints, and yet found nothing. What would be the right way to address this issue from an SEO perspective? Clearly, I am not able to investigate the causes, since I see only the generated HTML and NOT what's behind it. So my question is: generally, what is the appropriate way of handling this? Telling the client that he has to investigate it (I gave my best to at least report the errors)? Or engaging my firm further and getting a developer from my side to investigate? Thanks in advance!
On-Page Optimization | spiderz
-
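One way to check at scale whether the "linked from" pages really contain the broken links is a small script that parses each page's HTML and looks for the reported URL among its anchors. Here is a minimal sketch using only Python's standard library — the function names are my own, not part of any Moz or Google tool; you would feed it the HTML fetched from each "linked from" page and the 404 URL reported in GWT:

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects href values from <a> tags while parsing HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def page_links_to(html, target_url):
    """Return True if the page's HTML actually links to target_url."""
    collector = LinkCollector()
    collector.feed(html)
    return target_url in collector.links
```

Running this over a sample of reported pages would tell you quickly whether the links exist in the served HTML at all; if they consistently don't, the errors likely come from an older crawl or from URLs generated elsewhere, which is concrete evidence to hand to the client's developer.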
404 crawl errors with the domain appended to all URLs
We have 187 crawl 404 errors. All URLs on the site produce a 404 error of the form http://www.domain.com/[.....]/www.domain.com — the domain gets appended to the end of each URL. For example:
gestoriabarcelona.com/www.gestoriabarcelona.com
gestoriabarcelona.com/tarifas/www.gestoriabarcelona.com
gestoriabarcelona.com/category/noticias/page/7/www.gestoriabarcelona.com
gestoriabarcelona.com/2012/08/amortizacion-de-unaconstruccion/www.gestoriabarcelona.com
[..] I don't know where to look to solve these errors. Can anyone help me? Thanks
On-Page Optimization | promonet
-
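The pattern in those URLs — the bare domain appended after each page path — is the classic symptom of a link written without a scheme: an href of `www.domain.com` is treated as a relative path, not an absolute URL, so browsers and crawlers resolve it against the current page. A quick way to see this, using only Python's standard library:

```python
from urllib.parse import urljoin

page = "http://www.gestoriabarcelona.com/tarifas/"

# A href missing the "http://" scheme resolves as a relative path,
# producing exactly the broken URLs reported in the crawl:
print(urljoin(page, "www.gestoriabarcelona.com"))
# -> http://www.gestoriabarcelona.com/tarifas/www.gestoriabarcelona.com

# With the scheme, the same link resolves as a proper absolute URL:
print(urljoin(page, "http://www.gestoriabarcelona.com"))
# -> http://www.gestoriabarcelona.com
```

So the place to look is any site-wide element (menu, footer, logo link) where the URL was entered without `http://` — fixing that one link should clear all 187 errors, though confirming which template carries it is site-specific.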
Error is not going away and crawling
I have fixed an error, but it's still showing in red as an error. I'm totally new to SEOmoz and to SEO in general, so I'm not sure how this tool works. Did I fix it correctly or not, if it's still showing? It was a broken link, and now it links to another page. Do I just have to wait? My website only has 8 pages, and the dashboard says it crawled 8 pages, but it takes up to a week for a full crawl? I'm really confused. Thank you in advance!
On-Page Optimization | Pixeltistic
-
HTML and CSS errors - what do SE spiders do if they come across coding errors? Do they stop crawling the rest of the code below the error?
I have a client who uses a template to build their websites (no problem with that). When I ran the site through the W3C validator, it threw up a number of errors, most of which were minor, e.g. missing close tags, and I suggested they fix them before I start their off-site SEO campaigns. When I spoke to their web designer about the issues, I was told that some of the errors were "just how it's done". So if that's the case, but the validator still registers the errors, do SE spiders ignore them and move on, or do they penalize the site in some way?
On-Page Optimization | pab1
-
How To Prevent Crawling Shopping Carts, Wishlists, Login Pages
What's the best way to prevent engines from crawling your website's shopping cart, wishlist, and login pages, etc.? Obviously have it in robots.txt, but is there any other action that should be taken?
On-Page Optimization | Romancing
-
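For the robots.txt part, the usual shape is a prefix rule per section — the paths below are placeholders, since every cart platform uses its own URL structure:

```text
User-agent: *
Disallow: /cart/
Disallow: /wishlist/
Disallow: /login/
```

Beyond robots.txt, note that it only blocks crawling, not indexing: a blocked URL that is linked from elsewhere can still appear in results as a bare URL. Adding a `<meta name="robots" content="noindex">` tag to those page templates covers that case as well.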
Crawl Diagnostics - Duplicate Content and Duplicate Page Title Errors
I am getting a lot of duplicate content and duplicate page title errors from my crawl analysis. I am using Volusion, and it looks like the photo gallery is causing the duplicate content errors. Both are sitting at 231, which shows I have done something wrong... Example URLs: Duplicate page content: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Duplicate page title: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601 Would anyone know how to properly disallow this? Would this be as simple as a robots.txt entry, or something a little more involved within Volusion? Any help is appreciated. Cheers, Geoff B. (a.k.a. newbie)
On-Page Optimization | GeoffBatterham
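If the gallery pages carry no search value, a robots.txt entry is usually the simplest fix. Because robots.txt rules are prefix matches, one line covers every ProductCode variant of the page — though whether Volusion lets you edit robots.txt directly is something to confirm in its admin panel:

```text
User-agent: *
Disallow: /PhotoGallery.asp
```

This blocks crawling of `/PhotoGallery.asp` with any query string, which should stop the duplicate content and duplicate title pairs from being reported on future crawls.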