Setting up MOZ to run on Staging
-
Hi Moz,
We would like to set up Moz to run on our staging server. This would be extremely valuable, as it would alert us to new SEO issues and risks in a controlled, secure environment that is not exposed to production.
Our internal team has recommended setting up a reverse proxy that validates requests, either by Moz's/rogerbot's HTTP User-Agent header or by IP address, and allows Moz to crawl our staging environment.
Is this something we can set up with Moz? Are there other ways to enable Moz to crawl our staging server?
-
I know this is old - but I was also wondering this same thing.
Our specific use case is that we would love to be able to run an on-page grader via Moz on our staging environment before going live, to make sure we are not degrading on-page SEO for any specific keywords when rolling out new designs.
-
No worries!
You can add directives to your site's robots.txt file like this:

User-agent: rogerbot
Disallow:

User-agent: *
Disallow: /

This will block all bots from accessing your site aside from rogerbot.
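If you want to sanity-check those rules before deploying them, Python's standard-library robots.txt parser can simulate how different crawlers would read them. This is just a local check (not a Moz feature), and the staging URL below is a made-up example:

```python
from urllib import robotparser

# The staging robots.txt rules: allow rogerbot, block everyone else.
rules = """\
User-agent: rogerbot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# rogerbot matches the first group (empty Disallow = allow everything);
# any other bot falls through to the catch-all group and is blocked.
print(rp.can_fetch("rogerbot", "https://staging.example.com/page"))   # True
print(rp.can_fetch("Googlebot", "https://staging.example.com/page"))  # False
```

Keep in mind that robots.txt is purely advisory: well-behaved crawlers honor it, but nothing stops a rogue bot from ignoring it.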
There may be other methods as well, which I am personally not aware of. Perhaps other users on the forum may be able to provide tips on what you can potentially try out.
Let me know if you have any other questions!
Eli
-
Hi Eli,
Thanks for the fast response.
Creating a public-facing staging instance does create a certain degree of risk, as it could potentially be discovered and indexed.
Are other clients currently using this method? If so, what best practices do you recommend we follow, if we were to go this route, to ensure that our staging site is crawled ONLY by rogerbot and not by any other bot or service?
-
Hey!
Thanks for reaching out to us!
In order to track a new site you would indeed need to create a new campaign. You can do that here: https://analytics.moz.com/campaigns/new. Each campaign tracks the exact URL entered when it's created.
Unfortunately, you would not be able to validate us by IP address, because we do not use a static IP address or range of IP addresses; our crawler is designed to take a dynamic approach. This means we use thousands of dynamic IP addresses, which change each time we run a crawl. We believe this approach gives us the best dynamic view of the web!
However, one alternative I'd be able to suggest would be to identify our crawler by User-agent: rogerbot. You can read more about rogerbot in our guide. It's also worth pointing out that your site would need to be publicly available for our crawler to reach and crawl it.
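It's worth noting that the User-Agent header can be spoofed, so if you want to actually enforce access (along the lines of the reverse-proxy idea earlier in this thread) rather than just advise crawlers via robots.txt, you'd need a server-side check. Here is a rough sketch as Python WSGI middleware; the credentials and names are hypothetical examples, not anything Moz provides:

```python
import base64

# Hypothetical staging credentials for human visitors; in practice these
# would come from real secret management, not hard-coded values.
STAGING_USER = "staging"
STAGING_PASS = "s3cret"

def staging_gate(app):
    """Wrap a WSGI app: admit rogerbot by User-Agent, everyone else via basic auth."""
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if "rogerbot" in ua:
            # Let the Moz crawler through based on its User-Agent string.
            return app(environ, start_response)
        expected = base64.b64encode(
            f"{STAGING_USER}:{STAGING_PASS}".encode()).decode()
        if environ.get("HTTP_AUTHORIZATION", "") == f"Basic {expected}":
            # A team member who supplied the staging password.
            return app(environ, start_response)
        # Everyone else (other bots, random visitors) is challenged.
        start_response("401 Unauthorized",
                       [("WWW-Authenticate", 'Basic realm="staging"')])
        return [b"Restricted staging environment"]
    return middleware
```

The same pattern could be expressed as an nginx or Apache rule; the point is just that the gate lives on the server, where a misbehaving bot can't opt out of it.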
I hope this helps, please do let me know if you would like more clarification or have any other queries.
You're welcome to reach out to help@moz.com if it's easier!
Eli