Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Recommended Website Monitoring Tools
-
Hi,
I was wondering what people would recommend for website monitoring (i.e. is my website working as it should?).
I need something that will:
1/. Allow multiple page monitoring, not just the homepage
2/. Do header status checking
3/. Do page content checking (i.e. if the page changes massively, or includes the word "error", then we have an issue!)
4/. Multiple alert possibilities.
We currently use www.websitepulse.com and it is a good service that does all the above. However, it seems so overly complex that it's hard to understand what is going on, and its complex functionality and features are really a negative in our case.
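The first three checks above (status codes, an "error" keyword, and large content changes) can be sketched in a few lines of Python. This is only a minimal illustration with hypothetical URLs, not a replacement for a hosted monitoring service; the check logic is kept separate from the fetching so it is easy to test.

```python
# Minimal sketch of the checks described above. PAGES is hypothetical.
import hashlib
import urllib.request

PAGES = ["https://www.example.com/", "https://www.example.com/checkout"]

def page_ok(status, body, last_hash=None):
    """Return (ok, new_hash). Fails on a non-2xx status, on the word
    'error' in the body, or on any content change since the last check."""
    new_hash = hashlib.sha256(body.encode()).hexdigest()
    if not (200 <= status < 300):
        return False, new_hash
    if "error" in body.lower():
        return False, new_hash
    if last_hash is not None and new_hash != last_hash:
        return False, new_hash
    return True, new_hash

def fetch(url):
    """Fetch a page, returning its status code and decoded body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.status, resp.read().decode(errors="replace")
```

A real monitor would fetch each entry in `PAGES`, keep the previous hash per page, and fire an alert whenever `page_ok` returns False.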
Thanks
-
We use Pingdom to monitor a lot of client websites. It is great because we receive SMS messages when something is wrong. The detailed reporting, iPhone app and ability to monitor HTTP statuses are exceptional!
-
I have not, but since the free service is solid, the paid version is likely worth a try given it is more robust. Most of our sites do not have that level of complexity, so it is less of a need for us. Hopefully some of the Mozzers who do more eCommerce will see this and respond. Also, if you have a private question available, you might use that to go straight to Moz and see what they suggest.
-
PS - I had a look at Mon.itor.us - have you tried their paid service: http://portal.monitis.com/ ??
-
Hi Rob,
Essentially we have a pretty complex website with many different sections. The website is constantly being developed, so there will probably be code releases for changes 4-5 times per week. Any one of these changes may end up causing an issue with one of the pages (i.e. pages of a specific type). In addition, we can get issues with the DB or server memory which can occasionally cause the website to fail.
All issues are pretty disastrous for business, so what I need to know (or to be more exact, our developers need to know) is that an issue has occurred as soon as it happens (most of the services mentioned allow you to set a checking period of, say, every 5 mins), so it can be fixed, as opposed to waiting for a customer to tell us there is a website issue, or manually checking every page type with every code release.
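The 5-minute checking period mentioned above amounts to a simple polling loop. A minimal sketch, assuming hypothetical `check_site()` and `alert()` callables (any SMS or email gateway could sit behind `alert()`):

```python
# Sketch of a fixed-interval polling loop that alerts once per incident.
import time

def run_monitor(check_site, alert, interval=300, iterations=None):
    """Call check_site() every `interval` seconds; call alert() once per
    new failure rather than on every failing poll."""
    failing = False
    count = 0
    while iterations is None or count < iterations:
        ok = check_site()
        if not ok and not failing:
            alert("site check failed")  # fire only on the ok -> failing edge
        failing = not ok
        count += 1
        if iterations is None or count < iterations:
            time.sleep(interval)
    return failing
```

Alerting only on the transition from healthy to failing is what keeps a 5-minute poll from paging the developers 12 times an hour during one outage.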
As I say, we do have WebsitePulse at the moment, which is great, but also far too complex to easily set up and manage, so I'm just doing research around this area and seeing if anyone has some advice.
Thanks
-
Mon.itor.us works well and is free.
-
It seems you are looking for something that constantly monitors the site and simply alerts you to problems. From my point of view as an agency with more than a few sites up, it might be overkill, and I am not sure what the right tool would be. What we do to cover what you are listing is this: we have a Pro Plus Moz membership and do campaign tracking with it. We can see on a weekly basis via email (and daily if we just log in): 4xx and 5xx errors, duplicate page titles, missing page titles, blocked bots, etc., as well as on-page SEO issues, robots.txt, rel=canonical, and so on.
For content checking of page changes I am at a loss; error reports are covered as above, and server downtime as below (mon.itor.us) with good results. The beauty of the SEOmoz campaign for me is that it also tracks rankings, connects to Google Analytics, and provides competitive link analysis (DA, PA, etc.).
For the headers you can use Screaming Frog (I just love that name, and it works).
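For one-off header status checks outside a full crawl, a few lines of Python can complement a crawler like Screaming Frog. A sketch, where the status-code bucketing is illustrative:

```python
# Spot-check HTTP status codes with a HEAD request per URL.
import urllib.error
import urllib.request

def fetch_status(url):
    """Return the HTTP status code for url (error codes included)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify(status):
    """Bucket a status code the way a monitoring report would."""
    buckets = {2: "ok", 3: "redirect", 4: "client error", 5: "server error"}
    return buckets.get(status // 100, "unknown")
```

Run `classify(fetch_status(url))` over a URL list and you have a crude version of the header reports the hosted tools provide.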
Hope that helps.
-
Doing some digging I found a useful list:
http://mashable.com/2010/04/09/free-uptime-monitoring/
Anyone have any feedback/reviews on these specific tools?