Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Recommended Website Monitoring Tools
-
Hi,
I was wondering what people would recommend for website monitoring (i.e. is my website working as it should?).
I need something that will:
1/. Allow multiple page monitoring, not just the homepage
2/. Do header status checking
3/. Do page content checking (i.e. if the page changes massively, or includes the word "error", then we have an issue!)
4/. Offer multiple alert possibilities.
We currently use www.websitepulse.com and it is a good service that does all of the above; however, it is so overly complex that it's hard to understand what is going on, and its complex functionality and features are really a negative in our case.
Thanks
-
We use Pingdom to monitor a lot of client websites. It is great, because we receive SMS messages when something is wrong. The detailed reporting, iPhone app and ability to monitor HTTP statuses are exceptional!
-
I have not, but given that the free service is solid, the paid version is likely worth a try if it is more robust. Most of our sites do not have that level of complexity, so it is less of a need for us. Hopefully some of the mozzers who do more eCommerce will see this and respond. Also, if you have a private question available, you might use it to go straight to Moz and see what they suggest.
-
PS - I had a look at Mon.itor.us - have you tried their paid service, http://portal.monitis.com/ ?
-
Hi Rob,
Essentially we have a pretty complex website with many different sections. The website is constantly being developed, so there will probably be code releases for changes 4-5 times per week. Any one of these changes may end up causing an issue with one of the pages (i.e. pages of a specific type). In addition, we can get issues with the database or server memory which can occasionally cause the website to fail.
All of these issues are pretty disastrous for business, so what I need (or, to be more exact, what our developers need) is to know as soon as an issue occurs (most of the services mentioned allow you to set a checking period of, say, every 5 minutes) so it can be fixed, rather than waiting for a customer to tell us there is a website issue, or manually checking every page type with every code release.
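That fixed checking period plus immediate developer alerting is, at its core, just a polling loop. A rough sketch (all of this is a placeholder illustration, not any particular vendor's implementation; the notification channel is left as a hook):

```python
import time

CHECK_INTERVAL_SECONDS = 5 * 60   # the roughly 5-minute period mentioned above

def run_checks(pages, fetch):
    """One polling cycle. `fetch(url)` returns (status_code, body_text)."""
    alerts = []
    for url in pages:
        status, body = fetch(url)
        if status != 200:
            alerts.append(f"{url} returned HTTP {status}")
        elif "error" in body.lower():
            alerts.append(f"{url} content looks broken")
    return alerts

def monitor(pages, fetch, notify):
    """Poll forever on a fixed interval, pushing each alert to `notify`
    (email, SMS, chat webhook: whatever reaches the developers fastest)."""
    while True:
        for message in run_checks(pages, fetch):
            notify(message)
        time.sleep(CHECK_INTERVAL_SECONDS)
```

The value of the hosted services is everything around this loop: checking from multiple locations, escalation rules, and not being hosted on the same infrastructure that just went down.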
As I say, we do have WebsitePulse at the moment, which is great, but also far too complex to easily set up and manage, so I am just doing research around this area and seeing if anyone has some advice.
Thanks
-
Mon.itor.us works well and is free.
-
It seems you are looking for something that constantly monitors the site and simply alerts you to problems. From my point of view, as an agency with more than a few sites up, it might be overkill, and I am not sure what that tool would be. Here is what we do to cover what you are listing: we have a Pro Plus Moz membership and do campaign tracking with it. We can see on a weekly basis via email (and daily if we just log in): 4xx and 5xx errors, duplicate page titles, missing page titles, blocked bots, etc., as well as on-page SEO issues, robots.txt, rel=canonical, and so on.
For content checking of page changes I am at a loss; error reports are covered as above, and server downtime as below (Mon.itor.us) with good results. The beauty of the SEOmoz campaign for me is that it also tracks rankings, connects to Google Analytics, and provides competitive link analysis (DA, PA, etc.).
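On the content-change point, one low-tech approach (purely a sketch, not how any of the tools in this thread implement it) is to fingerprint each page on every check and compare successive crawls, flagging a page when most of its text has changed:

```python
import difflib
import hashlib

def fingerprint(body):
    """Cheap exact-change detector: a hash of the page text.
    Any edit at all produces a different fingerprint."""
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

def changed_massively(old_body, new_body, threshold=0.5):
    """True when less than `threshold` of the text survives between crawls.
    The 0.5 cutoff is an arbitrary example value, not a recommendation."""
    similarity = difflib.SequenceMatcher(None, old_body, new_body).ratio()
    return similarity < threshold
```

Hashing catches any change; the similarity ratio is what lets you distinguish a routine content update from a page that has been replaced wholesale by an error template.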
For the headers you can use Screaming Frog (I just love that name, and it works).
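For a quick one-off header status check without firing up a full crawler, a HEAD request is enough; this is a generic standard-library sketch, not Screaming Frog's method:

```python
from urllib.request import Request, urlopen

def header_status(url):
    """Issue a HEAD request and return (status_code, headers_dict):
    the status line and response headers, without downloading the body."""
    req = Request(url, method="HEAD")
    with urlopen(req, timeout=10) as resp:
        return resp.status, dict(resp.getheaders())
```

Note that a few servers answer HEAD differently from GET (or not at all), so a crawler that issues real GETs is still the more reliable check.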
Hope that helps.
-
Doing some digging I found a useful list:
http://mashable.com/2010/04/09/free-uptime-monitoring/
Anyone have any feedback/reviews on these specific tools?