Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Recommended Website Monitoring Tools
-
Hi,
I was wondering what people would recommend for website monitoring (i.e. is my website working as it should?).
I need something that will:
1/. Monitor multiple pages, not just the homepage
2/. Check header status codes
3/. Check page content (i.e. if the page changes massively, or includes the word "error", then we have an issue!)
4/. Offer multiple alert options
We currently use www.websitepulse.com, and it is a good service that does all of the above; however, it is so overly complex that it's hard to understand what is going on, and its advanced functionality and features are really a negative in our case.
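For anyone curious what checks 2 and 3 amount to in practice, here is a minimal sketch of the rules a per-page check applies. The function name, thresholds, and keyword list are illustrative assumptions, not taken from WebsitePulse or any other service mentioned in this thread:

```python
# Hypothetical per-page check: status code, "error" keywords, and a crude
# "did the content shrink drastically" test. All defaults are illustrative.

def check_page(status_code, body, error_words=("error",), min_length=500):
    """Return a list of problems found for one fetched page."""
    problems = []
    if status_code != 200:
        problems.append("unexpected HTTP status %d" % status_code)
    lowered = body.lower()
    for word in error_words:
        if word in lowered:
            problems.append("page contains the word %r" % word)
    if len(body) < min_length:
        problems.append("page content shrank drastically")
    return problems
```

A monitoring service runs something like this against each fetched page and alerts when the returned list is non-empty; the commercial tools simply wrap rules like these in scheduling, dashboards, and alert routing.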
Thanks
-
We use Pingdom to monitor a lot of client websites. It is great because we receive SMS messages when something is wrong. The detailed reporting, iPhone app, and ability to monitor HTTP statuses are exceptional!
-
I have not, but since the free service is solid, the paid one is likely worth a try given that it is more robust. Most of our sites don't have that level of complexity, so it is less of a need for us. Hopefully some of the Mozzers who do more eCommerce will see this and respond. Also, if you have a private question available, you might use it to go straight to Moz and see what they suggest.
-
PS - I had a look at Mon.itor.us - have you tried their paid service, http://portal.monitis.com/?
-
Hi Rob,
Essentially we have a pretty complex website with many different sections. The site is constantly being developed, so there will probably be code releases maybe 4-5 times per week. Any one of these changes may end up causing an issue with one of the pages (i.e. pages of a specific type). In addition, we can get issues with the database or server memory, which can occasionally cause the website to fail.
All of these issues are pretty disastrous for the business, so what I need (or, to be more exact, what our developers need) is to know as soon as an issue occurs, so it can be fixed, as opposed to waiting for a customer to tell us there is a website issue or manually checking every page type with every code release. Most of the services mentioned let you set a checking interval of, say, every 5 minutes.
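The "check every few minutes, alert on failure" pattern described above is simple enough to sketch. This is purely illustrative (the URLs, interval, and function names are placeholders, not any real service's API), using only Python's standard library:

```python
# Illustrative polling loop: fetch each page's status on a fixed interval
# and alert on anything that is not a 200. Names and URLs are placeholders.
import time
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status for url, or None on a connection-level failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses still carry a status to report
    except (urllib.error.URLError, OSError):
        return None  # DNS failure, timeout, or connection refused

def run_checks(pages, fetch=fetch_status):
    """One monitoring pass: map each URL to its status (None = unreachable)."""
    return {url: fetch(url) for url in pages}

def monitor(pages, interval=300, alert=print, fetch=fetch_status):
    """Poll forever, firing `alert` for anything that is not a 200."""
    while True:
        for url, status in run_checks(pages, fetch).items():
            if status != 200:
                alert("%s returned %s" % (url, status))
        time.sleep(interval)
```

The hosted services add what a loop like this lacks: checks from multiple locations, SMS/email escalation, and content-level rules. But the core shape is the same.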
As I say, we do have WebsitePulse at the moment, which is great but also far too complex to easily set up and manage, so I'm just doing some research around this area and seeing if anyone has advice.
Thanks
-
Mon.itor.us works well and is free.
-
It seems you are looking for something that constantly monitors the site and simply alerts you to problems. From my point of view, as an agency with more than a few sites up, it might be overkill, and I am not sure what that tool would be. What we do to cover what you are listing is this: we have a Moz Pro Plus membership and do campaign tracking with it. We can see on a weekly basis via email (or daily if we just log in): 4xx errors, 5xx errors, duplicate page titles, missing page titles, blocked bots, etc., as well as on-page SEO issues, robots.txt, rel=canonical, and so on.
For content checking of page changes I am at a loss; error reports are covered as above, and server downtime as below (Mon.itor.us), with good results. The beauty of the SEOmoz campaign for me is that it also tracks rankings, connects to Google Analytics, and provides competitive link analysis (DA, PA, etc.).
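For what it's worth, the 4xx/5xx buckets mentioned above are just the standard HTTP status classes; a trivial sketch of the mapping (nothing Moz-specific):

```python
# Map a numeric HTTP status code to its standard class. These ranges come
# from the HTTP specification, not from any particular monitoring tool.

def classify_status(code):
    if 200 <= code < 300:
        return "success"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error (4xx)"
    if 500 <= code < 600:
        return "server error (5xx)"
    return "non-standard"
```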
For the headers you can use Screaming Frog (I just love that name, and it works).
Hope that helps.
-
Doing some digging I found a useful list:
http://mashable.com/2010/04/09/free-uptime-monitoring/
Anyone have any feedback/reviews on these specific tools?